The Selfish Net - The Semantic Web

By: Sam Vaknin

A decade after the invention of the World Wide Web, Tim Berners-Lee is promoting the "Semantic Web". The Internet, hitherto, has been a repository of digital content. It has a rudimentary inventory system and very crude data location services. As a sad result, most of the content is invisible and inaccessible. Moreover, the Internet manipulates strings of symbols, not logical or semantic propositions. In other words, the Net compares values but does not know the meaning of the values it thus manipulates. It is unable to interpret strings, to infer new facts, to deduce, induce, derive, or otherwise comprehend what it is doing. In short, it does not understand language. Run an ambiguous term by any search engine and these shortcomings become painfully evident. This lack of understanding of the semantic foundations of its raw material (data, information) prevents applications and databases from sharing resources and feeding each other. The Internet is discrete, not continuous. It resembles an archipelago, with users hopping from island to island in a frantic search for relevancy.

Even visionaries like Berners-Lee do not contemplate an "intelligent Web". They are simply proposing to let users, content creators, and web developers assign descriptive meta-tags ("name of hotel") to fields, or to strings of symbols ("Hilton"). These meta-tags (arranged in semantic and relational "ontologies" - lists of meta-tags, their meanings and how they relate to each other) will be read by various applications and allow them to process the associated strings of symbols correctly (place the word "Hilton" in your address book under "hotels"). This will make information retrieval more efficient and reliable, and the information retrieved is bound to be more relevant and amenable to higher-level processing (statistics, the development of heuristic rules, etc.). The shift is from HTML (whose tags are concerned with visual appearances and content indexing) to languages such as the DARPA Agent Markup Language, OIL (Ontology Inference Layer or Ontology Interchange Language), or even XML (whose tags are concerned with content taxonomy, document structure, and semantics). This would bring the Internet closer to the classic library card catalogue.
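
By way of illustration only - and in Python rather than in OIL or XML - here is a toy sketch of how a meta-tag plus a small ontology lets an application file the same string differently; the tag names, the ontology and the address-book application are invented for this example, not drawn from any real Semantic Web vocabulary:

```python
# A toy "ontology": each meta-tag is mapped to the category under which an
# application should file the tagged value. All names here are illustrative.
ONTOLOGY = {
    "name_of_hotel": {"category": "hotels", "is_a": "lodging"},
    "name_of_person": {"category": "contacts", "is_a": "person"},
}

def file_in_address_book(address_book, tag, value):
    """Place a tagged string under the right heading, using the ontology
    instead of guessing from the string itself."""
    entry = ONTOLOGY.get(tag)
    if entry is None:
        # Without a meta-tag, the application is back to comparing raw strings.
        address_book.setdefault("unclassified", []).append(value)
    else:
        address_book.setdefault(entry["category"], []).append(value)
    return address_book

book = {}
file_in_address_book(book, "name_of_hotel", "Hilton")
file_in_address_book(book, "name_of_person", "Hilton")  # same string, different meaning
print(book)  # {'hotels': ['Hilton'], 'contacts': ['Hilton']}
```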

Even in its current, pre-semantic, hyperlink-dependent, phase, the Internet brings to mind Richard Dawkins' seminal work "The Selfish Gene" (OUP, 1976). This would be doubly true for the Semantic Web.

Dawkins suggested generalizing the principle of natural selection to a law of the survival of the stable. "A stable thing is a collection of atoms which is permanent enough or common enough to deserve a name". He then proceeded to describe the emergence of "Replicators" - molecules which created copies of themselves. The Replicators that survived in the competition for scarce raw materials were characterized by high longevity, fecundity, and copying-fidelity. Replicators (now known as "genes") constructed "survival machines" (organisms) to shield them from the vagaries of an ever-harsher environment.

This is very reminiscent of the Internet. The "stable things" are HTML coded web pages. They are replicators - they create copies of themselves every time their "web address" (URL) is clicked. The HTML coding of a web page can be thought of as "genetic material". It contains all the information needed to reproduce the page. And, exactly as in nature, the higher the longevity, fecundity (measured in links to the web page from other web sites), and copying-fidelity of the HTML code - the higher its chances to survive (as a web page). Replicator molecules (DNA) and replicator HTML have one thing in common - they are both packaged information. In the appropriate context (the right biochemical "soup" in the case of DNA, the right software application in the case of HTML code) - this information generates a "survival machine" (organism, or a web page).

The Semantic Web will only increase the longevity, fecundity, and copying-fidelity of the underlying code (in this case, OIL or XML instead of HTML). By facilitating many more interactions with many other web pages and databases, the underlying "replicator" code will ensure the "survival" of "its" web page (=its survival machine). In this analogy, the web page's "DNA" (its OIL or XML code) contains "single genes" (semantic meta-tags). The whole process of life is the unfolding of a kind of Semantic Web.

In a prophetic paragraph, Dawkins described the Internet: "The first thing to grasp about a modern replicator is that it is highly gregarious. A survival machine is a vehicle containing not just one gene but many thousands. The manufacture of a body is a cooperative venture of such intricacy that it is almost impossible to disentangle the contribution of one gene from that of another. A given gene will have many different effects on quite different parts of the body. A given part of the body will be influenced by many genes and the effect of any one gene depends on interaction with many others... In terms of the analogy, any given page of the plans makes reference to many different parts of the building; and each page makes sense only in terms of cross-reference to numerous other pages."

What Dawkins neglected in his important work is the concept of the Network. People congregate in cities, mate, and reproduce, thus providing genes with new "survival machines". But Dawkins himself suggested that the new Replicator is the "meme" - an idea, belief, technique, technology, work of art, or bit of information. Memes use human brains as "survival machines" and they hop from brain to brain and across time and space ("communications") in the process of cultural (as distinct from biological) evolution. The Internet is a latter-day meme-hopping playground. But, more importantly, it is a Network. Genes move from one container to another through a linear, serial, tedious process which involves prolonged periods of one-on-one gene shuffling ("sex") and gestation. Memes use networks. Their propagation is, therefore, parallel, fast, and all-pervasive. The Internet is a manifestation of the growing predominance of memes over genes. And the Semantic Web may be to the Internet what Artificial Intelligence is to classic computing. We may be on the threshold of a self-aware Web.

The Internet as a Collective Brain

To draw a comparison with the development of a human baby: the human race has only just begun to develop its nervous system.

The Internet fulfils all the functions of the nervous system in the body and is, both functionally and structurally, quite similar. It is decentralized and redundant (each part can serve as a functional backup in case of malfunction). It hosts information which is accessible in several ways, it contains a memory function, and it is multimodal (multimedia - textual, visual, audio and animation).

I believe that the comparison is not superficial and that studying the functions of the brain (from infancy to adulthood) - amounts to perusing the future of the Net itself.

1. The Collective Computer
To carry the metaphor of "a collective brain" further, we would expect the processing of information to take place in the Internet, rather than inside the end-user's hardware (the same way that information is processed in the brain, not in the eyes). Desktops will receive the results and communicate with the Net to receive additional clarifications and instructions and to convey information gathered from their environment (mostly, from the user).

This is part of the philosophy of the JAVA programming language. It deals with applets - small bits of software - and links different computer platforms by means of software. Put differently:
Future servers will contain not only information (as they do today) - but also software applications. The user of an application will not be forced to buy it. He will not be driven into hardware-related expenditures to accommodate the ever growing size of applications. He will not find himself wasting his scarce memory and computing resources on passive storage. Instead, he will use a browser to call a central computer. This computer will contain the needed software, broken down into its elements (=applets, small applications). Anytime the user wishes to use one of the functions of the application, he will siphon it off the central computer. When finished - he will "return" it. Processing speeds and response times will be such that the user will not feel at all that it is not with his own software that he is working (the question of ownership will be very blurred in such a world). This technology is available and it has provoked a heated debate about the future shape of the computing industry as a whole (desktops - really power packs - or network computers, little more than dumb terminals). Applications are already offered to corporate users by ASPs (Application Service Providers).
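
A toy sketch of this applets-on-demand model, with a plain Python dictionary standing in for the central computer and ordinary functions standing in for the applets; all names are hypothetical, and this is not how Java applets or ASP services were actually implemented:

```python
# The "central computer": a catalogue of small applications (applets).
# In the scenario described above this would live on a remote server;
# here a dictionary stands in for it.
CENTRAL_SERVER = {
    "spellcheck": lambda text: text.replace("teh", "the"),
    "word_count": lambda text: len(text.split()),
}

def borrow_applet(name):
    """'Siphon off' one function of the application from the central computer."""
    applet = CENTRAL_SERVER.get(name)
    if applet is None:
        raise KeyError(f"No such applet on the central computer: {name}")
    return applet

# The user's thin client: no locally installed application, just calls.
document = "teh quick brown fox"
spellcheck = borrow_applet("spellcheck")
print(spellcheck(document))                    # "the quick brown fox"
print(borrow_applet("word_count")(document))   # 4
# When finished, nothing remains on the desktop - the applet is "returned".
```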

In the last few years, scientists have harnessed the combined power of the computers linked to the internet at any given moment to perform astounding feats of distributed parallel processing. Millions of PCs connected to the net co-process signals from outer space and meteorological data and solve complex equations. This is a prime example of a collective brain in action.
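
The work-unit pattern behind such projects can be sketched roughly as follows; here a process pool on a single machine stands in for the millions of volunteer PCs, and the "signals" are made up:

```python
# Each volunteer PC receives a small work unit, processes it locally and
# sends back only the result - the pattern behind distributed projects of
# the SETI@home kind. A local process pool substitutes for the network.
from multiprocessing import Pool

def analyse_work_unit(signal):
    """Pretend analysis: return the strongest sample and its position."""
    peak = max(signal)
    return {"peak": peak, "position": signal.index(peak)}

if __name__ == "__main__":
    work_units = [[0.1, 0.7, 0.3], [0.2, 0.2, 0.9], [0.5, 0.4, 0.1]]
    with Pool(processes=3) as pool:          # three "volunteer" workers
        results = pool.map(analyse_work_unit, work_units)
    print(results)
```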

2. The Intranet - a Logical Extension of the Collective Computer
LANs (Local Area Networks) are no longer a rarity in corporate offices. WANs (Wide Area Networks) are used to connect geographically dispersed organs of the same legal entity (branches of a bank, daughter companies, a sales force). Many LANs are wireless.

The intranet / extranet and wireless LANs will be the winners. They will gradually eliminate both fixed-line LANs and WANs. The Internet offers equal, platform-independent, location-independent and time-of-day-independent access to all the members of an organization. Sophisticated firewall security applications protect the privacy and confidentiality of the intranet from all but the most determined and savvy hackers. The Intranet is an intra-organizational communication network, constructed on the platform of the Internet and enjoying all its advantages. The extranet is open to clients and suppliers as well.

The company's server can be accessed by anyone authorized, from anywhere, at any time (with local - rather than international - communication costs). The user can leave messages (internal e-mail or v-mail), access information - proprietary or public - and participate in "virtual teamwork" (see next chapter).

By the year 2002, a standard intranet interface will emerge. This will be facilitated by the opening up of the TCP/IP communication architecture and its availability to PCs. A billion USD will go just to finance intranet servers - or, at least, this is the median forecast.

The development of measures to safeguard server-routed intra-organizational communication (firewalls) is the solution to one of the two obstacles to the institution of the Intranet. The second problem is the limited bandwidth, which does not permit the efficient transfer of audio (not to mention video). It is difficult to conduct video conferencing through the Internet. Even the voices of discussants who use internet phones come out (slightly) distorted.

All this did not prevent 95% of the Fortune 1000 from installing an intranet. 82% of the rest intend to install one by the end of this year. Medium to big size American firms have 50-100 intranet terminals for every internet one. At the end of 1997, there were 10 web servers for every other type of server in organizations. The sale of intranet related software was projected to multiply by 16 (to 8 billion USD) by the year 1999.

One of the greatest advantages of the intranet is the ability to transfer documents between the various parts of an organization. Consider Visa: it pushed 2 million documents per day internally in 1996.

An organization equipped with an intranet can (while protected by firewalls) give its clients or suppliers access to non-classified correspondence. This notion has its charm. Consider a newspaper: it can give access to all the materials which were discarded by the editors. Some news items are fit to print - yet are discarded because of space limitations. Still, someone is bound to be interested. It costs the newspaper close to nothing (the material is, normally, already computer-resident) - and it might even generate added circulation and income. It can even be conceived as an "underground, non-commercial, alternative" newspaper for a wholly different readership.

The above is but one example of the possible use of the intranet to communicate with the organization's consumer base.

3. Mail and Chat
The Internet (through its e-mail capabilities) is eroding traditional mail. The market share of the post office in conveying messages by regular mail has dwindled from 77% to 62% (1995). E-mail has expanded to capture 36% (up from 19%). 90% of customers with on-line access use e-mail from time to time and 60% work with it regularly. More than 2 billion messages traverse the internet daily.

E-mail applications are available as freeware and are included in all browsers. Thus, the Internet has completely assimilated what used to be a separate service, to the extent that many people make the mistake of thinking that e-mail is a feature of the Internet. Microsoft continues to incorporate previously independent applications in its browsers - a behaviour which led to the 1998 anti-trust lawsuit against it. The internet will do to phone calls what it has done to mail. Already there are applications (Intel's, Vocaltec's, Net2Phone) which enable the user to conduct a phone conversation through his computer. The voice quality has improved. The discussants can cut into each other's words, argue and listen to tonal nuances. Today, the parties (two or more) engaging in the conversation must possess the same software and the same (computer) hardware. In the very near future, computer-to-regular-phone applications will eliminate this requirement. And, again, simultaneous multi-modality: the user can talk over the phone, see his party, send e-mail, receive messages and transfer documents - without obstructing the flow of the conversation.

The cost of transferring voice will become so negligible that free voice traffic is conceivable in 3-5 years. Data traffic will overtake voice traffic by a wide margin. This beats regular phones.

The next phase will probably involve virtual reality. Each of the parties will be represented by an "avatar", a 3-D figurine generated by the application (or the user's likeness mapped into the software and superimposed on the avatar). These figurines will be multi-dimensional: they will possess their own communication patterns, special habits, history, preferences - in short: their own "personality". Thus, they will be able to maintain an "identity" and a consistent pattern of communication which they will develop over time.

Such a figure could host a site, accept, welcome and guide visitors, all the time bearing their preferences in its electronic "mind". It could narrate the news, like "Ananova" does. Visiting sites in the future is bound to be a much more pleasant affair.

4. E-cash
In 1996, the four corporate giants (Visa, MasterCard, Netscape and Microsoft) agreed on a standard for effecting secure payments through the Internet: SET. Internet commerce is supposed to mushroom by a factor of 50 to 25 billion USD. Site owners will be able to collect rent from passing visitors - or fees for services provided within the site. Amazon instituted an honour system to collect donations from visitors. Dedicated visitors will not be deterred by such trifles.

5. The Virtual Organization
The Internet allows simultaneous communication between an almost unlimited number of users. This is coupled with the efficient transfer of multimedia (video included) files. This opens up a vista of mind-boggling opportunities which are the real core of the Internet revolution: the virtual collaborative ("Follow the Sun") modes.

Examples: A group of musicians will be able to compose music or play it - while spatially and temporally separated; Advertising agencies will be able to co-produce ad campaigns in a real time interactive mode; Cinema and TV films will be produced from disparate geographical spots through the teamwork of people who never meet, except through the net.

These examples illustrate the concept of the "virtual community". Locations in space and time will no longer hinder collaboration in a team: be it scientific, artistic, cultural, or for the provision of services (a virtual law firm or accounting office, a virtual consultancy network). Two ongoing developments are the virtual mall and the virtual catalogue.

There are well over 300 active virtual malls on the Internet. In 1998, they were frequented by 32.5 million shoppers, who shopped in them for goods and services. The intranet can also be thought of as a "virtual organization", or a "virtual business".

The virtual mall is a computer "space" (pages) on the internet, wherein "shops" are located. These shops offer their wares using visual, audio and textual means. The visitor passes through a gate into the store and looks through its offerings, until he reaches a buying decision. Then he engages in a feedback process: he pays (with a credit card), buys the product and waits for it to arrive by mail. The manufacturers of digital products (intellectual property such as e-books or software) have begun selling their merchandise on-line, as file downloads.

Yet, slow communications and limited bandwidth constrain the growth potential of this mode of sale. Once these are solved, intellectual property will be sold directly from the net, on-line. Until such time, the intervention of the Post Office is still required. So, the virtual mall is nothing but a glorified computerized mail catalogue or Buying Channel, the only difference being the exceptionally varied inventory. Websites which started as "specialty stores" are fast transforming themselves into multi-purpose virtual malls. Amazon.com, for instance, has bought into a virtual pharmacy and into other virtual businesses. It is now selling music, video, electronics and many other products. It started as a bookstore.

This contrasts with a much more creative idea: the virtual catalogue. It is a form of narrowcasting (as opposed to broadcasting): a surgically accurate targeting of potential consumer audiences. Each group of profiled consumers (no matter how small) is fitted with its own - digitally generated - catalogue. This is updated daily: the variety of wares on offer (adjusted to reflect inventory levels, consumer preferences and goods in transit) - and prices (sales, discounts, package deals) change in real time. The user will enter the site and there delineate his consumption profile and his preferences. A customized catalogue will be immediately generated for him. From then on, the history of his purchases, preferences and responses to feedback questionnaires will be accumulated and added to a database. Each catalogue generated for him will come replete with order forms. Once the user has concluded his purchases, his profile will be updated.
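
A rough sketch of how such a per-profile catalogue might be generated; the inventory fields, the profile format and the discount logic are invented for illustration and do not describe any real catalogue system:

```python
# Hypothetical inventory and consumer profile - field names are made up.
INVENTORY = [
    {"item": "jazz CD",       "category": "music", "price": 15.0, "in_stock": True},
    {"item": "thriller",      "category": "books", "price": 9.0,  "in_stock": True},
    {"item": "gardening set", "category": "home",  "price": 30.0, "in_stock": False},
]

def generate_catalogue(profile, inventory, discount=0.0):
    """Build a customized catalogue: only in-stock items in the categories
    the consumer cares about, with prices adjusted in real time."""
    return [
        {**item, "price": round(item["price"] * (1 - discount), 2)}
        for item in inventory
        if item["in_stock"] and item["category"] in profile["preferred_categories"]
    ]

profile = {"name": "user-001", "preferred_categories": {"music", "books"}}
print(generate_catalogue(profile, INVENTORY, discount=0.10))
# Each purchase would then be appended to the profile's history and the
# next catalogue regenerated from the updated data.
```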

There are no technological obstacles to implementing this vision today - only administrative and legal ones. Big retail stores are not up to processing the flood of data expected to arrive. They also remain highly sceptical regarding the feasibility of the new medium. And privacy issues prevent data mining or the effective collection and usage of personal data. The virtual catalogue is a special case of a new internet offshoot: the "smart (shopping) agents". These are AI applications with "long memories".

They draw detailed profiles of consumers and users and then suggest purchases and refer to the appropriate sites, catalogues, or virtual malls.

They also provide price comparisons and the new generation (NetBot) cannot be blocked or fooled by using differing product categories.

In the future, these agents will also refer to real-life retail chains and issue a map of the branch or store closest to an address specified by the user (the default being his residence). This technology can be seen in action in a few music sites on the web and is likely to be dominant with wireless internet appliances. The owner of an internet-enabled (third generation) mobile phone is likely to be the target of geographically-specific marketing campaigns, ads and special offers pertaining to his current location (as reported by his GPS - satellite-based Global Positioning System - receiver).
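
A sketch of the "closest branch" idea: given the position reported by the user's device, pick the nearest store from a list of branch coordinates; the branches, coordinates and the sample user position below are invented:

```python
# Nearest-branch lookup from invented (latitude, longitude) coordinates.
from math import radians, sin, cos, asin, sqrt

BRANCHES = {
    "Downtown":  (40.7128, -74.0060),
    "Uptown":    (40.8075, -73.9626),
    "Riverside": (40.8296, -73.9262),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def closest_branch(user_position):
    """Return the name of the branch nearest to the reported position."""
    return min(BRANCHES, key=lambda name: haversine_km(user_position, BRANCHES[name]))

print(closest_branch((40.7580, -73.9855)))  # a user somewhere in midtown Manhattan
```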

6. Internet News
Internet news has clear advantages. It can be frequently and dynamically updated (unlike static print news) and is always accessible (similar to print news), immediate and fresh. The future will witness a form of interactive news. A special "corner" of the site will be open to updates posted by the public (the equivalent of press releases). This will provide readers with a glimpse into the making of the news, the raw material news is made of. The same technology will be applied to interactive TVs. Content will be downloaded from the internet and displayed as an overlay on the TV screen or in a square in a special location. The contents downloaded will be directly connected to the TV programming. Thus, the biography and track record of a football player will be displayed during a football match and the history of a country when it gets news coverage.

Terra Internetica - Internet, an Unknown Continent

This is an unconventional way to look at the Internet. Laymen and experts alike talk about "sites" and "advertising space". Yet, the Internet has never been compared to a new continent whose surface is infinite.

The Internet will have its own real estate developers and construction companies. The real life equivalents derive their profits from the scarcity of the resource that they exploit - the Internet counterparts will derive their profits from the tenants (the content).

Two examples: A few companies bought "Internet Space" (pages, domain names, portals), developed it and make commercial use of it by:

  • renting it out
  • constructing infrastructure and selling it
  • providing an intelligent gateway, entry point to the rest of the internet
  • or selling advertising space which subsidizes the tenants (Yahoo!-Geocities, Tripod and others).
  • Cybersquatting (purchasing specific domain names identical to brand names in the "real" world) and then selling the domain name to an interested party.

Internet Space can be easily purchased or created. The investment is low and getting lower with the introduction of competition in the field of domain registration services and the increase in the number of top-level domains.

Then, infrastructure can be erected - for a shopping mall, for free home pages, for a portal, or for another purpose. It is precisely this infrastructure that the developer can later sell, lease, franchise, or rent out. At the beginning, only members of the fringes and the avant-garde (inventors, risk-assuming entrepreneurs, gamblers) invest in a new invention. The invention of a new communications technology is mostly accompanied by devastating silence.

No one can say what the optimal uses of the invention are (in other words, what its future is). Many - mostly members of the scientific and business elites - argue that there is no real need for the invention and that it substitutes a new and untried way for old and tried modes of doing the same thing (so why assume the risk?).

These criticisms are usually well-founded:
To start with, there is, indeed, no need for the new medium. A new medium invents itself - and the need for it. It also generates its own market to satisfy this newly found need. Two prime examples are the personal computer and the compact disc.

When the PC was invented, its uses were completely unclear. Its performance was lacking, its abilities limited, and it was horribly user-unfriendly. It suffered from faulty design, absent user comfort and ease of use, and required considerable professional knowledge to operate. The worst part was that this knowledge was unique to the new invention (not portable).

It reduced labour mobility and limited one's professional horizons. There were many gripes among those assigned to tame the new beast.

The PC was thought of, at the beginning, as a sophisticated gaming machine, an electronic baby-sitter. As the presence of a keyboard was registered and as the professional horizon cleared, it was thought of in terms of a glorified typewriter or spreadsheet. It was used mainly as a word processor (and its existence was justified solely on these grounds). The spreadsheet was the first real application and it demonstrated the advantages inherent in this new machine (mainly flexibility and speed). Still, it was more (speed) of the same: a quicker ruler or pen and paper. What was the difference between this and a hand-held calculator (some of which already had computing, memory and programming features)? The PC was recognized as a medium only 30 years after it was invented, with the introduction of multimedia software. All this time, the computer continued to spin off markets and secondary markets, needs and professional specialities. The talk, as always, was centred on how to improve on existing markets and solutions.

The Internet is the computer's first important breakthrough. Hitherto the computer was only quantitatively different - the multimedia and the Internet have made it qualitatively superior, actually, sui generis, unique.

This, precisely, is the ghost haunting the Internet: It has been invented, is maintained and is operated by computer professionals. For decades these people have been conditioned to think in Olympic terms: more, stronger, higher. Not: new, unprecedented, non-existent. To improve - not to invent. They stumbled across the Internet - it invented itself despite its own creators.

Computer professionals (hardware and software experts alike) are linear thinkers. The Internet is non-linear and modular. It is still the age of hackers. There is still a lot to be done in improving technological prowess and powers. But their control of the contents is waning and they are being gradually replaced by communicators, creative people, advertising executives, psychologists and the totally unpredictable masses who flock to flaunt their home pages.

All of these are attuned to the user, his mental needs and his information and entertainment preferences.

The compact disc is a different tale. It was intentionally invented to improve upon an existing technology (basically, Edison's Gramophone). Market-wise, this was a major gamble: the improvement was, at first, debatable (many said that the sound quality of the first generation of compact discs was inferior to that of its contemporaneous record players). Consumers had to be convinced to change both software and hardware and to dish out thousands of dollars just to listen to what the manufacturers claimed was better quality Bach. A better argument was the longer life of the software (though contrasted with the limited life expectancy of the consumer, some of the first sales pitches sounded absolutely morbid). The computer suffered from unclear positioning. The compact disc was very clear as to its main functions - but had a rough time convincing the consumers.

Every medium is first controlled by the technical people. Gutenberg was a printer - not a publisher. Yet, he is the world's most famous publisher. The technical cadre is joined by dubious or small-scale entrepreneurs and, together, they establish ventures with no clear vision, market-oriented thinking, or orderly plan of action. The legislator is also dumbfounded and does not grasp what is happening - thus, there is no legislation to regulate the use of the medium. Witness the initial confusion concerning copyrighted software and the copyrights of ROM-embedded software. Abuse or under-utilization of resources grows. Recall the sale of radio frequencies to the first cellular phone operators in the West - a situation which repeats itself in Eastern and Central Europe nowadays.

But then more complex transactions - exactly as in real estate in "real life" - begin to emerge.

This distinction is important. While in real life it is possible to sell an undeveloped plot of land - no one will buy mere "pages". The supply of these is unlimited - their scarcity (and, therefore, their price) is virtually zero. The second example involves the utilization of a site - rather than its mere availability.

A developer could open a site wherein first-time authors will be able to publish their first manuscript - for a fee. Evidently, such a fee will be a fraction of what it would take to publish a "real life" book. The author could collect money for every download of his book - and split it with the site developer. Potential buyers will be provided with access to the table of contents and to a sample chapter of the books. This is currently being done by a few fledgling firms but a full-scale publishing industry has not yet developed.

The Life of a Medium
The internet is simply the latest in a series of networks which have revolutionized our lives. A century before the internet, the telegraph, the railways, the radio and the telephone were similarly heralded as "global" and transforming. Every medium of communications goes through the same evolutionary cycle:

Anarchy

The Public Phase
At this stage, the medium and the resources attached to it are very cheap, accessible, and under no regulatory constraints. The public sector steps in: higher education institutions, religious institutions, government, not-for-profit organizations, non-governmental organizations (NGOs), trade unions, etc. Bedevilled by limited financial resources, they regard the new medium as a cost-effective way of disseminating their messages.

The Internet was not exempt from this phase, which ended only a few years ago. It started with complete computer anarchy, manifested in ad hoc networks, local networks, and networks of organizations (mainly universities and organs of the government, such as DARPA, a part of the defence establishment in the USA). Non-commercial entities jumped on the bandwagon and started sewing these networks together (an activity fully subsidized by government funds). The result was a globe-encompassing network of academic institutions. The American Pentagon established the network of all networks, the ARPANET. Other government departments joined the fray, headed by the National Science Foundation (NSF), which withdrew from the Internet only lately.

The Internet (with a different name) became semi-public property - with access granted to the chosen few. Radio took precisely this course. Radio transmissions started in the USA in 1920. Those were anarchic broadcasts with no discernible regularity. Non commercial organizations and not for profit organizations began their own broadcasts and even created radio broadcasting infrastructure (albeit of the cheap and local kind) dedicated to their audiences. Trade unions, certain educational institutions and religious groups commenced "public radio" broadcasts.

The Commercial Phase
When the users (e.g., listeners in the case of the radio, or owners of PCs and modems in the example of the Internet) reach a critical mass - the business sector is alerted. In the name of capitalist ideology (another religion, really) it demands the "privatization" of the medium. This harps on very sensitive strings in every Western soul: the efficient allocation of resources which is the result of competition; the corruption and inefficiency naturally associated with the public sector ("Other People's Money" - OPM); the ulterior motives of members of the ruling political echelons (the infamous American Paranoia); a lack of variety and of catering to the tastes and interests of certain audiences; the equation of private enterprise with democracy; and more. The end result is the same: the private sector takes over the medium from "below" (making offers to the owners or operators of the medium that they cannot possibly refuse) - or from "above" (successful lobbying in the corridors of power leads to the appropriate legislation and the medium is "privatized").

Every privatization - especially that of a medium - provokes public opposition. There are (usually founded) suspicions that the interests of the public were compromised and sacrificed on the altar of commercialization and ratings. Fears of the monopolization and cartelization of the medium are evoked - and justified, in due time. There is also fear of the concentration of control of the medium in a few hands. All these things do happen - but the pace is so slow that the initial fears are forgotten and public attention reverts to fresher issues.

A new Communications Act was legislated in the USA in 1934. It was meant to transform radio frequencies into a national resource to be sold to the private sector, which would use them to transmit radio signals to receivers. In other words: the radio was passed on to private and commercial hands. Public radio was doomed to be marginalized.

The American administration withdrew from its last major involvement in the Internet in April 1995, when the NSF ceased to finance some of the networks and, thus, privatized its hitherto heavy involvement in the net. A new Telecommunications Act was legislated in 1996. It permitted "organized anarchy". It allowed media operators to invade each other's territories.

Phone companies will be allowed to transmit video and cable companies will be allowed to transmit telephony, for instance. This is all phased over a long period of time - still, it is a revolution whose magnitude is difficult to gauge and whose consequences defy imagination. It carries an equally momentous price tag - official censorship. "Voluntary censorship", to be sure, with somewhat toothless standardization and enforcement authorities, to be sure - still, a censorship with its own institutions to boot. The private sector reacted by threatening litigation - but, beneath the surface, it is caving in to pressure and temptation, constructing its own censorship codes both in the cable and in the internet media.

Institutionalization
This phase is the next in the Internet's history, though, it seems, unbeknownst to it. It is characterized by enhanced legislative activity. Legislators, on all levels, discover the medium and lunge at it passionately. Resources which were considered "free" are suddenly transformed into "national treasures not to be dispensed with cheaply, casually and with frivolity". It is conceivable that certain parts of the Internet will be "nationalized" (for instance, in the form of a licensing requirement) and tendered to the private sector. Legislation will be enacted which will deal with permitted and disallowed content (obscenity? incitement? racial or gender bias?). No medium in the USA (not to mention the wider world) has eschewed such legislation. There are sure to be demands to allocate time (or space, or software, or content, or hardware) to "minorities", to "public affairs", to "community business". This is a tax that the business sector will have to pay to fend off the eager legislator and his nuisance value. All this is bound to lead to a monopolization of hosts and servers. The important broadcast channels will diminish in number and be subjected to severe content restrictions. Sites which do not succumb to these requirements will be deleted or neutralized. Content guidelines (a euphemism for censorship) exist, even as we write, in all major content providers (CompuServe, AOL, Geocities, Tripod, Prodigy).

The Bloodbath
This is the phase of consolidation. The number of players is severely reduced. The number of browser types will be limited to 2-3 (Netscape, Microsoft and which else?). Networks will merge to form privately owned mega-networks. Servers will merge to form hyper-servers run on supercomputers in "server farms". The number of ISPs will be considerably cut. 50 companies ruled the greater part of the media markets in the USA in 1983. The number in 1995 was 18. At the end of the century they will number 6.

This is the stage when companies - fighting for financial survival - strive to acquire as many users/listeners/viewers as possible. The programming is flattened to the lowest (and widest) common denominator. Shallow programming dominates as long as the bloodbath proceeds.

From Rags to Riches

Tough competition produces four processes:
1. A Major Drop in Hardware Prices
This happens in every medium, but it doubly applies to a computer-dependent medium, such as the Internet. Computer technology seems to abide by "Moore's Law", which says that the number of transistors which can be put on a chip doubles every 18 months. As a result of this miniaturization, computing power quadruples every 18 months and an exponential series ensues. Organic-biological-DNA computers, quantum computers, chaos computers - prompted by vast profits and spawned by inventive genius - will ensure the longevity and continued applicability of Moore's Law. The Internet is also subject to "Metcalfe's Law". It says that when we connect N computers to a network - we get an increase of N to the second power in its computing / processing power. And these N computers are more powerful every year, according to Moore's Law.

The growth of computing power in networks is a multiple of the effects of the two laws. More and more computers with ever-increasing computing power get connected and create an exponential, 16-fold growth in the network's computing power every 18 months.
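
Under the assumptions stated above - per-machine power quadrupling every 18 months (Moore's Law as the text states it), the number of connected machines doubling over the same period (an assumption needed to reach the figure), and network power scaling with the square of that number (Metcalfe's Law as the text applies it) - the 16-fold figure works out as follows:

```python
# Reproducing the "16 times every 18 months" figure under the text's own
# assumptions; these are not standard formulations of either law.
per_machine_growth = 4     # Moore's Law as stated in the text: x4 per 18 months
machine_count_growth = 2   # assumed doubling of connected computers per 18 months
metcalfe_exponent = 2      # network power taken as proportional to N**2

network_growth = per_machine_growth * machine_count_growth ** metcalfe_exponent
print(network_growth)      # 4 * 2**2 = 16
```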

2. Free Availability of Software and Connection
This is prevalent in the Net where even potentially commercial software can be downloaded for free. In many countries television viewers still pay for television broadcasts - but in the USA and many other countries in the West, the basic package of television channels comes free of charge. As users / consumers form a habit of using (or consuming) the software - it is commercialized and begins to carry a price tag. This is what happened with the advent of cable television: contents are sold for subscription and usage (Pay Per View - PPV) fees.

Gradually, this is what will happen to most of the sites and software on the Net. Those which survive will begin to collect usage fees, access fees, subscription fees, downloading fees and other, appropriately named, fees. These fees are bound to be low - but it is the principle that counts. Even a few cents per transaction will accumulate to hefty sums with the traffic which will characterize the Net (or, at least its more popular locales).

Advertising revenues will allow ISPs to offer free communication and storage volume. Gradually, connect-time charges imposed by the phone companies will be eroded by tough competition from the likes of the cable companies. Accessing the internet might well be free of all charges in 10 years' time.

3. Increased User Friendliness
As long as the computer is less user friendly and less reliable (predictable) than television - less of a black box - its potential (and its future) is limited. Television attracts 3.5 billion users daily. The Internet will attract - under the most exuberant scenario - less than one tenth of this number of people. The only reasons for this disparity are (the lack of) user friendliness and reliability. Even browsers, among the most user friendly applications ever - are not sufficiently so. The user still needs to know how to use a keyboard and must possess some basic acquaintance with the operating system.

The more mature the medium, the more friendly it becomes. Finally, it will be operated using speech or common language. There will be room left for user "hunches" and built-in flexible responses.

4. Social Taxes
Sooner or later, the business sector has to mollify the god of public opinion with offerings of a political and social nature. The Internet is an affluent, educated, yuppie medium. It necessitates a command of the English language, a live interest in information and its various uses (scientific, commercial, other), and a lot of resources (free time, money to invest in hardware, software and connect time). It empowers - and thus deepens the divide between the haves and have-nots, the knowing and the ignorant, the computer literate and the computer illiterate. In short: the Internet is an elitist medium. Publicly, this is an unhealthy posture. "Internetophobia" is already discernible. People (and politicians) talk about how unsafe the Internet is and about its possible uses for racist, sexist and pornographic purposes. The wider public is in a state of awe.

So, site builders and owners will do well to begin to improve their image: provide free access to schools and community centres, bankroll internet literacy classes, freely distribute contents and software to educational institutions, and collaborate with researchers, social scientists and engineers. In short: encourage the view that the Internet is a medium catering to the needs of the community and the underprivileged, a mostly altruistic endeavour. This also happens to make good business sense by educating a future generation of users. He who visited a site as a student, free of charge - will pay to do so when he becomes an executive. Such a user will also pass on the information within and without his organization. This is called media exposure. The future will, no doubt, witness public Internet terminals, subsidized ISP accounts, free Internet classes and an alternative "non-commercial, public" approach to the Net.

The Internet: Medium or Chaos?

There has never been a medium like the Internet. The way it has formed, the way it was (not) managed, its hardware-software-communications specifications - are all unique.

No Government
The Internet has no central (or even decentralized) structure. In reality, it hardly has a structure at all. It is a collection of 16 million computers (at the end of 1996) connected through thousands of networks. There are organizations which purport to set Internet standards (like ISOC, the Internet Society, or ICANN, which administers domain names) - but they are all voluntary organizations, with no binding legal, enforcement, or adjudication powers. The result is often mayhem. Many erroneously call the Internet the first democratic medium. Yet, it hardly qualifies as a medium and by no stretch of terminology is it democratic. Democracy has institutions, hierarchies, order. The Internet has none of these things. There are some vague understandings as to what is and is not allowed. This is a "code of honour" (more reminiscent of the Sicilian Mob than of the British Parliament, let's say). Violations are punished by excommunication (of the violating site or person).

The Internet has culture - but no education. Freedom of Speech is entrenched. Members of this virtual community react adversely to ideas of censorship, even when applied to hard-core porn. In 1999, hackers attacked major government sites following an FBI initiative against hacking-related crimes. Government initiatives (in the USA, in France, the lawsuit against the General Manager of CompuServe in Germany) are acutely criticized. In the meantime, the spirit of the Internet prevails: the small man's medium. What seems to be emerging, though, is self-censorship by content providers (such as AOL and CompuServe).

Independence
The Internet is not dependent upon a given hardware or software. True, it is accessible only through computers and there are dominant browsers.

But the Internet accommodates any digital (bit transfer) platform. The Internet will be incorporated in the future into portable computers, palmtops, PDAs, mobile phones, cable television, telephones (with a voice interface), home appliances and even wrist watches. It will be accessible to all, regardless of hardware and software.

The situation is, obviously, different with other media. There is standard hardware (the television set, the radio receiver, the digital print equipment). Data transfer modes are standardized as well. The only variable is the contents - and even this is standardized in an age of American cultural imperialism. Today, one can see the same television programs all over the globe, regardless of cultural or geographical differences.

Here is a reasonable prognosis for the Internet:
It will "broadcast" (it is, of course, a PULL medium, not a PUSH medium - see next chapter) to many kinds of hardware. Its functions will be controlled by 2-5 very common software applications. But it will differ from television in that contents will continue to be decentralized: every point on the Net is a potential producer of content at low cost. This is the equivalent of producing a talk show using a single home video camera. And the contents will remain varied. Naturally, marketing content (sites) will remain an expensive art. Sites will also be richer or poorer, in accordance with the investment made in them.

Non Linearity and Functional Modularity
The Internet is the first medium in human history that is non-linear and totally modular.

A television program is broadcast from a transmitter, through the airwaves, to a receiver (=the television set). The viewer sits opposite this receiver and passively watches. This is an entirely linear process. The Internet is different: when communicating through the Internet, there is no way to predict how the information will reach its destination. The routing of information through the network is dynamic and unpredictable, very much like the principle governing the telephony system (but on a global scale). The latter is not a point-to-point linear network. Rather, it is a network of networks. Our voice is transmitted back and forth inside a gigantic maze of copper wires and optic fibres. It seeps through any available wire - until it reaches its destination. It is the same with the Internet.

Information is divided into packets. An address is attached to each packet, which - using the TCP/IP family of data transfer protocols - is dispatched to roam this worldwide labyrinth. But the path from one neighbourhood of London to another may traverse Japan. The really ingenious thing about the Internet is that each computer (each receiver or end user) indeed burdens the system by imposing its information needs on it (as is the case with other media) - but it also assists in the task of pushing information packets on to their destinations. It seems that this contribution to the system outweighs the burdens imposed upon it.
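
A toy illustration of the packetizing step (greatly simplified relative to real TCP/IP, which adds checksums, acknowledgements, retransmission and congestion control): the message is cut into addressed, numbered packets which may arrive in any order and are reassembled by sequence number; the destination address below is a reserved example value:

```python
# Toy packet switching: cut a message into addressed, numbered packets,
# let them arrive in an arbitrary order, and reassemble by sequence number.
import random

def packetize(message, destination, size=8):
    """Split a message into fixed-size packets carrying address and sequence."""
    return [
        {"dst": destination, "seq": i, "data": message[i:i + size]}
        for i in range(0, len(message), size)
    ]

def reassemble(packets):
    """Restore the original message, whatever order the packets arrived in."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = packetize("From one neighbourhood of London to another, via Japan.", "198.51.100.7")
random.shuffle(packets)   # packets take different routes and arrive out of order
print(reassemble(packets))
```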

The network has a growth potential which is always bigger than the number of its users. It is as though television sets assisted in passing the signals received by them to other television sets. Every computer which is a member of the network is both a message (content) and a medium (active information channel), both a transmitter and a receiver. If 30% of all computers on the Net were to crash - there would be no operational impact (there is enormous built-in redundancy). Obviously, some contents would no longer be available (information channels would be affected).

The interactivity of this medium is a guarantee against the monopolization of contents. Anyone with a thousand dollars can launch his/her own (reasonably sophisticated) site, accessible to all other Internet users. Space is available through home page providers.

The name of the game is no longer the production - it is the creative content (design), the content itself and, above all, the marketing of the site.

The Internet is an infinite and unlimited resource. This goes against the grain of the most basic economic concept (of scarcity). Each computer that joins the Internet strengthens it exponentially - and tens of thousands join daily. The Internet infrastructure (maybe with the exception of communication backbones) can accommodate an annual growth of 100% to the year 2020. It is the user who decides whether to increase the Internet's infrastructure by connecting his computer to it. By comparison: it is as though it were possible to produce and to broadcast radio programmes from every radio receiver. Each computer is a combination of studio and transmitter (on the Internet).

In reality, there is no other interactive medium except the Internet. Cable TV does not allow two-way data transfer (from user to cable operator). If the user wants to buy a product - he has to phone. Interactive television is an abject failure (the Sony and TCI experiments were terminated). This all is notwithstanding the combining of the Internet with satellite capabilities (VSAT) or with the revenant digital television. The television screen is inferior when compared to the computer screen. Only the Internet is there as a true two-way possibility. The technological problems that besieged it are slowly dissipating.

The Internet allows for one-dimensional and two-dimensional interactivity.

One-dimensional interactivity: filling in and dispatching a form, sending and receiving messages (through e-mail or v-mail). Two-dimensional interactivity: talking to someone while both parties work on an application, seeing your interlocutor, talking to him and transferring documents to him for his perusal as the conversation continues apace.

This is no longer science fiction. In less than five years this will be as common as the telephone - and it will have a profound effect on the traditional services provided by the phone companies. Internet phones, Internet videophones - these will be serious competitors, and the phone companies are likely to react once they begin to feel the heat. This will happen when the Internet acquires black-box features. Phone companies, software giants and cable TV operators are likely to end up owning big chunks of the lucrative future market of the Net.

The Solitary Medium
The Internet is NOT a popular medium. It is the medium of affluent executives who fully master the English language, as part of a wider general education.

Alternatively, it is the medium of academia (students, lecturers), or of children of the former, well-to-do group. In any case, it is not the medium of the "wide public". It is also a highly individualistic medium.

The Internet was an initiative of the DOD (the Department of Defence in the USA). It was later "requisitioned" by the National Science Foundation (NSF). This continuous involvement of the administration came to an end in 1995, when the medium was "privatized".

This "privatization" was a recognition of the civilian roots of the Internet. It was - and is still being - formed by millions of information-intoxicated users. They formed networks to exchange bits and pieces of mutual interest. Thus, as opposed to all other media, the Internet was not invented, nor was its market. The inventors of the telephone, the telegraph, the radio, the television and the compact disc - all invented previously non-existent markets for their products. It took time, effort and money to convince consumers that they needed these "gadgets".

By contrast, the Internet was invented by its own consumers and so was the market for it. Only when the latter was fully forged did producers and businessmen join in. Microsoft began to hesitantly test the internet waters only in 1995!

On Line Memories
The Internet is the only medium with an online memory, very much like the human brain. The memories of these two - the Net and the Brain - are immediately accessible. In both, memory is stored in sites and, in both, it neither grows old nor is it eliminated. It is possible to find sites which commemorate events the same way that the human mind registers them. This is Net Memory. The history of a site can be reviewed. The Library of Congress stores the consecutive development phases of sites. The Internet is an amazing combination of data processing software, data, a record of all the activities which took place in connection with the data, and the memory of these records. Only the human brain is brought to mind by this combination of capacities: one language serves all these functions, the language of the neurones.

There is a much clearer distinction even in computers (not to mention more conventional media, such as television).

Raw English - the Language of Raw Materials
The following - apparently trivial - observation is critical: all the other media provide us with processed, censored, "clean" content.

The Internet is a medium of raw materials, partly well organized (the rough equivalent of a newspaper) - and partly still in raw form, yesterday's supper. This is a result of the immediate and absolute access afforded each user: access to programming and site publishing tools - as well as access to computer space on servers. This leads to varying degrees of quality of contents and content providers and this, in turn, prevents monopolization and cartelization of the information supply channels.

The users of the Internet are still undecided: do they prefer drafts or newspapers? They frequent well-designed sites. There are even design competitions and awards. But they display a preference for sites that are constantly updated (i.e. closer in their nature to a raw material rather than to a finished product). They prefer sites from which they can download material to quietly process at home, alone, on their PCs, at their leisure.

Even the concept of "interactivity" points at a preference for raw materials with which one can interact. For what is interactivity if not the active involvement of the user in the creation of content?

Internet users love to be involved, to feel the power at their fingertips; they are all addicted to one form of power or another.

Similarly, a car that is completely automatically driven and navigated is not likely to sell well. Part of the experience of driving - the sensation of power ("power steering") - is critical to the purchase decision.

It is not in vain that the metaphor for using the Internet is "surfing" (and not, let's say, browsing).

The problem is that the Internet is still predominantly an English-language medium (though this is fast changing). It discriminates against those whose mother tongue is different. All software applications work best in English. Otherwise, they have to be adapted and fitted with special fonts (Hebrew, Arabic, Japanese, Russian and Chinese each present a different set of problems to overcome). This situation might change with the attainment of a critical mass of users (some say, 2 million per non-Anglophone country).

Comprehensive (Virtual) Reality
This is the first (though, probably, not the last) medium which allows the user to conduct his whole life within its boundaries.

Television presents a clear division: there is a passive viewer, whose task is to absorb information and subject it to minimal processing. The Internet embodies a complete and comprehensive (virtual) reality, a full-fledged alternative to real life.

The illusion is still in its infancy - and yet already powerful.

The user can talk to others, see them, listen to music, watch video, purchase goods and services, play games (alone or with others scattered around the globe), converse with colleagues or with users who share the same hobbies and areas of interest, and play music together (separated by time and space). And all this is very primitive. In ten years' time, the Internet will offer its users the option of video conferencing (possibly three-dimensional, holographic). The participants' figures will be projected on big screens. Documents will be exchanged: personal notes, spreadsheets, secret counteroffers. Virtual Reality games will become reality in even less time. Special end-user equipment will make the player believe that he actually is part of the game (while still in his room). The player will be able to select an image borrowed from a database and it will represent him, seen by all the other players. Everyone will, thus, end up invading everyone else's private space - without encroaching on his privacy! The Internet will be the medium of choice for phone and videophone communication (including conferencing). Many mundane activities will be conducted through the Internet: banking, shopping for standard items, etc.

The above are examples of the Internet's power and of its ability to replace our reality in due time. A world out there will continue to exist - but, more and more, we will interact with it through the enchanted interface of the Net.

A Brave New Net

The future of a medium in the making is difficult to predict. Suffice it to mention the ridiculous prognoses which accompanied the PC (it is nothing but a gaming gadget, it is a replacement for the electric typewriter, it will be used only by business). The telephone also had its share of ludicrous statements: no one - claimed the "experts" - would want to avoid eye contact while talking. Or television: only the Nazi regime seemed to have fully grasped its potential (in the Berlin 1936 Olympics). And Bill Gates thought that the internet had a very limited future as late as 1995!!!

Still, this medium has a few characteristics which differentiate it from all its predecessors. Were these traits to be continuously and creatively exploited, a few statements could be made about the future of the Net with relative assurance.

Time and Space Independence
This is the first medium in history which does not require the simultaneous presence of people in space-time in order to facilitate the transfer of information. Television requires the existence of studio technicians, narrators and others on the transmitting side - and the availability of a viewer on the receiving side. The phone depends on the simultaneous presence of two or more parties.

With time, tools to bridge the time gap between transmitter and receiver were developed. The answering machine and the video cassette recorder both accumulate information sent by a transmitter - and release it to a receiver in a different space and time. But they are discrete, their storage volume is limited and they do not allow for interaction with the transmitter.

The Internet does not have these handicaps. It facilitates the formation of "virtual organizations / institutions / businesses / communities". These are groups of users who communicate at different points in space and time, united by a common goal or interest.

A few examples:
The Virtual Advertising Agency
An account executive in the USA will manage the account of a hi-tech firm based in Sydney. He will work with technical experts from Israel and with a French graphics office. They will all file their work (through an intranet) on the Net, to be studied by the other members of this virtual group. These will enter the right site after being cleared by firewall security software. They will all be engaged in flexiwork (flexible working times) and work from their homes or offices, as they please. Obviously, they will all abide by a general schedule. They will exchange audio files (the jingle, for instance), graphics, video, colour photographs and text. They will comment on each other's work and make suggestions using e-mail. The client will witness the whole creative process and will be able to contribute to it. There is no technological obstacle preventing the participation of the client's clients, as well.

Virtual Rock'n'Roll
It is difficult to imagine that "virtual" performances will replace real-life ones.

The mass rock concert has its own inimitable sounds, palette and smells. But a virtual production of a record is on the cards, and it is tens of percent cheaper than a normal production. Again, the participants will interact through the intranet. They will swap notes, play their own instruments, make comments by e-mail, and play together using appropriate software. If one of them is seized by inspiration in the middle of (his) night, he will be able to preserve his ideas and pass them on through the Net. The creative process will be aided by novel applications which enable the simultaneous transfer of sound over the Net. The processes which are already digitized (the mix, for one) will pose no problem to a digitized medium. Other applications will let the users listen to the final versions and even solicit the public's opinion in previews.

Thus, even creative processes which are perceived as demanding physical human presence will no longer demand it with the advent of the Net.

Perhaps it is easier to understand a Virtual Law Firm or a Virtual Accountants' Office. In the extreme case, such a firm will have no physical offices at all. The only address will be an e-mail address. Dozens of lawyers from all over the world, with hundreds of specialities, will be partners in such an office. Such an office will be truly multinational and multidisciplinary. It will be fast and effective because its members will electronically swap information (precedents, decrees, laws, opinions, research and plain ideas or professional experience).

It will be able to service clients in every corner of the globe. It will involve the transfer of audio files (NetPhones), text, graphics and video (crucial in certain types of litigation). Today, such information is sent by post and messenger services. Whenever different types of information are to be analysed - a physical meeting is a must. Otherwise, each type of information has to be transferred separately, using unique equipment for each one.

Simultaneity and interactivity - this will be the name of the game on the Internet. The professional term is "Coopetition" (cooperation between potential competitors, using the Internet).

Other possibilities: a virtual production of a movie, a virtual research and development team, a virtual sales force. The harbingers of the virtual university, the virtual classroom and the virtual (or distance) medical centre are here.

The Internet - Mother of all Media
The Internet is the technological solution to the mythological "home entertainment centre" debate.

It is almost universally agreed that, in the future, a typical home will have one apparatus which will give it access to all types of information. Even the most daring did not talk about simultaneous access to all the types of information or about full interactivity.

The Internet will offer exactly this: access to every conceivable type of information simultaneously, the ability to process them all at the same time, and full interactivity. The future image of this home centre is fairly clear - it is the timing that is not. It all depends on the availability of wide (information) bandwidth - through which it will be possible to transfer large amounts of data at high speeds, using the same communications line. Fast modems and optic fibres were deployed - coupled, alas, with faulty planning and little vision of future needs. The cable television industry, for instance, is totally unprepared, technologically, for the age of interactivity. This is only partly the result of unwise, restrictive legislation which prohibits data vendors from stepping on each other's toes. Phone companies were not permitted to provide Internet services or to transfer video through their wires - and cable companies were not allowed to transmit phone calls. It is a question of time until these fossilized remains are removed by the almighty hand of the market. When this happens, the home centre is likely to look like this:
A central computer attached to a big screen divided into windows. Television is broadcast in one window. A software application runs in another. This could be an application connected to the television program (deriving data from it, recording it, collating it with pertinent data it picks out of databases). Or it could be an independent application (a computer game).

Updates from the New York Stock Exchange flash at the corner of the screen and an icon blinks to signal the occurrence of a significant economic event.

A click of the mouse (?) and the news flash is converted to a voice message. Another click and your broker is on the InternetPhone (possibly seen in a third window on the screen). You talk, you send him a fax containing instructions and you compare notes. The fax was composed in a word processing application which opened up in yet another window.

Many believe that communication with the future generation of computers will be voice communication. This is difficult to believe. It is weird to talk to a machine (especially in the presence of other humans). We are seriously inhibited this way. Moreover, voice will interrupt other people's work or pleasure. It is also close to impossible to develop efficient voice recognition software. Not to mention mishaps such as accidental activation.

The Friendly Internet
The Internet will not escape the processes experienced by all other media.

It will become easy to operate - "user-friendly", in professional parlance.

Today, it requires too much specialized knowledge. It is not accessible to those who lack basic hardware and (Windows) software concepts.

Alas, most of the population falls into the latter category. Only 30 million copies of the "Windows" operating system had been sold worldwide by the end of 1996. Even if this constitutes 20% of all the copies in use (the rest being pirated versions) - it still represents less than 3% of the population of the world. And this, needless to say, is the world's most popular software (after the DOS operating system).

The Internet must rely on something completely different. It must have sophisticated, transparent-to-the-user search engines to guide users through the cavernous, chaotic libraries which will typify it. The search engines must include complex decision-making algorithms. They must understand natural languages and respond in everyday speech. They will be efficient and incredibly fast because they will form their own search strategies (supplanting the user's faulty use of syntax). These engines, replete with smart agents, will refer the user to additional data and to cultural products which reflect the user's history of preferences (or the pronounced preferences expressed in answers to feedback questionnaires). All the decisions and activities of the user will be stored in the memory of his search engine and will assist it in designing its decision-making trees. The engine will become an electronic friend, advising the user even on professional matters.
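To make the mechanism concrete, here is a minimal sketch in Python of the kind of preference memory such an engine might keep. The class, the term-counting and the 0.1 weighting are illustrative assumptions of mine, not the workings of any real search engine:

from collections import Counter

class PersonalSearch:
    """Toy search layer that remembers past choices and biases future rankings."""

    def __init__(self):
        # Term frequencies harvested from documents the user chose before.
        self.history = Counter()

    def record_choice(self, document):
        """Store the terms of a chosen document as evidence of preference."""
        self.history.update(document.lower().split())

    def rank(self, query, documents):
        """Order candidates by query overlap plus a small preference bonus."""
        query_terms = set(query.lower().split())

        def score(doc):
            terms = doc.lower().split()
            relevance = sum(1 for t in terms if t in query_terms)
            preference = sum(self.history[t] for t in terms)
            return relevance + 0.1 * preference  # small bias toward known tastes

        return sorted(documents, key=score, reverse=True)

engine = PersonalSearch()
engine.record_choice("jazz festival tickets in Montreal")
print(engine.rank("festival tickets",
                  ["rock festival tickets in Berlin",
                   "jazz festival tickets in Montreal - full lineup"]))
# The jazz result floats to the top because of the stored preference.

The point of the sketch is only the feedback loop: every recorded choice slightly reshapes the next ranking, which is the "electronic friend" behaviour described above.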

Cease-Fire
The cessation of hostilities between the Internet and some off-the-shelf software applications heralds the commencement of the integration of the desktop computer and the Net. This is a small step for the user - and a big one for humanity. The animosity which prevailed until recently between UNIX systems and the HTML language on the one hand, and most of the standard applications (headed by the word processors) on the other - has officially ended with the introduction of Office 97, which incorporates full HTML capabilities. With the Office 2000 products, the distinctions between a web computing environment and a PC computing one have all but vanished. Browsers can replace operating systems, word processors can browse, download and upload - the PC has finally been entirely absorbed by its offspring, the Internet.

The Portable Document Format (PDF) enables the user to work with Net content off-line. In other words: text files will be loaded into word processors and edited off-line. The same applies to other types of files (audio, video). Downloading time will be shortened (today, it takes so long to download an audio or video file that it is often impracticable).

This is not a trivial matter. The ability to switch between on-line and off-line states and to continue working, uninterrupted - this ability amounts to the integration of the PC into the Internet.

There are two competing views concerning the future of computer hardware and both of them acknowledge the importance of the Internet.

Bill Gates - Microsoft's legendary boss - says that the PC will continue to advance and to strengthen its processing and computing powers. The Internet will be just another tool, accessed through telecommunications rather than through the ownership of local copies of software and data. The Internet is perceived as a tremendous external database, available for processing by tomorrow's desktops. This view has lately been gradually reversed, in view of the incredible vitality and power of the Internet.

Gates is converging on the worldview held by Sun Microsystems. The future desktop will be a terminal, albeit powerful and with considerable processing, computing and communications capabilities. The name of the game will be the Internet itself. The terminal will access Internet databases (containing raw or processed data) and satisfy its information needs.

This terminal - equipped with languages the likes of Java - will tap into libraries of software applications. It will make use of components of different applications as the need arises. When finished with a component, the terminal will "return" it to the virtual "shelf" until the next time it is needed.
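As a rough local illustration of this "component on loan" model, here is a minimal Python sketch. It assumes components are ordinary importable modules; the module name used below merely stands in for a rarely used application component that would, in the vision above, be fetched from the Net rather than from the local machine:

import importlib
import sys

def borrow_component(name):
    """Load a component only when it is needed (fetch it 'from the shelf')."""
    return importlib.import_module(name)

def return_component(name):
    """Release the component from memory (put it 'back on the shelf')."""
    sys.modules.pop(name, None)

# 'json' stands in here for a rarely used application component.
component = borrow_component("json")
print(component.dumps({"invoice": 42}))
return_component("json")

The design choice being illustrated is simply that nothing is resident until it is needed, and nothing stays resident once the need has passed.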

This will minimize the memory resources required of the desktop. The truth, as always, is probably somewhere in the middle. Tomorrow's computer will be a home entertainment centre. No consumer will accept total dependence on telecommunications and on the Net. They will all demand processing and computing power at their fingertips, à la Bill Gates.

But tomorrow's computer will also function as a terminal when needed: when retrieving data, or even when using non-standard software applications. Why purchase rarely used, expensive applications - when they are available, for a fraction of the cost, on the Net?

In other words: no consumer will subjugate his frequent word processing needs to the whims of the local phone company, or to those of the site operator. That is why every desktop is still likely to include hard-disk (or optical-disk) resident word processing software. But very few will buy CAD-CAM, animation, graphics, or publishing software which they are likely to use infrequently. Instead, they will access these applications, resident on the Net, and use only those parts that are needed. This is usage tailored to the client's needs. This is also the integration of a desktop (not of a terminal) with the Net.

Decentralized Lack of Planning
The course adopted by content creators (producers) in the last few years proves the maxim that it is easy to repeat mistakes and difficult to derive lessons from them. Content producers are constantly buying channels through which to distribute their content. This is a mistake. A careful study of the history of successful media (e.g., television) points to a clear pattern: content producers do not grant life-long exclusivity to any single channel - especially not by buying into it. They prefer to contract for a limited time with the distribution channels (their broadcasters). They work with all of them, sometimes simultaneously.

In the future, the same content will be sold on different sites or networks, at different times. Sometimes it will be found with a provider which is a combination of cable TV company and phone company - at other times, with a provider whose expertise is in computer networks. Much content will be created locally and distributed globally - and vice versa. The repackaging of branded content will be the name of the game for both the media firms and the firms which control content distribution (the channels).

No exclusivity pact will survive. Networks such as CompuServe are doomed and have been doomed since 1993. The approach of decentralized access, through numerous channels, to the same information - will prevail.

The Transparent Language
The Internet will become the next battlefield between the "have" countries and the "have-not" countries. It will be a cultural war zone (English against French, Japanese, Chinese, Russian and Spanish). It will be politically charged: those wishing to restrict the freedom of speech (authoritarian and dictatorial regimes, governments, conservative politicians) against free-speech advocates. It will become a new arena of warfare and an integral part of actual wars.

Different peer groups, educational and income strata, ethnic groups and sexual-preference groups - all will fight on the eternal battlefields of the Internet.

Yet, two developments are likely to pacify the scene. Automatic translation applications (like Accent and the AltaVista translation engines) will make every bit of information accessible to all. The linguistic (and, by extension, ethnic or national) source of the information will be disguised. A feeling of a global village will permeate the medium. Ignorance of the English language will no longer hinder one's access to the Net. Equal opportunities.

The second trend will be new methods of classifying content on the Net, together with the availability of chips intended to filter out offensive information. Obscene material will not be available to tender souls. Anti-Semitic sites will be blocked for Jews, and communists will be spared Evil Empire speeches. Filtering will usually be done using extensive and adaptable lists of keywords or key phrases.
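A minimal sketch in Python of the kind of keyword filtering just described - the blocklist below is an invented placeholder, and a real filter would use far larger, adaptable lists:

import re

BLOCKED_PHRASES = ["example banned phrase", "another banned term"]  # placeholder list

def is_blocked(page_text, blocklist=BLOCKED_PHRASES):
    """Return True if the page contains any blocked keyword or key phrase."""
    text = page_text.lower()
    return any(re.search(r"\b" + re.escape(phrase) + r"\b", text)
               for phrase in blocklist)

print(is_blocked("A harmless page about gardening."))                  # False
print(is_blocked("This page repeats an example banned phrase twice."))  # True

The weakness is also visible in the sketch: whatever is not on the list passes, and whatever is on the list is blocked regardless of context - which is exactly how such filters create the "cultural ghettos" mentioned below.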

This will lead to the formation of cultural Internet Ghettos - but it will also considerably reduce tensions and largely derail populist legislative efforts aimed at curbing or censoring free speech.

Public Internet - Private Internet
The day is not far when every user will be able to define his areas of interest, his order of priorities, his preferences and tastes. Special applications will scour the Net for him and retrieve the material befitting his requirements. This material will be organized in any manner prescribed. A private newspaper comes to mind. It will have a circulation of one copy - the user's. It will borrow its contents from a few hundred databases and electronic versions of newspapers on the Net. Its headlines will reflect the main areas of interest of its sole subscriber. The private paper will contain hyperlinks to other sites on the Internet: to reference material, to additional information on the same subject. It will contain text, but also graphics, audio, video and photographs. It will be interactive and editable with the push of a button.
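A minimal sketch in Python of how such a one-reader front page might be assembled - the sources, headlines and interests are invented placeholders, and a real application would pull from live databases rather than a hard-coded list:

from dataclasses import dataclass

@dataclass
class Headline:
    source: str
    title: str

def personal_front_page(headlines, interests):
    """Keep only the headlines that touch on at least one declared interest."""
    wanted = [term.lower() for term in interests]
    return [h for h in headlines
            if any(term in h.title.lower() for term in wanted)]

feed = [Headline("Wire A", "Central bank raises interest rates"),
        Headline("Wire B", "Local team wins the cup"),
        Headline("Wire C", "New chess engine defeats champion")]

for item in personal_front_page(feed, ["interest rates", "chess"]):
    print(item.source, "-", item.title)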

Another idea: the intelligent archive. The user will accumulate information, derived from a variety of sources, in an archive maintained for him on the Net. It will not be a classical "dead" archive. It will be active. A special application will search the Net daily and update the archive. It will contain hyperlinks to sites, to additional information on the Net and to alternative sources of information. It will have a "History" function which will teach the archive about the preferences and priorities of the user.

The software will recommend to the user new sites and subjects similar to those in his history. It will alert him to movies, TV shows and new musical releases - all within his cultural sphere. If he is convinced to purchase, the software will order the wares from the Net. It will then let him listen to the music, see the movie, or read the text.

The Internet will become a place of unceasing stimuli, of internal order and organization, and of friendliness in the sense of personally rewarding acquaintance. Such an archive will be a veritable friend. It will alert the user to interesting news, leave messages and food for thought in his e-mail (or v-mail). It will send the user a fax if he does not respond within a reasonable time. It will issue reports every morning.

This, naturally, is only a private case of the archival potential of the Net.

A network connecting more than 16.3 million computers (at the end of 1996) is also the biggest collective memory effort in history since the Library of Alexandria. The Internet possesses the combined power of all its constituents. Search engines are, therefore, bound to be replaced by intelligent archives which will coalesce into universal archives, storing all the paths to past search results plus millions of recommended searches.

Compare this to a newspaper: it is much easier to store back issues of a paper on the Internet than physically. Obviously, it is much easier to search, and a digital copy does not wear out or depreciate. Such an archive will let the user search by word, by key phrase, by content, search the bibliography, and hop to other parts of the archive or to other territories on the Internet using hyperlinks.

Money, Again

We have already mentioned SET, the security standard. It will facilitate credit card transactions over the Net. These are safe transactions even today - but there is an ingrained interest in claiming otherwise. Newspapers are afraid that advertising budgets will migrate to the Web. Television harbours the same fears. More commerce on the Net means more advertising dollars diverted from established media. Too many feel unhappy when confronted with this inevitability. They spread lies which feed off the public's ignorance about how safe paying with a credit card on the Net really is. Security standards will terminate this propaganda and transform the Internet into a commercial medium.

Users will be able to buy and sell goods and services on the Net and receive them by post. Certain things will be downloaded directly (software, e-books). Many banking transactions and EDI operations will be conducted through bank-client intranets. All stock and commodity exchanges will be accessible and the role of brokers will be minimized. Foreign exchange will be easily tradable and transferable. Initial Public Offerings of shares, day trading of stocks and other activities traditionally connected with physical ("pit") capital markets will become a predominant feature of the Internet. The day is not far when the likes of Merrill Lynch will offer full services (including advisory services) through the Internet. The first steps towards electronic trading of shares (with discounted fees) were already taken in mid-1999. Home banking, private newspapers, subscriptions to cultural events, tourism packages and airline tickets - all are candidates for Net trading. The Internet is here to stay.

Commercially, it would be an extreme strategic error to ignore it. A lot of money will flow through it. A lot more people will be connected to it. A lot of information will be stored on it.

It is worth being there. Published by "PC World" in Tel-Aviv in April 1996. Partially revised: 7/00.

Appendix - Ethics and the Internet

The "Internet" is a very misleading term. It's like saying "print". Professional articles are "print" - and so are the sleaziest porno brochures.

So, first, I think it would be useful to make a distinction between two broad categories:
Content-driven (or content-related)
and
Interaction-driven

Most content-driven sites maintain reasonable ethical standards, roughly comparable to those of the "real" or "non-virtual" media. This is because many of these sites were established by businesses with a "real" dimension to start with (Walt Disney, The Economist, etc.). These sites (at least the institutional ones) maintain standards of privacy, veracity, cross-checking of information, etc.

Personal home pages would be a sub-category of content-driven sites. These cannot be seriously considered "media". They are representatives of the new phenomenon of extreme narrowcasting. They do not adhere to any ethical standards, with the exception of those upheld by their owners. The interaction-oriented sites and activities can, in turn, be divided into e-commerce sites (such as Amazon), which adhere to commercial law and to commercial ethics, and interactive sites proper.

The latter - discussion lists, mailing lists and so on - are a hotbed of unethical, verbally aggressive, hostile behaviour. A special vocabulary has developed to describe these phenomena ("flaming", "mail bombing", etc.).

To summarize:
Where the aim is to provide consumers with another venue for the dissemination of information, or to sell them products or services, the ethical standards maintained reflect those upheld outside the realm of the Internet. Additionally, codified morals - commercial law, above all - are adhered to.

Where the aim is interaction, or the dissemination of the personal opinions and views of site owners, ethical standards are still in the making. A rough set of guidelines has coalesced into "netiquette" - a set of rules of peaceful co-existence intended to prevent flame wars and the eruption of interpersonal verbal abuse. Since it lacks effective means of enforcement, it is very often violated and constitutes an expression of goodwill rather than a binding code.