Tuesday, August 23, 2011

Multimedia Broadcasting and the Internet

Introduction
 
There is a synergy between broadcasting and the Internet which may well be a critical success factor for future growth. Clearly, the Internet has been on a fast growth curve. As we look forward, a number of questions come to mind regarding the future of the Internet as more companies target the consumer market. How far will Internet-based services penetrate into the consumer marketplace? What are the most feasible ways to make Internet resources and services available to the most people? How far can we expect people to change current information and entertainment behaviors?
Even with all the speculation and initial success of the Internet and online services generally, the main conduit into our lives for electronically distributed information and entertainment will remain radio and television broadcasting. In terms of hours of consumption and universal availability, the broadcast infrastructure is the major electronic pipeline for individuals. Broadcasting services are available through various platforms--wired and wireless, mobile and fixed site. The familiar form broadcasting now takes will undergo a metamorphosis, integrating seamlessly with other parts of the information infrastructure to offer exciting new services. This can help extend and enhance the current means for providing access to Internet resources.
Broadcasting as we think of it today provides for the transport of linear audio and video programming in real time. The basic premise of the broadcasting model has not changed in more than 70 years. We turn on our TVs and radios at particular moments in time to gain access to one program at a time per channel, with prescheduled start and end times. We expect radio programming to be audio and television programming to be video. Other than turning receivers on or off and tuning among frequencies, we have no expectation of affecting what is presented to us on our speakers and screens. Virtually all of the content offered by radio and TV stations today is available only in this manner.
The move to multimedia broadcasting substantially changes this premise. Multimedia broadcasting will develop in three ways. First, even today's analog transmission technology supports the transport of multiple data types. This means that more than the traditional real-time, linear, and prescheduled forms of audio and video programming will be available. Second, while some of this data will be related to the main channel programming (i.e., conventional radio and television programming), other data wholly unrelated to conventional programming will be transported. And third, broadcast applications will interoperate with other non-broadcast client-server applications.
Prospects for new and expanded market segments to be developed by multimedia broadcast and Internet-related services are analyzed here. An analysis of the convergence marketplace in terms of the enabling technology, policy, markets, and business missions is presented. Models of how radio and television broadcasters can organize around the opportunities enabled by the Internet are considered. Examples of current and planned broadcaster activities and services are presented. Finally, some thoughts on what it will take to succeed are offered.

The convergence marketplace

The convergence marketplace is typically presented as a more or less technology-driven collision among the entertainment, telecommunications, computer, and media marketplaces. Evidence of this can be seen as companies in these various industries engage in buyouts, mergers, and alliances. On this level, convergence basically means that it is harder to tell the players apart as they tend to start wearing the same company uniforms.
Discussions of convergence have a habit of focusing on the technology. This precipitates relatively futile polemics like: "Are TVs becoming more like computers or are computers becoming more like TVs?" (By observation, the answer appears to be "Yes.") However, there are at least four formative factors in the convergence marketplace: (1) technology, (2) policy, (3) markets, and (4) business missions. Beyond this, one might speak of six structural economic factors of supply (content, processing, distribution) and demand (choice, convenience, control).

The technology of convergence

One might characterize convergence at the technology level with four essential criteria describing an idealized infrastructure. At the technology level, we are talking about that which supports or is characterized by: (1) open systems; (2) layered architectures; (3) digital signal processing; and (4) platform independence (i.e., true connectivity and interoperability). "Open systems," or those with publicly defined protocols, offer scale and scope advantages to the marketplace. Closed or proprietary systems, while securing private advantage, may unnecessarily limit the market (unless the proprietary technology can become a de facto standard).
The enabling technology of convergence is layered architecture which supports connectivity and interoperability. This is the electronic and logical glue which binds the various parts and pieces of the information infrastructure, ranging from telephone lines and cable television to remote control devices, computers, TVs, set top boxes, modems, games, software, application interfaces, and so on.
The International Organization for Standardization's (ISO) Open Systems Interconnection (OSI) reference model has proven a useful tool for proprietary and public implementations of layered architectures. The ISO's OSI reference model defines seven layers ranging from physical connections to transport to logical applications. The seven layers are: (1) physical, (2) data link, (3) network, (4) transport, (5) session, (6) presentation, and (7) application.
While the OSI reference model is admittedly an idealized architecture, it has proven useful to those developing proprietary layers which must work with publicly defined layers. For example, while WinSock is a publicly defined application program interface (API), various proprietary implementations of Transmission Control Protocol/Internet Protocol (TCP/IP) stacks must interoperate with this interface. The TCP/IP architecture consists of four layers: the (1) network access, (2) Internet, (3) host-to-host transport, and (4) application layers. While each of these layers has a public definition, there are proprietary improvements within this TCP/IP protocol architecture.
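As an illustration (a sketch added here, not from the original article), the correspondence between the seven OSI layers and the four TCP/IP layers can be expressed as a simple lookup table. The grouping below is the common textbook approximation; real implementations do not always split cleanly along these lines.

  # A common textbook mapping of the seven OSI layers onto the four
  # TCP/IP layers described above; the grouping is approximate.
  OSI_LAYERS = [
      "physical", "data link", "network", "transport",
      "session", "presentation", "application",
  ]
  TCPIP_FROM_OSI = {
      "physical": "network access",
      "data link": "network access",
      "network": "Internet",
      "transport": "host-to-host transport",
      "session": "application",
      "presentation": "application",
      "application": "application",
  }
  for layer in OSI_LAYERS:
      print(f"OSI {layer:<12} -> TCP/IP {TCPIP_FROM_OSI[layer]}")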
The philosophy of layered architecture models is that each layer can be implemented relatively independently of the other layers. In other words, developers need not worry about the whole vertical stack of layers but can focus resources on just the subset of layers most relevant to their product or service. As a matter of technology, this opens up many new possibilities. Until recently, policy constrained the marketplace from developing much of the promise technology offers. For example, while cable systems could connect to and interoperate with telephone exchanges to provide voice transport, this was not permitted in most states.

Public policy and marketplace convergence

As with technology, public policy can raise or lower barriers to marketplace entry and performance. The Telecommunications Act of 1996 is now the main policy lowering or eliminating many of the barriers to a convergence marketplace (a version of the act is available on the Electronic Frontier Foundation's Web site at http://www.eff.org/pub/Alerts/s652_hr1555_96.act). In particular, cable companies are permitted into the telecommunications services marketplace, telephone companies may enter into the video services marketplace as "open video systems" common carriers, and television broadcasters may enter into the provision of digitally delivered services ancillary and supplementary to the main television service.
A key potential barrier for television broadcasters is which spectrum plan the government will enact. New spectrum is required for broadcasters to begin digital service. For a transition period, both analog and digital transmission would operate in parallel before phasing out the analog service. Some government proposals would require broadcaster payment for the spectrum to be used for digital transmissions. Broadcasters currently pay no spectrum fees for analog services. The quid pro quo for no spectrum fee has always been free over-the-air service in the public interest, convenience, and necessity. Spectrum fees would inevitably impact this economic model of broadcasting.
As digital signal transport providers, radio and television broadcasters can actively participate not only in the content segment of the marketplace but also in the transport and processing segments. "Broadcasting" as we commonly know it today will change. There is nothing inherent in the technology or the public policy that confines "radio" to talk shows, news, sports, weather, and music. Nor are there similar limits constraining "television" to syndicated programs, news, movies, or sports. "Broadcasting" will become a means of digitally transporting multimedia objects, some of which will be resolved by standard receivers as traditional broadcast fare. However, other bits will be intended for receipt by computers, personal digital assistants, and other smart devices. This includes HTML pages and other Internet resources. Policy will permit these "ancillary" private data uses. Now the question is, to what extent will entrepreneurial and market forces encourage and support broadcaster entry into the multimedia marketplace?

Markets and business missions

As technology literacy rises, so will product and service expectations. Consumers are becoming less satisfied with the current crop of offerings in terms of price/performance and ease of use. Consumers are trained to continually expect more performance and power features for less money and effort. As younger and more technically capable consumers come of age, this trend will accelerate. The marketplace will demand ever more sophisticated, cost-effective (which is not necessarily the same as cheap--people will spend money to save time), and convenient-to-use services and products.
That's the demand side. The supply side is seeing an influx of would-be success stories. The mix of products and services, while exploding, is beginning to rely on similar and usually compatible technologies. This allows the entry of more kinds of companies. They have but to redefine their business missions to open up whole new marketplaces. For example, utility companies around the world are using "new media math" to learn about economies of scope. They are beginning to see that their public rights of way, copper lines to every household, billing systems, and technical staff can be redeployed to encompass an entirely new marketplace--cable television! Similarly, cable is getting into telephone, telephone is getting into video, and everybody is getting into the Internet.

Economic structure of convergence--supply and demand

On the supply side, convergence has three product and service groups--content, transport, and processing. The essence of convergence is that the marketplace can supply products and services in any of the groups independent of the other two. In other words, content developers need only worry about content, not how it will be stored, processed, transported, compressed, retrieved, or linked and embedded. Those in the transport business should have to worry only about hauling bits and should not have to be concerned with the underlying logic, applications functionality or data types which are encoded in any bit stream. And finally, processing (read/write, compression, error detection/correction, etc.) should be possible regardless of the transport mode or data type. While this initial description of a convergence marketplace is technocentric, it has enormous implications for the structure and conduct of previously distinct industry segments.
Of course, companies vertically integrating across these stages of production can achieve economies of scale and scope. This realization has fueled much of the merger and acquisition activity in the telecommunications and media segments. Economies of scale refer to the ability to reduce average unit costs by expanding the size of operations to produce one product. Economies of scope refer to a firm's ability to produce two or more products at a lower cost than two separate firms could by taking advantage of synergies.
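Stated formally (a standard textbook condition, not given in the original article), a cost function C exhibits economies of scope across two products when joint production costs less than separate production:

  C(q1, q2) < C(q1, 0) + C(0, q2)

where q1 and q2 are the output quantities of the two products. Much of the merger and acquisition activity noted above can be read as a bet that this inequality holds across the content, transport, and processing segments.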
Companies primarily involved in the transport business see that their product, bandwidth, will fall prey to commodity pricing as the implications of connectivity and interoperability hit home. In other words, if a content provider is trying to move a software application, say an interactive video game, from a server (point A) to a client in a consumer household (point B), there are a number of competitive modes of transport which offer essentially the same bit-hauling service. Why shouldn't this content provider simply go with the cheapest price for hauling bits from point A to point B? On the other hand, content and processing products and services are likely to be priced on value-added concepts ("My video game is more compelling or exciting than yours").
Consumers in the convergence marketplace tend to be looking for value in at least three major demand areas: choice, convenience, and control. Consumers want choice in content, applications, and transactions. They want convenience: the ability to use information and entertainment products and services whenever the urge strikes them, wherever they are. And they want control: the ability to access and move information effectively and securely, which has implications for interfaces, browsers, navigation systems, search engines, and so on.

Broadcasting and the convergence marketplace

The broadcasting industry has a role in all three venues of the convergence marketplace--content, distribution, and processing. In terms of content, broadcasting is already the dominant supplier of electronic content to the public. The average American consumes over 2,000 hours of radio and television programming each year, compared to fewer than 90 hours of online time. Admittedly, this is so-called "linear" or real-time, one-way programming, with limited processing and control options. Yet, clearly this programming satisfies a market demand.
Utilizing new processing technologies, much of this content repository can be repurposed to become interactive multimedia content. In fact, several of the major networks and broadcast groups are doing just this with their news and entertainment intellectual property. This content can certainly be served in the form of digital signals transmitted over the airwaves. Additionally, in the spirit of convergence, the content can also be served via Internet connections, particularly the World Wide Web. Over 400 commercial television stations and nearly 1,000 radio stations have established their own Web sites and are actively developing content to be published in Hypertext Markup Language (HTML) format for online access.
Moving beyond the content role, broadcasting also has a role in the distribution segment of the convergence marketplace. The broadcast infrastructure is particularly well suited as a transport component in a client-server architecture. There is a natural asymmetry in the bandwidth requirements of client-server applications. Client requests are usually narrow bandwidth, whereas responses from servers tend to consume more bandwidth.
For example, in a Web browser, a single click on a hotlink often precipitates megabytes of transfer as HTML pages, files, or multimedia objects move from the server to the requesting client. By utilizing high-speed digital broadcast signals for the downstream portion of some services, in combination with other media for the back channel, broadcasting can become a value-added participant in the Internet services marketplace by adding what is essentially a ubiquitous, wireless backbone with up to 20 megabits per second of downstream capacity.
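To make the asymmetry concrete, here is a minimal sketch (the request and payload sizes are illustrative assumptions, not measurements) comparing idealized transfer times for a small upstream request against a multimegabyte downstream response over a 28.8 kbps modem and a 20 Mbps broadcast channel:

  # Idealized transfer times for an asymmetric client-server exchange.
  # Sizes and rates are assumptions for illustration only; protocol
  # overhead and contention are ignored.
  REQUEST_BYTES = 500              # a mouse click: a small upstream request
  RESPONSE_BYTES = 3 * 1024 ** 2   # a multimedia object of about 3 MB

  def transfer_seconds(num_bytes, bits_per_second):
      return num_bytes * 8 / bits_per_second

  print(f"request,  28.8 kbps up:   {transfer_seconds(REQUEST_BYTES, 28_800):8.2f} s")
  print(f"response, 28.8 kbps down: {transfer_seconds(RESPONSE_BYTES, 28_800):8.2f} s")
  print(f"response, 20 Mbps down:   {transfer_seconds(RESPONSE_BYTES, 20_000_000):8.2f} s")

In this example the downstream payload is several thousand times larger than the upstream request, which is what makes a one-way, high-capacity broadcast channel such a natural complement to a narrow telephone return path.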
In terms of processing, by empowering the consumer with choice, control, and convenience, broadcasters create value in the multimedia marketplace. The business proposition broadcasters now offer the consumer is becoming outmoded. Essentially, the proposition is that "we have some content we think you will like. We'll give it to you for free but you have to view it or listen to it when we decide to broadcast it. If you miss it and didn't record it, you'll have to either wait until the re-run season or live without it." This does not give the consumer as much choice, control, or convenience as emerging alternatives. Broadcasters are now beginning to explore new ways to use both analog and digital transmissions in conjunction with the Internet to enrich this basic business proposition.

Multimedia broadcasting

Multimedia broadcasting or datacasting refers to the use of the existing broadcast infrastructure to transport digital information to a variety of devices (not just PCs). While the existing infrastructure of broadcast radio and television uses analog transmissions, digital signals can be transmitted on subcarriers or subchannels. Also, both the radio and television industries have begun a transition to digital transmissions.
Multimedia broadcasting will be developed in three basic dimensions. First, datacasting supports the transport of multiple data types. This means that more than the traditional real-time, linear, and prescheduled forms of audio and video programming will be available. Broadcast programming will become richer and more involving by expanding its creative palette to encompass different data types and leveraging the processing power of intelligent receivers and PCs on the client side. Second, while some of this data will be related to the main channel programming (i.e., conventional radio and television programming), other data wholly unrelated to conventional programming will be transported. And third, broadcast applications will interoperate seamlessly with other non-broadcast client-server applications such as World Wide Web sessions.
The essential characteristics of multimedia broadcasting include:
  • digital data stream;
  • asynchronous;
  • bandwidth asymmetry;
  • downstream backbone;
  • high speed (up to 20 Mbps);
  • universal access;
  • low cost;
  • layered architecture;
  • wireless;
  • mobile and fixed service; and
  • existing infrastructure.
As noted earlier, most client-server applications are asymmetric in their bandwidth requirements. As such, the public switched telephone network, through which most Internet users get connected, is an inferior transport solution. State-of-the-art modems achieve 28.8 kbps bidirectional performance. For multimedia content, including streaming audio and video, this is not very satisfactory.
Hoping to capitalize on the presumed demand for more bandwidth to the home, both cable and telephone companies actively planned fiber optic build-outs in their distribution plants. This would have supported broadband capacity to and from the home. Given the market turn away from video-on-demand products (at least for the moment) and towards the Internet, a new economic picture emerges.
There is no particular reason to optimize a network for symmetric performance (as with a fiber optics build-out) when upstream and downstream bandwidth can be allocated more efficiently with existing infrastructures. Indeed, the telephone industry's promising answer to repurposing much of its existing copper plant is a rejuvenating technology known as asymmetric digital subscriber line (ADSL). This technology supports a minimum configuration of a downstream data rate of at least 1.544 Mbps (i.e., equivalent to a T-1 rate) and an upstream data rate of 16 kbps. (For further information, visit the ADSL Forum at http://www.sbexpos.com/adsl/adsl_tech.html.)
Television broadcasters could transmit data such as HTML pages, Java applets, Virtual Reality Modeling Language (VRML) objects, streaming audio and video, or binary file transfers in their broadcast signals for capture and display by smart TV receivers, PCs, personal digital assistants, or network computers. Consumers could select from this data stream only that which is interesting to them. Further, they could activate agents or filters to grab and store particular resources of interest, as sketched below. This data could be accessed later, at their convenience, rather than having to be grabbed and displayed in real time from the data stream currently being broadcast.
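A minimal sketch of this agent/filter idea (the object fields, topics, and interest list are hypothetical, chosen only to illustrate the mechanism):

  # Sketch: a client watches a one-way broadcast data stream and stores
  # only the objects matching the user's interests, for later viewing.
  from dataclasses import dataclass

  @dataclass
  class BroadcastObject:
      content_type: str   # e.g., "text/html", "video/mpeg"
      topic: str          # e.g., "weather", "sports"
      payload: bytes

  INTERESTS = {"weather", "local-news"}   # set by the user; nothing is sent upstream

  def on_object_received(obj, storage):
      """Called for every object in the stream; keep only matches."""
      if obj.topic in INTERESTS:
          storage.append(obj)   # saved for access at the user's convenience

  # Simulate a few objects arriving in the broadcast stream.
  stored = []
  stream = [
      BroadcastObject("text/html", "weather", b"<html>forecast...</html>"),
      BroadcastObject("video/mpeg", "sports", b"..."),
  ]
  for obj in stream:
      on_object_received(obj, stored)
  print(f"{len(stored)} object(s) captured from the stream")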
The model so far assumes a one-way data stream. What is interesting about this scenario is that, while it is one-way from a technical viewpoint, the choice and control it affords make it appear interactive to the client. This adds value. Another interesting point is that, since the telecommunications technology supporting this service is one-way, it can be available to remote and mobile users. There is no need to be tethered to any kind of wire to enjoy the benefits of this implementation of Internet service.
Of course, consumers will want more data than the broadcast station can preselect for transmission over its facilities. The service model can be extended in a couple of ways. First, using a return link like a telephone (wired or wireless), the consumer can request a specific set of resources to be transmitted. Depending on the file sizes, the client request could be served via the phone link, or the request could be forwarded to the broadcast server, which then loads the appropriate resources into the data broadcast queue for reception by the client.
A second model is that the client could have his or her Web browser connected to an Internet Service Provider (ISP) with the appropriate enabling technology to permit a multimedia connection to the Internet. In this scenario, the user is sitting at home surfing the Web and goes to a Web site with many video objects. Until now, the Internet connection has been supported by a dial-up connection. When the user clicks on a 5-minute video news clip, the session software notes that this is a 3-Mbyte file and, instead of pushing it through the 28.8 kbps phone connection, shunts it to a local broadcast station which loads it into the data broadcast queue. The large video file arrives at the client workstation in seconds instead of minutes.
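A sketch of the decision logic such session software might apply (the shunt threshold and data rates are assumptions for illustration; the 3-Mbyte clip is the example from the text):

  # Hypothetical path selection for one requested object: small files go
  # over the dial-up link, large ones are shunted to the broadcast queue.
  DIALUP_BPS = 28_800
  BROADCAST_BPS = 20_000_000
  SHUNT_THRESHOLD_BYTES = 256 * 1024   # assumed cutoff for illustration

  def choose_path(file_size_bytes):
      """Return the downstream path for a single requested object."""
      if file_size_bytes > SHUNT_THRESHOLD_BYTES:
          return "broadcast"   # forward the request to the station's data queue
      return "dialup"

  clip_bytes = 3 * 1024 ** 2   # the 3-Mbyte news clip from the example above
  path = choose_path(clip_bytes)
  rate = BROADCAST_BPS if path == "broadcast" else DIALUP_BPS
  print(f"{clip_bytes} bytes via {path}: about {clip_bytes * 8 / rate:.1f} seconds")

Run as written, this routes the clip to the broadcast path and estimates a transfer of roughly 1.3 seconds, versus nearly 15 minutes over the dial-up link.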
Broadcasting is a universally available service, available to more of the U.S. population than any other telecommunications service. The infrastructure already supports a viable enterprise; any additional revenues coming from Internet-related business would be a value-added use of the infrastructure. This means new business ventures can begin sooner and have a shorter return on investment cycle because of a lower cost structure. The major costs are incremental and relate to client premises equipment (e.g., PC cards, set top boxes, etc.) and server environments (multimedia servers, telecommunications, Internet access, and data injectors to modulate the transmitted broadcast signal). The "network" infrastructure itself is largely built and economically viable on its own (e.g., the system of towers, transmitters, receiving antennas, etc.).

Digital broadcasting standard setting

The digital television standard for the U.S. will likely be that which the Advanced Television Systems Committee (http://www.atsc.org) finalized last fall for submission to the Federal Communications Commission (http://www.fcc.gov). Among other things, it specifies MPEG-2 for video compression and for the transport layer. The standard can support high definition television (HDTV) (i.e., the Grand Alliance implementation), multiple standard definition television (SDTV) program streams, and so-called ancillary or private data applications. The data applications could be broadcast facsimile transmissions, multimedia pager data bursts, HTML pages, etc.
The exact timing for the FCC's selection of a digital television or for that matter a digital radio standard is unknown. However, consumers can expect to have these services available in the near future. Both radio and television services will run parallel analog and digital services for a transition period. This means today's receivers will not be instantly obsolete but over time, consumers will need to buy new receivers, PC cards, etc.

Other standard-setting activities

There are several standard-setting activities which have the objective of supporting datacasting within both the current analog and emerging digital broadcasting infrastructures (http://www.nab.org/scitech/files/standard.htm). This includes developing standards for analog television (National Data Broadcasting Committee), digital television (http://www.atsc.org), and analog AM and FM radio.

National Data Broadcasting Committee (NDBC)

Even today's analog broadcasting signals can transport digitally encoded signals on subcarriers at competitive data rates. The capacity of television vertical blanking interval lines or active video lines is at least 500 kbps. This capacity will increase dramatically and can be more dynamically allocated after the transition to digital broadcasting. To help develop a standard, the National Data Broadcasting Committee was formed (for more information, visit NAB's Web site on this topic at http://www.nab.org/scitech/files/standard.htm).
The NDBC is jointly sponsored by the National Association of Broadcasters (NAB) and the Electronic Industries Association (EIA). The purpose of the NDBC is to develop voluntary national technical standards for high-speed data broadcasting using the NTSC television service as a delivery medium. Such standards are intended to lead to the development of a Data Broadcasting Service for the over-the-air delivery of database information services, bringing the U.S. public the benefits of new broadcast technologies and services while preserving the public's ability to continue receiving existing services with present equipment.

High-Speed FM Subcarrier Committee

The standards work for radio revolves around radio data systems technology (for more information visit http://www.rds.org.uk/rdsfcontents.html) and high-speed FM subcarrier technology (http://www.nab.org/scitech/files/standard.htm). Radio is also transitioning to digital audio broadcasting (DAB). The DAB standard will be set by the Federal Communications Commission in the near future.

Multimedia broadcasting--some examples of early entrants

Intercast Industry Group

The Intercast Industry Group (http://www.intercast.org) is a nonprofit alliance of computer, television, cable, and online companies creating a "new" medium which combines television and the Internet on the PC. Their charter is to "develop technology, PC platforms, software applications, and content for broad market deployment." Their goal is to "create an industry accepted, open medium which will spawn industry-wide implementations." According to Intercast, their technology is a transport standard for digital data associated with TV signals.
Intercast technology (i.e., Internet + Broadcast = Intercast) is a means for transmitting HTML pages over broadcast television signals. The HTML pages may or may not be related to the traditional television programming. Links embedded in the HTML pages can be supported if the Intercast-equipped PC also has a modem link for full interactivity.
For example, one implementation of this technology is that while a program or commercial is being broadcast, related HTML pages could be simultaneously broadcast to PCs. These pages could contain more details about the program, related information, scheduling information, plot summaries, or promotional material. For commercials, maps showing local dealers, full pricing information, current inventory stocks, or additional benefits can be presented in HTML pages.

WavePhore multimedia datacasting

WavePhore (http://www.wavephore.com/datacast/datacast.htm) is a public company (NASDAQ: WAVO) which uses different parts of the analog television signal to net a 384 kbps throughput. Its WavePhore Networks venture is a partner in the Intercast alliance. The primary user platform is the PC. In addition to analog TV, WavePhore can also use other transport infrastructures such as FM radio, satellite, microwave, or cable.

DirecPC

DirecPC (http://www.direcpc.com) is a PC version of the direct broadcast satellite service from GM's Hughes Network Systems division (NYSE: GMH). It is a data service using digital satellite signals for transport from the server to the client at up to 400 kbps. The client must have a DBS antenna (24 inches) installed, along with the necessary reception electronics, subscription service, and PC. Back channel or upstream capacity can be enabled via a dial-up connection. The service is packaged and marketed as a sort of cable version of the Internet, with a basic subscription which includes news, financial, sports, and other information in multimedia formats. DirecPC has its own implementation of datacast technology.

Datacast partners

Former NAB executive John D. Abel now heads up Datacast Partners (http://www.databroadcast.com), based in Reston, VA. According to his estimates, a television station can upgrade its facilities for datacasting for about $50,000. Datacast is planning a system test this summer using currently deployed (i.e., analog) television technology to provide a 700 kbps service. Datacast will use Digideck's technology, which translates radio frequency (RF) signals to bitstream data for plug-in PC cards. Versions of the PC cards could also be incorporated into TV sets.
In Abel's view, the appropriate match between broadcasting's transport strengths and content is a database which is relatively large and constantly changing. He offers the Yellow Pages as an example of a database that could be updated daily and rebroadcast for local storage. In addition to database services, news and entertainment information can be datacast using this technology. The service plans to derive revenues from advertising and data management fees.

Conclusion

The World Wide Web portion of the Internet has shown dramatic growth in both the consumer and business sectors over the past 12 months. Among the success factors for the Internet in general and the World Wide Web in particular are WinSock and TCP/IP. The public availability of these interfaces and protocols led to an open systems environment providing an unprecedented level of connectivity. The ability to develop applications and content independent of the underlying architectural layers is tremendously empowering. The marketplace can focus on content and applications without having to worry about the expense and risk of porting to different platforms and environments. Clearly, open systems and the ability to interconnect and interoperate are key success factors in the convergence marketplace. This is supported best by layered architectures such as the OSI reference model developed by the ISO.
What consumers value is content. This can be defined broadly to include traditional television program fare like "Roseanne," The Wall Street Journal site on the World Wide Web, and Internet catalog shopping. Consumers will increasingly assume they have the ability to selectively access, store, and control that content. They will also have a great desire to connect with content conveniently, regardless of their location or whether they are connected to a wire.
With all the developments in set top boxes, PCs, network computers, software applications, transactional databases, and various modes of distribution, we might pay attention to the marketing axiom, "Consumers don't buy drills, they buy holes." In other words, technology for technology's sake does not a market make. No matter how fancy, fast, or powerful the drill, if it does not make the right size holes in the right materials, it is no good.
Furthermore, much of the World Wide Web data flows follow a typical client-server asymmetric pattern. This has implications for the role of transport and processing technology in the marketplace. In a world where bandwidth is rapidly becoming a commodity, mixed transport solutions which are cost-effective should be pursued. Companies do not need to build or buy huge bandwidth in symmetric networks when asymmetric architectures and topologies will work just fine.
In the short run, the Internet faces the challenge of meeting demand in a bandwidth-constrained world. Most people connect to the Internet via dial-up services or must share LAN-based connectivity. Downloading huge files is a constraint on network resources. This could prove to be a growth inhibitor for the Internet. Leveraging the natural advantages of the broadcast infrastructure with the industry's traditional and emerging role as content providers can lead to substantial value being added to the Internet marketplace. The broadcasting infrastructure adds a substantial wireless backbone to the Internet. It is universally available, low-cost, and has a large downstream data transport capacity which will grow in the transition from analog to digital broadcasting.
Already, even with analog broadcasting and digital satellite services, we can see emerging business models which are taking advantage of this infrastructure to develop new ways to use the Internet. These businesses offer value-added content, actually add resources to the Internet at the transport layer, and can interoperate with much of the processing level in terms of browsers, client-server architectures, and other interactive applications.
As for standards in general, the more dependent a business plan is on a proprietary implementation rather than a government or industry coalition standard, the riskier the venture. Thus, while companies may have wonderful technology, to the consumer the technology is, or should be, transparent. It is content and applications that consumers really want. If a business plan relies on a closed system, the content and applications had better be so good that they are worth the risk of going it alone.
The revenue models for the early entrants into the multimedia broadcasting and Internet marketplace encompass the range of possibilities--advertising support, direct subscriptions for service packages, usage fees (storage, compression, transmission), and access fees. Many of the Internet applications and sites today are premised largely on a "give the razors away, sell the blades" loss-leader marketing philosophy. The objective is to gain market share by developing a consumer appetite for these services in general and for a given company in particular.
The future of the Internet is contingent on how well the marketplace takes advantage of technology and policy to develop and execute business missions which serve marketplace demands. The scientific and educational imperative of early Internet culture has given way to an ethic of electronic enterprise. The Internet has breathed exciting new life into what the plain old telephone can do when it is used to connect computers. So too can the radio and television infrastructure extend the reach of the Internet even further as it continues to penetrate the consumer marketplace.

Source: http://www.isoc.org/
