March 93 - An Object-based Direct Broadcast Satellite System
G. Gordon Apple
After approximately a century of radio and a half-century of television, broadcasting is about to enter a new era - an era of data distribution, software, compressed video, interactive elements, distance learning, intelligent agents, and inexpensive program origination. Reception will be available nationwide from the first day of service and will include customized, subscription, impulse-access, proprietary, and public programming. OOP techniques will be a fundamental component of the delivery system, system management, user interface, and a large number of new types of broadcast services not previously possible.
This and other processor-based communications systems will result in a fundamental paradigm shift in the software industry from single static applications to smaller components: standardized and custom dynamic, interactive, intelligent, and even disposable components. Emphasis will increase on robust software and on system design that can handle diverse and anomalous conditions.
Some of the history of this system is discussed along with the evolution that occurred from dedicated analog hardware to object oriented software. Aspects of this particular system are described and example software implemented in C++ MacApp is discussed.
The subsystems and techniques described here are broadly applicable to a variety of computer-based communications systems but are primarily intended for use through ACC's1 Direct Broadcast Satellites (DBS) within the next 24 to 36 months. DBS service will allow reception with antennas as small as 18 in. anywhere in the nation. With no dependence on local distribution companies, it will be the only truly national broadcasting service. ACC's DBS will be an integrated broadcast system and will allow reallocation of digital transmission capacity between transponders and within different uplinks using the same transponder.
ACE's system design draws on a variety of disciplines including video compression, dynamic multiplexing, high-speed demodulation and re-synchronization, remote uplink synchronization techniques (similar to those used on some military satellites), error correcting codes, live on-line object-oriented databases, encryption, indexed indirect service access, and object-oriented multimedia software. As such, it will be able to deliver not only the usual complement of cable TV video fare, digital radio, wide-screen video, HDTV, and computer file downloads, but also many new types of object-based digital broadcast services.
Educational course broadcasting will be a major beneficiary of the OOP approach. These combined techniques will allow course delivery to be two to three orders of magnitude more efficient, less costly, and more service-prolific than would otherwise be possible. For example, the two transponders being donated by ACC to the Foundation for Educational Advancement Today will be able to simultaneously carry literally hundreds of courses where only two would have been possible previously, and 10 to 20 using only video compression. At the same time, the provided video quality will be vastly superior because much of it will be generated locally by the receiver processor from received data and software objects and can be displayed on a normal or HD TV screen or on a personal computer with a non-interlaced screen.
In addition, a new multimedia OOP element will make local interactivity possible in the broadcast environment. Software objects used for generating screens and animations will allow user interaction, user individualization, automated response collection, and return link communication by telephone line modem, VSAT, cellular systems, etc. (see Figure 1). Interactivity can range from a simple screen button that beeps to animated coloring books for children and medical students (want to see my brain anatomy coloring book?) to sophisticated laboratory experiment simulations. The persistence of these objects can be controlled absolutely or conditionally. They can be forced to self-destruct after a specified time, be copy controlled, or allow copying and off-line use. They can also be used in conjunction with locally available required or optional resources such as course-specific or general-reference CD-ROMs.
Needless to say, success of this and related systems will open vast new markets for new types of software, software components, authoring tools, and content providers. Of necessity, emphasis initially will be on completing development of the system, development of authoring tools and demonstration software. However, a goal is to make it possible for a not-terribly-computer-literate teacher or professor to be able to do a live nationwide course broadcast without ever leaving the office. A personal computer, pen pad, and an inexpensive TV camera should be all that is minimally required. A LAN could connect to the campus or office-complex uplink or an ISDN link could connect from a home office. This could totally eliminate expensive studio facilities, equipment, and trained personnel presently required for most distance learning.
What I hope to accomplish here is to give you some indication of the importance of object-oriented programming as an integral element of this system, and of the directions I believe software technology and markets will take as a result of these new interactive capabilities and distribution channels.
In the Beginning
The evolution of this broadcasting system from analog TV hardware to object-oriented software mostly follows my own career. At Purdue, several of us were working on various aspects of digital signal processing, compression, transmission coding, and error correction coding. We would regularly get thrown off the campus CDC 6600 after monopolizing it for hours at a time doing transforms (precursors of JPEG) on images. On a NASA research contract we built what I now believe to be one of the first digital signal processors and maybe the first transform codec. We were also working with digital transmission. But with T1-carrier becoming more available, in our collective wisdom we agreed that there was no future in modems. (Kick, kick.)
A few years later at Bell Labs we developed for Picturephone what was probably the first commercially-targeted video compression codec using intra-frame and frame differential conditional replenishment techniques. Transform techniques were still out of the question by at least several orders of magnitude in cost and capability. However, one of the significant techniques that evolved out of this project was that of dynamic multiplexing, a technique that would later be extremely important in OOP-based DBS.
CBS, TRW, and DBS
Skipping ahead a few years past one of the first digital telephone switching machines, the Space Shuttle, and secure voice terminals, CBS was one of the first-round DBS applicants. At that time and even today, virtually all video transmission was (and is) analog. I was at TRW at the time (circa 1980) and we did a joint study to determine the feasibility of using DBS to distribute HDTV, particularly using digital transmission. At that time we were considering three transponders (channels) per satellite covering one time zone and received with a 1 meter dish. A complete system design was generated for a digitally compressed HDTV transmission system using a mode-comparison-switching approach combining various differential compression techniques that could be implemented at high speed and used a relatively simple receiver decoder. Again, transform techniques were still two or three orders of magnitude too computationally and economically expensive. Even with the simpler approach, digital transmission was still thought to be too expensive to implement.
During the CBS project was when I started to seriously consider the ramifications of a universal digital broadcast system and promote its utility. As part of the project, at an oral presentation to CBS headquarters, I tried to convey my excitement for the concept that digital broadcasting had implications that went far beyond standard television or even HDTV. It was soundly dismissed as just being more video-text, in which they had recently taken a financial bath. I protested that we were talking about 10,000 times the delivery capacity of the toy that had been tried, but it fell on deaf ears. The cable and VCR boom was just beginning - and the Macintosh was yet to rise from the ashes of the Lisa. CBS completely pulled out of DBS, leaving a vacuum that was to later pull me in again.
Back in my home town of Little Rock, a former high school classmate had been a second-round applicant for a DBS license under the name of Advanced Communications Corporation and was successful partly because of the CBS pull-out. When I told him what I had in mind, he was all ears (with apologies to Ross Perot). Although investment money was not readily forthcoming, we filed our plans with the FCC. ACC became the first company in the world to commit to digital broadcasting and steadily increased its orbital holdings to where now it is one of two main DBS license holders.
Around 1985 I started developing the concept of broadcasting to computers (including voice channels) as a means of very inexpensive, low-bit-rate, live course distribution. I also recognized that with frame differential coding, video courses which consisted mostly of stationary slides, blackboards, etc. could sustain much higher levels of video compression, especially in a dynamically multiplexed environment where high bit-rates could be delivered in short bursts when needed. To promote such educational uses, with the help of the (now late) Hon. Wilbur D. Mills, the Foundation for Educational Advancement Today was formed and its operational arm was dubbed "Your Educational Services Networks". Yes, Bill Clinton and his people are aware of it and the present Governor has seen some of our Macintosh presentations and demos, for all that's worth. Mills even got us in for a presentation to President Bush, but that was even less productive. Well, nobody said either life or innovation were going to be easy.
Advanced Communications Engineering, Inc. was formed to do systems engineering, hardware design, and software development for what we could see was a new upcoming broadcast market. Having been Assistant Program Manager for systems engineering on a flexible DSP-based Milstar2 ground terminal, I was well versed in flexible modulation and real-time signal processing techniques. I had also brought PCs into that program to simulate various equipment user interfaces. It was becoming obvious to us that computers and video were on converging courses and that the two separate types of services that we had been envisioning would merge. We had already planned to broadcast interactive software, but OOP was not yet part of it.
I got involved with the Macintosh and decided to learn OOP. I quickly recognized that OOP, its structure, its polymorphism, and its messaging model could provide much of what I had been looking for in reusable, flexible, and disposable components and could provide encapsulation of many other functions. A flexible modem simulation program that I had originally done on the PC was direct-ported to the Mac and later updated to be a full-fledged Mac program. We also converted a PC DSP card to NuBus for use on the Mac and a graphical block-linked interface was done in MacApp.3 It was this project that made me fully grasp the power of using OOP to encapsulate both hardware and software functionality. I only wish I had known about OOP when having discussions with RADC4 and others about the need to do military communications systems initial designs with tools that were independent of hardware/software decision tradeoffs.
For the DBS system we had already planned to transmit software, drawing primitives, and commands to implement the planned features, especially the course broadcasts. It didn't take long to figure out that OOP was a natural for this system. A reasonably rich set of known object classes could be built into receiver ROMs and be augmented by downloads when needed. The dynamic nature of pointer-based objects would allow objects to be queued up and transmitted in advance of when they are actually needed. Code modules for new object types could also be downloaded and dynamically linked to existing code. These could then be flushed out of the receiver when no longer needed, freeing receiver memory for new incoming objects and code.
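The ROM-resident-plus-downloaded class scheme described above can be pictured as a type registry: built-in classes register factories at startup, downloaded code modules register additional ones, and entries are flushed when no longer needed. The following is only a minimal sketch of that idea in modern C++; the names and interfaces are illustrative, not actual receiver code.

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <memory>
#include <string>

// Base class for anything the broadcast stream can instantiate.
struct BroadcastObject {
    virtual ~BroadcastObject() {}
    virtual std::string TypeName() const = 0;
};

// Registry mapping a transmitted type ID to a factory function.
// ROM classes register at startup; downloaded code modules add
// entries, which can later be flushed to free receiver memory.
class TypeRegistry {
public:
    using Factory = std::function<std::unique_ptr<BroadcastObject>()>;
    void Register(int typeID, Factory f) { factories_[typeID] = f; }
    void Flush(int typeID) { factories_.erase(typeID); }
    bool Known(int typeID) const { return factories_.count(typeID) != 0; }
    // Returns nullptr for unknown types, signalling that a code
    // module must first be downloaded and linked.
    std::unique_ptr<BroadcastObject> Make(int typeID) const {
        auto it = factories_.find(typeID);
        return it == factories_.end() ? nullptr : it->second();
    }
private:
    std::map<int, Factory> factories_;
};

// A hypothetical built-in class.
struct SlideClip : BroadcastObject {
    std::string TypeName() const { return "SlideClip"; }
};
```

A `Make` that returns null is the receiver's cue that the incoming object's type is new and its code module must be transmitted and linked before the object can be instantiated.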
However, there are significant differences between a closed-loop computer system and an open-loop broadcast system. Ideally, the transmitter would know the basic state of the receiver, what objects it contains, and how much receiver memory is available. As a practical matter, transmission could be interrupted, object data and commands could be corrupted by bit errors, users would not always be on-line at the beginning and might want to switch between on-going services, and worst of all, there is no quick-response return path for the source to know what is happening at the destination receivers. Also, when was the last time your television crashed? We believe we have acceptable solutions for each of these issues, but it will take real field testing to make sure that all the problems have been addressed and adequately solved.
There is a lot of interest today in networks, cable, fiber, and on-line multimedia server systems. The nationwide capital investment required for these systems is enormous, in the hundreds of G$. It will also take 20 years or more to become available to 60-70% of the population. In contrast, ACC's DBS can be universally available within two years at a cost of a few hundred M$, approximately 1000 times less than alternatives. We could afford to spend some of those excess G$ on software programmers and content providers.
System and software
We may not be the first DBS system to go into operation, but we believe that basing the system on OOP techniques, having a strong educational emphasis, and having a clear vision of a new broadcasting era will allow us to do it right. What we are creating is a fundamentally new type of broadcast system.
Advances in satellite technology, launch capacity, demodulation, error correction circuitry, and particularly receiver noise figure have now made it possible for two satellites to provide 32 transponders of 200W each (compared to about 10W for most satellite transponders), half or full US coverage, 30-40 Mbps per transponder, and allow 18 in. receiving antennas. A combination of digital video compression and the software techniques described here will allow these 32 transponders to carry a mix of literally many hundreds of programs (i.e. channels) compared to the original plan of three per satellite per time zone back in the original CBS days. Look at it this way - the 32 transponders in one orbital location provide an aggregate of approximately 1 Gbps of available services to anywhere in the nation. Think of the AppleLink charges you can save reading Bedrock3Tech$.
Our job as a service provider will be mainly to manage delivery of a large number and mix of fixed, variable data rate, and bit-packet services. Live on-line output can be provided through the receiver to computers and other devices. A goal is to keep the delivery system as open and flexible as possible to accommodate third party services.
Most systems must uplink all of one transponder's signals from a single location. This system uses proprietary techniques to allocate a transponder's capacity between diverse uplink locations. That is of particular advantage when using lower-data-rate OOP-based educational broadcasts. However, statistical (dynamic) multiplexing efficiency is best when several services are combined before uplinking. For example, a college campus could combine several courses on an Ethernet LAN or local fiber and use a single uplink into a portion of one transponder.
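The efficiency argument for combining services before uplinking can be illustrated with a toy scheduler: when the aggregate instantaneous demand fits the channel, every bursty service gets its full rate; only under overload is anyone scaled back. The actual allocation scheme is proprietary and unspecified here; this sketch is purely illustrative.

```cpp
#include <cassert>
#include <cstddef>
#include <numeric>
#include <vector>

// Toy statistical multiplexer: each service states its instantaneous
// demand (bits this frame); the multiplexer grants demands in full
// when the aggregate fits the channel capacity, otherwise scales all
// grants back proportionally.
std::vector<double> Multiplex(const std::vector<double>& demand,
                              double capacity) {
    double total = std::accumulate(demand.begin(), demand.end(), 0.0);
    double scale = (total <= capacity || total == 0.0)
                       ? 1.0
                       : capacity / total;
    std::vector<double> grant(demand.size());
    for (std::size_t i = 0; i < demand.size(); ++i)
        grant[i] = demand[i] * scale;
    return grant;
}
```

The more independent bursty sources share one pipe, the less likely their peaks coincide, so each can ride well above its average rate most of the time - which is exactly what mostly-static course material with occasional high-rate bursts needs.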
Because of the highly flexible nature of the basic service, there are many aspects of system control that require sophisticated software, toolboxes, basic libraries, an extensive continually updated local data base, a user-friendly OOP-based GUI, a relatively powerful control processor, and a considerable amount of RAM. Changes in configuration and services require the ability to download new information, code resources, and commands. New code modules must be dynamically linked, often on the fly. Sometimes, object types of new objects will be known. Other times, object types will be new and will require transmission and linking of code modules. Memory management and reliable operation are of utmost importance.
It is not possible to even begin to describe the new service possibilities in this article. CDI, TV Answer, a myriad of educational software programs, and the recent 3DO demos at CES give a tiny glimpse of the possibilities. The important thing is that inexpensive locally interactive broadcast capabilities will open new markets for inexpensive, widely disseminated, disposable software, i.e., the stuff that OOP, C++, maybe Bedrock, Taligent, and Kaleida are about.
A MacApp Example
The system itself will be able to do most of the described functions using only the ACE TV receiver. However, specific services can be offered for computers such as the Macintosh, PC Windows machine, or PowerPC.
One goal of the MacApp program, Real-Time Presenter, was simply that of our own education, learning how to use an object oriented framework, inheritance, message routing, and encapsulation. MacApp 2.0 (I'm really not that familiar with 1.0) made considerable use of inheritance, sometimes to excess, as apparent in the TView derivatives. MacApp 3.0 introduced some very important innovations such as "behaviors" which introduced a standard software event messaging bus architecture analogous to what the hardware bus has done for processor components. Functionality can be plugged into or unplugged from the bus at will, and if designed properly, can be done independently of other items.
HyperCard also has much appeal, mainly because of its scripting and its elimination of the infamous compile/link cycle. Of course, for this convenience we paid a dear price in performance. Another innovation of HyperCard was the Xcmd. It provided a standard interface for addition of code modules that could greatly expand HyperCard's capability. The interesting thing is that this spawned development of software on the other side of the interface so that Xcmds designed for HyperCard could be used with other programs, Real-Time Presenter being one of them. HyperCard has been reabsorbed by Apple Computer and will be integrated with AppleScript. It is also rumored that version 3.0 will be a complete rewrite in C++. This bodes well for HyperCard's future utility in this type of broadcast environment.
I am not going to go into details of Real-Time Presenter here because I cannot do it justice in one article. Also, there has been some discussion of serializing it to illustrate MacApp programming techniques. I will discuss how it relates to the broadcast system.
The program is designed as a presentation program but will be carried to several levels. The intent is to market it initially as a stand-alone presentation program. Its ultimate application is in the DBS broadcasting system. In the two cases, its functionality will seem identical to the presenter and the presentee. However, its internal use of the delivery system will be radically different, transmitting software objects in place of most video.
The program is organized into four levels (see Figure 2). The "Session" is analogous to the Document and is made up of "Segments" which are made up of "Clips". Clips are screens that can be as simple as a fixed slide or a scrolling blackboard or as complicated as an interactive laboratory experiment simulation or a spreadsheet program. All clips have a background layer (usually a PICT, solid color, or shader) and an annotation layer. Multi-layer clips are made up of components such as drawing layers, PICTs, colors, MacroMind Director animations (using the MM Player and the Player Xcmd), etc.
At each level from Session to Clip there are complete list manipulation capabilities including disjoint multiple selection, cut, copy, paste, duplicate, clear, and drag-and-drop reordering.5 There are drawing6, text7, and manipulation tools. One feature that has recently been added is the ability to drag-copy objects from one window to another. Navigation palettes also allow moving through and editing the lists (see Figure 3). The intent is for the presenter to have at least two monitors, the public presentation monitor and the presenter's private screen. The private screen might be on a PowerBook and the presentation screen on an external projector. The program can do more than one presentation at a time if multiple monitors are available.
A video control palette (see Figure 4) is available and is designed for quick selection of video size and position by clicking on preset frames. When active, live video overlays whatever else is on the screen. A MacApp object class presently supports all features of the RasterOps 364 and 24STV boards. We hope to include the Radius VideoVision soon. Images can be grabbed to the clipboard in any selected bit-depth from 1 to 24-bits and pasted into any presentation screen.
A key feature of the program is the ability to plan and monitor presentation progress. Time can be estimated for each component, clip and segment so that the entire session can be planned to fit the allocated time (e.g., one hour on a satellite link). Progress is monitored and visual feedback is given in real-time to the presenter along with alarm status if dangerously off-schedule.
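The Session/Segment/Clip containment hierarchy makes this kind of time planning a simple rollup: per-clip estimates sum into segment totals, which sum into the session total, which is compared against the allocated link time. A minimal C++ sketch, with names that are illustrative rather than Real-Time Presenter's actual classes:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical containment hierarchy: a Session holds Segments, a
// Segment holds Clips, and each Clip carries the presenter's time
// estimate in seconds.
struct Clip    { std::string name; double estimateSec; };
struct Segment { std::string name; std::vector<Clip> clips; };
struct Session { std::string name; std::vector<Segment> segments; };

// Roll per-clip estimates up to a segment total.
double SegmentEstimate(const Segment& seg) {
    double t = 0.0;
    for (const auto& c : seg.clips) t += c.estimateSec;
    return t;
}

// Roll segment totals up to the session total.
double SessionEstimate(const Session& s) {
    double t = 0.0;
    for (const auto& seg : s.segments) t += SegmentEstimate(seg);
    return t;
}

// True when the planned session exceeds the allocated time
// (e.g., one hour on a satellite link) and an alarm is warranted.
bool OffSchedule(const Session& s, double allocatedSec) {
    return SessionEstimate(s) > allocatedSec;
}
```

During the live presentation the same rollup, run against elapsed rather than estimated times, drives the real-time feedback and alarm display.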
Although the program can be used for presentation on a computer monitor or data projector, it can also be used for standard TV video transmission whether using analog or compressed digital transmission.
Internal Differences When Transmitting Objects
The program is designed to present screens in a fixed sequence order, although deviation is allowed. Each clip is a software object and each layer of the clip has a "component" object. These objects can be indexed (given a reference number) and uniquely referenced. In MacApp 3 we already have a standard digital method for handling these objects, namely the streams methods that we use to store and recall them. If desired, we could make a new streams class whose purpose would be to attain maximum bit-use efficiency during transmission.
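The idea of indexed, streamed objects can be sketched generically: flatten each object to bytes, tag it with its reference number, and let the receiver cache it by index so later commands can refer to it without retransmission. The real program uses MacApp's stream classes; the code below is a stand-in with invented names, using a plain string as the "flattened" payload.

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>
#include <vector>

// A transmitted unit: a unique reference number plus the object's
// flattened byte image.
struct Packet {
    uint32_t refNum;
    std::vector<uint8_t> bytes;
};

// Flatten an object (here just a string payload) for transmission.
Packet Flatten(uint32_t refNum, const std::string& payload) {
    return Packet{refNum,
                  std::vector<uint8_t>(payload.begin(), payload.end())};
}

// Receiver-side cache of re-instantiated objects, keyed by index,
// so a short command like "show object 42" suffices later on.
class ReceiverCache {
public:
    void Receive(const Packet& p) {
        objects_[p.refNum] = std::string(p.bytes.begin(), p.bytes.end());
    }
    bool Has(uint32_t refNum) const { return objects_.count(refNum) != 0; }
    std::string Get(uint32_t refNum) const { return objects_.at(refNum); }
private:
    std::map<uint32_t, std::string> objects_;
};
```

A transmission-oriented stream class would go further, packing fields bit-by-bit for maximum channel efficiency, but the index-then-reference pattern is the essential point.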
Objects can be time stamped and queued for low-priority (can you say "cheap") space-available transmission. If the presenter dallies excessively, some objects may become stale and have to be retransmitted to placate the latecomers and channel flippers. Commands can be intercepted and sent in real-time to the receiver. A video rectangle can be sent when required to indicate screen location of live video. Compressed video and audio will be transmitted with minimal delay. The basic scheme can be further refined but this should convey the general approach.
If an object resource is not present in the receiver buffer (e.g., an airplane blocked the link) the basic presentation program at the receiver should be robust enough to do something benign. Each object will have an integrity verification check sum.
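The check-sum-plus-benign-fallback behavior amounts to a simple guard at the receiver: recompute the checksum over the received bytes and, on mismatch, keep the current display rather than act on corrupt data. The article does not specify the actual check, so the additive checksum below is only an illustrative placeholder; a real system would likely use a CRC.

```cpp
#include <cassert>
#include <cstdint>
#include <numeric>
#include <vector>

// Illustrative integrity check: a simple additive checksum appended
// to each transmitted object.
uint32_t Checksum(const std::vector<uint8_t>& data) {
    return std::accumulate(data.begin(), data.end(), 0u);
}

// Receiver-side verification. Returns true if the object may be
// used; false means "do something benign" - e.g., retain the current
// screen and wait for retransmission (perhaps an airplane blocked
// the link).
bool Verify(const std::vector<uint8_t>& data, uint32_t transmitted) {
    return Checksum(data) == transmitted;
}
```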
What we have done is eliminate the middlemen: the QuickDraw calls, the screen bitmap (pixmap), the image coding, image decoding, etc. We let the receiver do most of the work of image generation. That is why much course presentation material can be sent more efficiently than with video compression and still generate presentation screens that are superior in image quality. The same is true for animations and other processor-generated images.
How it Would be Used
An instructor could develop a format of Segments for a normal class. The opening banner would identify the course. Some segments might be formal and use prepared materials, some could be free-format for call-in questions, some might be informal and use the sketch pad during discussion. To keep transmission charges to a minimum, partial-screen video could be limited to the beginning, the end, and occasional use during the class.
We wish to initially give instructors familiar tools that do not require radical changes in instruction techniques. This way, they can ease into use of newer tools such as interactive math graphing and simulations. An instructor can demonstrate a software module and then hand it off to the remote students for local or off-line use.
An additional advantage of generating the image or at least part of it locally in the receiver processor is that it can be done intelligently. Interactive elements can be added and can be made to depend on the local user data base. Games, exams, homework, or laboratory simulations could be included. In the case of customized interactive advertising, user data can be internally collected in the receiver over a period of time without violating privacy laws. Features can be randomly activated by device serial number and/or user response. ("Congratulations! You have won a case of Pepsi.") A screen button could be preset to automatically call an order to a local supplier with the correct clothing size and charge card number from the local data base. All the user would need to do is click "OK" and verify.
Obviously, the receiver memory is finite, but will get less expensive and more plentiful with time. Management of memory resources and receiver object data bases will be very important, but should be transparent to the program originator and user. An educational program may require additional RAM which can be obtained by installing a PCMCIA card or by purging other data. Many receiver options will be available. If the user wants QuickTime previews of available movies, he may have to add a hard disk.
To put things into perspective, a decade and a half ago anyone suggesting that people would actually buy software would have been directed to the loony bin. Since then we have seen the rise of the dedicated software application. Through the use of OOP, standards, frameworks, and digital communications systems, we are about to enter an age of more fluid software, i.e., software that serves a purpose and is then gone. ("Who was that masked app anyway? It left this silver-bullet utility.") DBS object transmission could play a major part in the evolution of future software development.
The (C++ MacApp) Real-Time Presenter program is an operational framework to encapsulate and control a large number of software presentation modules. New modules can be added and existing ones can be encapsulated and controlled. Demo versions of other software could be downloaded and used for training. License free run-time versions of other programs could also be included. Although many software objects and modules will be flushed through the system and disposed when finished, the basic objects can be used over and over again in the same or different context. "Disposable software components" does not mean "throw away" software any more than television means "throw away" movies. But it is certainly a shift in thinking from static applications to dynamic software components.
1. Advanced Communications Corporation is one of two FCC-licensed companies authorized to operate up to 27 high-power satellite transponders in the exclusive DBS frequency band and having an orbital location (110 deg. W) allowing full continental US coverage.
2. A military communications satellite system which is undoubtedly the most sophisticated ever designed or built.
3. G. Gordon Apple, "Digital Signal Processing," MADA Conference, Phoenix, 1991.
4. Rome Air Development Center (Rome, NY).
5. G. Gordon Apple, "Managing Named-object Lists," FrameWorks, Jan/Feb 1993.
6. G. Gordon Apple, "A Clear View and the Ten Behaviors," FrameWorks, February 1992.
7. G. Gordon Apple, "A Simple TextEdit Behavior," FrameWorks, September 1992.