
Chapter 6: Rendering Plug-In

After building a file format plug-in, you create a rendering plug-in that interprets the stream of RealSystem packets created by the file format plug-in. Using this rendering plug-in, the client can play back your datatype whether it is streamed from RealServer, pseudo-streamed through HTTP, or accessed from a local file. Content tools that need to reassemble RealSystem streams may also use your rendering plug-in.

The rendering plug-in takes the RealSystem packet stream and, for audio data, decodes and sends the data to the client's Audio Services interface. For video, image, and text data, the plug-in decodes the data and writes to a client window. A rendering plug-in can also direct the Web browser on the client to display URLs (hyperlinks) at specified times in the stream. This is called hypernavigation.

[Figure: Rendering Plug-in]

Design Considerations

In addition to the general plug-in design considerations discussed in "Designing a Plug-In", keep the following points in mind as you develop your rendering plug-in:

  1. Write your rendering plug-in to support the playing of several streams at once on the same machine.

  2. Carefully optimize the rendering plug-in for CPU performance. This helps reserve processing power for rendering other datatypes simultaneously, which is an important feature of RealSystem. As a guideline, check that your datatype performs well while synchronized with a 16-Kbps RealAudio stream on a Pentium 75. The RealAudio clip will use about 25% of the client CPU.

  3. As the client buffers the stream, the rendering window should display the datatype's logo. Design a logo suitable for rendering but let producers who create streaming content turn off the logo display through SMIL.

  4. Consider carefully whether producers will lay out content of your datatype in a browser driven by RealSystem or in RealPlayer using SMIL. RealNetworks can consult with you on the renderer user interface design.

  5. Content producers are concerned with providing a pleasing user experience on 640x480 and 800x600 screens, given that users often have big browser buttons and multiple toolbars. They need the ability to alter the rendering window size through SMIL and the stream. The rendering window therefore needs to respond rationally to size change requests.

Interfaces

A rendering plug-in typically implements the following interfaces:

A rendering plug-in typically uses the following interfaces:

Coding the Plug-In

The following sections explain how the RealPlayer and a rendering plug-in use the RealSystem interfaces to render data. The sample files included with this SDK illustrate many of these features. You can use these sample files as a starting point for building your own plug-in. Refer to the RealSystem SDK header files for more information on function variables and return values.

Note
The order of function calls listed in the following sections provides a generalized explanation and is for illustrative purposes only. Because RealSystem is asynchronous, your plug-in must be able to handle any call made to it while it is processing data or waiting for a response from another object. Do not code your plug-in so that it expects a specific sequence of events to occur as it interacts with RealSystem.

Starting Up

When the RealSystem client starts up, it loads each rendering plug-in:

  1. The client calls RMACreateInstance to create a new instance of the renderer. The client calls this method at start-up and each time it receives a stream to be rendered by the plug-in.

    Additional Information
    See "Creating a Plug-In Instance" for more on this method.

  2. The client calls the plug-in's IRMAPlugin::GetPluginInfo method, which returns descriptive information about the plug-in, including its copyright and "more information" URL. The bLoadMultiple attribute should be set to TRUE to enable the client to launch multiple instances of the plug-in in separate processes.

  3. The client calls IRMARenderer::GetRendererInfo, which returns functional information about the renderer:

Initializing

When the RealSystem client receives a stream, it identifies the appropriate rendering plug-in to use based on the stream's MIME type and the pStreamMimeTypes values returned by the rendering plug-ins during start-up. If two or more plug-ins handle the same MIME type, the client uses the first plug-in for that MIME type that it loaded during start-up. The following actions occur during rendering plug-in initialization:

  1. The client calls the plug-in's IRMAPlugin::InitPlugin method, passing it a pointer to the system context. The plug-in can use this method to perform any necessary initialization procedures. It should also use the context pointer to obtain and store a reference to IRMACommonClassFactory so that it can later create the RealSystem objects used in rendering data.

  2. The client calls IRMARenderer::StartStream to give the plug-in access to the client through IRMAPlayer, as well as to give it access to the stream to be rendered through IRMAStream. Within IRMARenderer::StartStream, the renderer can perform any other initialization functions based on its supported features:

  3. The client calls IRMARenderer::OnHeader to pass the renderer a pointer to the stream header object created by the file format plug-in. The renderer uses IRMAValues methods to retrieve the header data and then releases the object.

  4. The client calls IRMARenderer::GetDisplayType to get the renderer's preferred display type. The renderer returns RMA_DISPLAY_WINDOW if it renders data on the screen or RMA_DISPLAY_NONE if it does not use a screen because it is, for example, an audio renderer.
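The InitPlugin pattern from step 1 can be sketched as follows. The types here are simplified, hypothetical stand-ins: the real declarations, interface IDs, and return codes live in the RealSystem SDK headers.

```cpp
#include <cassert>
#include <cstring>

// Hypothetical, simplified stand-ins for the SDK types (the real
// declarations live in the RealSystem SDK headers).
typedef long PN_RESULT;
const PN_RESULT PNR_OK   = 0;
const PN_RESULT PNR_FAIL = 1;

struct IRMACommonClassFactory {
    int refs = 1;
    void AddRef()  { ++refs; }
    void Release() { --refs; }
};

struct IRMAContext {
    IRMACommonClassFactory factory;
    // Simplified QueryInterface: the real call takes an IID constant.
    PN_RESULT QueryInterface(const char* iid, void** ppv) {
        if (std::strcmp(iid, "IID_IRMACommonClassFactory") == 0) {
            factory.AddRef();
            *ppv = &factory;
            return PNR_OK;
        }
        *ppv = nullptr;
        return PNR_FAIL;
    }
};

// Sketch of the step-1 pattern: InitPlugin keeps the context pointer and
// caches a class-factory reference for creating objects later.
class CFooRenderer {
public:
    IRMAContext*            m_pContext      = nullptr;
    IRMACommonClassFactory* m_pClassFactory = nullptr;

    PN_RESULT InitPlugin(IRMAContext* pContext) {
        m_pContext = pContext;
        void* pv = nullptr;
        if (pContext->QueryInterface("IID_IRMACommonClassFactory", &pv) != PNR_OK)
            return PNR_FAIL;
        m_pClassFactory = static_cast<IRMACommonClassFactory*>(pv);
        return PNR_OK;
    }

    void Shutdown() {
        // Release the cached reference when the plug-in shuts down.
        if (m_pClassFactory) { m_pClassFactory->Release(); m_pClassFactory = nullptr; }
    }
};
```

The key point is only the ordering: cache the factory once in InitPlugin, then release it exactly once at shutdown.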

Rendering Streams

The client and rendering plug-in use the following methods to render a data stream after the renderer has been initialized:

  1. The client calls the renderer's IRMARenderer::OnBegin method to inform the renderer that playback has begun or has resumed after a pause. It passes the renderer the stream's timeline value in milliseconds. This is zero if the stream is just starting. When resuming after a pause, the client passes an integer value that indicates how many milliseconds into the stream timeline to begin the playback.

  2. The client calls IRMARenderer::OnPacket each time a packet is ready (or should be ready but is lost), passing it a pointer to the IRMAPacket object to be rendered. The method includes the packet's time offset from the start of the stream.

  3. Although the client ensures that packets are delivered in order and resent if necessary, it cannot guarantee that packets are not lost. Before attempting to process the packet, the rendering plug-in calls the IRMAPacket::IsLost method to determine if the packet has been lost. The rendering plug-in is responsible for taking the appropriate action to handle the packet loss.

  4. The client calls IRMARenderer::OnBuffering to inform the renderer of data buffering. The method includes the reason for buffering and the percent complete.

  5. The client calls the renderer's IRMARenderer::OnTimeSync method periodically, at the interval determined by the unInitialGranularity value the renderer returned at client start-up. The method passes the renderer the current playback time. The renderer uses this information to synchronize playback of its stream with the presentation.

    Additional Information
    See "Timing and Synchronization".

  6. The rendering plug-in renders the data as necessary. This can include the following:

    The renderer can render data in IRMARenderer::OnPacket or store packets until IRMARenderer::OnTimeSync is called. (Note that because of COM-style reference counting, storing packets does not require copying the data.) Renderers that use Audio Services, for example, can render in IRMARenderer::OnPacket and let Audio Services perform the timeline synchronization. Video renderers typically render in IRMARenderer::OnTimeSync.

  7. The renderer calls IUnknown::Release when finished with the data. The renderer provided with the SDK sample code, for example, keeps only the last packet.
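Steps 2, 3, and 7 can be sketched together as follows. IRMAPacket is reduced to a hypothetical stand-in so that the loss check and the reference-counting pattern are visible; the real interface is declared in the SDK headers.

```cpp
#include <cassert>

// Hypothetical, simplified stand-in for IRMAPacket.
struct IRMAPacket {
    int           refs = 1;
    bool          lost = false;
    unsigned long time = 0;   // ms offset from the start of the stream

    void AddRef()  { ++refs; }
    void Release() { --refs; }
    bool IsLost() const { return lost; }
};

// Sketch: check IsLost() before touching the payload, keep only the most
// recent packet (as the SDK sample renderer does), and Release() the one
// it replaces.
class CFooRenderer {
public:
    IRMAPacket* m_pLastPacket  = nullptr;
    int         m_nLostPackets = 0;

    void OnPacket(IRMAPacket* pPacket) {
        if (pPacket->IsLost()) {
            // Datatype-specific loss handling goes here (e.g. repeat the
            // previous frame); this sketch just counts losses.
            ++m_nLostPackets;
            return;
        }
        pPacket->AddRef();               // keep the packet past this call
        if (m_pLastPacket)
            m_pLastPacket->Release();    // finished with the older packet
        m_pLastPacket = pPacket;
    }

    void OnTimeSync(unsigned long /*ulTime*/) {
        // A video renderer would render m_pLastPacket here.
    }
};
```

Because storing a packet is just an AddRef, no payload data is copied when the renderer defers rendering to OnTimeSync.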

Seeking

During a presentation, the user may use the client's seek function to move the presentation to a new point in its timeline. In this case, the client performs the following actions:

  1. The client calls IRMARenderer::OnPreSeek, passing the last stream time value before the seek and the new time value for when the seek completes.

  2. The client calls IRMARenderer::OnPacket and IRMARenderer::OnBuffering to pass the renderer any buffered packets that post-date the seek action. Although the renderer should not render this data, it has access to the packets for any purpose. It can simply release the packets if it has no need for them.

  3. When the seek completes, the client calls IRMARenderer::OnPostSeek. The call includes the time in the stream timeline when the seek occurred, and the new timeline value following the seek.

  4. The client calls IRMARenderer::OnPacket, IRMARenderer::OnBuffering, and IRMARenderer::OnTimeSync to pass the renderer packets for data beginning at the new point in the presentation timeline and provide synchronization information.

    Additional Information
    See "Timing and Synchronization".

Pausing

The client performs the following actions if the user pauses the RealSystem presentation:

  1. The client calls IRMARenderer::OnPause, providing the renderer with the stream's time value in milliseconds just before pausing.

  2. The client calls IRMARenderer::OnBegin to inform the renderer that playback has resumed after the pause. It passes the renderer an integer value that indicates the stream's time value in milliseconds after the pause.
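A minimal sketch of this pause/resume handling, with the interface reduced to the two callbacks and illustrative member names:

```cpp
#include <cassert>

// Hypothetical sketch: remember the time value supplied by OnPause and
// resume the local timeline from the value OnBegin supplies.
class CFooRenderer {
public:
    bool          m_bPaused       = false;
    unsigned long m_ulCurrentTime = 0;   // ms into the stream timeline

    long OnPause(unsigned long ulTime) {
        m_bPaused = true;
        m_ulCurrentTime = ulTime;        // time just before the pause
        return 0;
    }

    long OnBegin(unsigned long ulTime) {
        m_bPaused = false;
        m_ulCurrentTime = ulTime;        // 0 at start, resume point after a pause
        return 0;
    }
};
```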

Hypernavigating

Hypernavigation occurs when a rendering plug-in directs the client to display a URL at a specified time in the stream. When the plug-in issues a hypernavigation request, the default Web browser opens. If the browser is open already, the target URL displays in the current window. The plug-in can also specify that the URL display in a specific frame of the current browser window.

A rendering plug-in hypernavigates with IRMAHyperNavigate::GoToURL. The function takes two parameters, a fully qualified URL and a frame target (NULL for no frame target). The following sample code shows a hypernavigation request that does not target a frame:


m_pHyperNavigate->GoToURL("http://www.real.com", NULL);

Sending BackChannel Packets

If the renderer's corresponding file format plug-in implements IRMABackChannel, the renderer can send the file format plug-in data in an IRMAPacket. This can be any feedback or control data that is opaque to the RealSystem architecture and is necessary for plug-in operation.

The rendering plug-in queries for IRMABackChannel on the IRMAStream interface. It then calls the IRMABackChannel::PacketReady method to pass the file format plug-in a pointer to the packet. Note the following, however:

Terminating the Presentation

The following actions occur at the end of a presentation:

  1. The client informs the renderer that all packets have been delivered to it by calling IRMARenderer::OnEndOfPackets. The user may still seek backwards through the stream, however, so the renderer should not deallocate resources at this point.

  2. When the stream ends, the client informs the renderer by calling IRMARenderer::EndStream. The renderer then deallocates resources as necessary. At this point the renderer can no longer access the stream object, but it can still paint to a window and handle interaction such as mouse clicks.

  3. The client destroys the renderer object when a new stream begins.

Accessing Stream and Player Objects

A rendering plug-in has access to several objects that it can use to gather stream information and affect the presentation. During initialization, the client passes the rendering plug-in pointers to the player and stream objects. The renderer can then use IRMAPlayer to access client player functions (see "Chapter 20: Top-Level Client") and IRMAStream to get stream information (see below).

Using the Stream Object

The IRMAStream interface gives the rendering plug-in access to the stream object for a stream it is rendering. The following methods let the plug-in get information about the stream as necessary:

The following IRMAStream methods allow the renderer to report or modify the level of service:

Using the Stream Source Object

A rendering plug-in can call IRMAStream::GetSource to get a pointer to the stream source object, which is the file object from which the rendered stream or streams are generated. The renderer then has access to the following IRMAStreamSource methods:

Rendering Images on the Client Display

For the RealSystem client to display images, a renderer must draw them on the client's target display area. The renderer can do this using platform-specific functions (making it a "Windowed Renderer"), or it can use platform-independent RealSystem functions (making it a "Windowless Renderer").

Additional Information
See also "Chapter 12: Sites (Windowing)".

RealSystem functions are much more convenient than platform-specific functions. To use platform-specific functions, for example, the renderer must handle the PAINT message appropriate for each platform. To use RealSystem functions, however, the renderer only needs to handle the cross-platform RMA_SURFACE_UPDATE message.

The RMA client core provides this device-independent windowing functionality to renderers. Each drawing area owned by the client is supplied as a "site." To send display data to the site, the renderer registers as a "site user."

The site's drawing area, or "video surface," is made available to the rendering plug-in through an IRMAVideoSurface interface. This interface supports a variety of image formats and is uniform across all platforms.

The renderer, in turn, implements the IRMASiteUser interface so that it can receive RMA event messages (RMA_SURFACE_UPDATE and others) from the client, and so it can attach, monitor, and detach sites.

The client core also implements the IRMASite2 interface, and the renderer calls the IRMASite2::GetVideoSurface method to get an IRMAVideoSurface interface for the site. This IRMAVideoSurface interface, also implemented by the client core, is used by the renderer to:

The client calls the IRMASiteUser methods whenever the renderer must respond to events, whether the RMA_SURFACE_UPDATE message (the surface must be redrawn) or mouse events. The client also uses the IRMASiteUser interface to attach sites to the renderer as needed for new image data streams, and to detach sites when rendering is complete.

Once the rendering plug-in has obtained the IRMAVideoSurface for the client site, it is ready to draw images. To do this, the renderer:

  1. Calls IRMAVideoSurface::BeginOptimizedBlt to set the drawing format, such as the size, compression type, and so on.

  2. Calls IRMAVideoSurface::OptimizedBlt on the video surface, passing the image data to be displayed. The RealSystem core draws the bitmap on the client's screen.

  3. Responds as needed to the RMA_SURFACE_UPDATE event message, or to any of the various RMA_MOUSE_EVENTS (see rmaevent.h).

  4. Calls IRMAVideoSurface::EndOptimizedBlt when the drawing is complete.
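The blit sequence in steps 1, 2, and 4 can be sketched as follows. The IRMAVideoSurface here is a simplified, hypothetical stand-in; the real interface in rmavsurf.h takes bitmap-info and rectangle parameters.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical, simplified stand-in for IRMAVideoSurface that records
// the call sequence instead of drawing.
struct IRMAVideoSurface {
    std::vector<std::string> calls;

    long BeginOptimizedBlt(int /*width*/, int /*height*/) {
        calls.push_back("BeginOptimizedBlt");   // set the drawing format
        return 0;
    }
    long OptimizedBlt(const unsigned char* /*pImageData*/) {
        calls.push_back("OptimizedBlt");        // core draws to the screen
        return 0;
    }
    long EndOptimizedBlt() {
        calls.push_back("EndOptimizedBlt");     // finished drawing
        return 0;
    }
};

// Sketch of the ordering: set the format once, blit each frame, then end
// the optimized blit when drawing is complete.
void RenderFrames(IRMAVideoSurface* pSurface,
                  const std::vector<const unsigned char*>& frames)
{
    pSurface->BeginOptimizedBlt(320, 240);   // illustrative frame size
    for (const unsigned char* frame : frames)
        pSurface->OptimizedBlt(frame);
    pSurface->EndOptimizedBlt();
}
```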

To get a better understanding of rendering to RealSystem's RMAVideoSurface, study the sample rendering plug-in located at samples/intermed/exvsrndr/exvsrndr.cpp. This example clearly demonstrates the use of the video surface and its supporting interfaces. You will also want to refer to rmavsurf.h, rmawin.h, and rmasite2.h.

Providing Recording Information to the Client

A RealSystem client can record all or part of a presentation for local playback on the client machine. To support this feature, the renderer must implement IRMAPacketHookHelper and provide packets for recording to the client. In some cases this may simply mean handing the packets it receives to the IRMAPacketHookHelperResponse interface. In most cases, though, the renderer must manipulate the data and create new packets. This is because the recording may start after the presentation has begun and hence have a different timeline.

Additional Information
See "Recording a Presentation".

  1. The client core calls IRMAPacketHookHelper::StartHook when the recording begins. This method passes the renderer a pointer to the response object, gives it the number of the stream being recorded, and provides the timeline offset.

    The timeline offset is the number of milliseconds into the presentation timeline that the recording began. If the value is 1000, for example, the recording started one second after the presentation began. The renderer is responsible for adjusting the packet timing information to this offset so that the recorded presentation's timeline begins at the offset value.

  2. The renderer starts creating IRMAPacket objects that contain the presentation data, as well as values for the stream number and delivery time. The renderer is responsible for sending any necessary data that arrived before the recording started. This may include, for example, video keyframes or vector layout information.

    Additional Information
    The file format plug-in section discusses packet creation. See also "Using IRMAPacket to Create Stream Packets" for the basics of packet creation.

  3. For each packet, the renderer calls IRMAPacketHookHelperResponse::OnPacket to pass the response object a pointer to the packet object.

  4. The renderer continues to deliver packets until one of the following occurs:

Timing and Synchronization

In RealSystem, the client sends time synchronization intervals to renderers on a per renderer rather than per stream basis. This is because RealSystem supports container datatypes in which multiple renderers read from the same source. On start-up, each renderer requests a time synchronization interval in milliseconds using the unInitialGranularity attribute. The client then calls IRMARenderer::OnTimeSync each interval.

Note
Audio Services, when used, sets the interval to the shortest interval requested by any rendering plug-in. These services are explained in "Chapter 13: Audio Services".

The timestamps assigned to packets are relative to the start time of the stream. Packet times are adjusted by the stream start time and the preroll:


delivery time = packet time - stream preroll + stream start

For example, if stream 1 has a preroll of 3000 milliseconds and a start time of 180000 milliseconds (3 minutes), and the first packets are time stamped 0, 500, and 1000 milliseconds, these packets are delivered to your rendering plug-in at 177000, 177500, and 178000 milliseconds.
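The formula and the worked example above can be checked with a one-line helper (the names are illustrative, not SDK functions):

```cpp
#include <cassert>

// delivery time = packet time - stream preroll + stream start
// All times are in milliseconds.
long DeliveryTime(long lPacketTime, long lStreamPreroll, long lStreamStart)
{
    return lPacketTime - lStreamPreroll + lStreamStart;
}
```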

How the rendering plug-in renders the data is a detail of your datatype. You can implement a scheme whereby you pass the render times down with the packets, or you can use some offset from the delivery time. Suppose that you have two distinct data items, item A and item B, in your presentation. Item A is to be displayed 5 seconds into the presentation and item B is to be displayed 10 seconds in. The information that item A must be displayed at 5 seconds is something you should include in your datatype packet payload, just as you need to store "packetization" information such as "this packet is part N of M of item A." You then have two options: use the "delivery time" values and set a preroll to give your plug-in computation time, or place rendering time information in your opaque data.

"Duration" for non-time-based object streams, such as still images, is also up to your datatype. A timeline-oriented tool would allow the user to specify the image display time. A powerful tool would allow the user to specify that an image display after a second image has been downloaded. You could, for example support downloading several images while the first one is displayed, then quickly flip through the downloaded image. This is all specific to your datatype implementation.

Modifying the Rendering Plug-in Sample Code

The RealSystem SDK includes sample rendering plug-ins that you can use as a starting point for building your own plug-in:

Perform the following steps to change the intermediate or advanced sample renderer code. These steps assume your company name is "Foo Bar, Inc.", your file extension is .foo, and the MIME type of your data stream is application/x-foobar.

  1. Copy the sample code from the samples directory to a working directory. Change the file name and class names to match your file format name. For example, if you are implementing a Foo data type you might replace all occurrences of CExampleRenderer with CFooRenderer, and rename the files foorendr.cpp and foorendr.h.

  2. Change the plug-in description, copyright information, and more info URL stored in zm_pDescription, zm_pCopyright, and zm_pMoreInfoURL. For the Foo example, you could change the values as follows:
    
    char* CFooRenderer::zm_pDescription = "Foo Rendering Plug-in";
    char* CFooRenderer::zm_pCopyright   = "(c)1997 Foo Bar";
    char* CFooRenderer::zm_pMoreInfoURL = "http://www.foobar.com";

  3. Change the stream MIME types stored in zm_pStreamMimeTypes. For the Foo example, you could change the values as follows:
    
    char* CFooRenderer::zm_pStreamMimeTypes = "application/x-foobar";
    

  4. Process the RealSystem packets and render the data as necessary.

  5. Compile, debug, and test your plug-in.

    Additional Information
    "Compiling a Plug-In".


Copyright © 2000 RealNetworks
For technical support, please contact supportsdk@real.com.
This file last updated on 05/17/00 at 12:50:18.