After building a file format plug-in, you create a rendering plug-in that interprets the stream of RealSystem packets created by the file format plug-in. Using this rendering plug-in, the client can play back your datatype whether it is streamed from RealServer, pseudo-streamed through HTTP, or accessed from a local file. Content tools that need to reassemble RealSystem streams may also use your rendering plug-in.
The rendering plug-in takes the RealSystem packet stream and, for audio data, decodes and sends the data to the client's Audio Services interface. For video, image, and text data, the plug-in decodes the data and writes to a client window. A rendering plug-in can also direct the Web browser on the client to display URLs (hyperlinks) at specified times in the stream. This is called hypernavigation.
In addition to the general plug-in design considerations discussed in "Designing a Plug-In", keep the following points in mind as you develop your rendering plug-in:
A rendering plug-in typically implements the following interfaces:
IRMAPlugin (header file: rmaplugn.h)
Every plug-in implements this interface, which RealSystem uses to determine the plug-in's characteristics.
IRMARenderer (header file: rmarendr.h)
All rendering plug-ins must implement this interface, which handles header and packet reception, as well as stream status information.
IRMASiteUser (header file: rmawin.h)
A display renderer typically implements this interface, which the client uses to associate the renderer with a site object and to inform it of events.
Additional Information: See "Chapter 12: Sites (Windowing)".
IRMASiteUserSupplier (header file: rmawin.h)
This interface is used to get instances of the objects that use sites. A rendering plug-in or the client can implement this interface.
IRMAPacketHookHelper (header file: rmaplugn.h)
This interface instructs the plug-in to send packets used to record the presentation on the client computer.
A rendering plug-in typically uses the following interfaces:
IRMASite2 (header file: rmasite2.h)
The client core implements this interface, which the renderer uses to obtain a reference to the client's video surface.
IRMAVideoSurface (header file: rmavsurf.h)
The client core implements this interface, and the rendering plug-in uses it to draw on the client's display. This interface offers device-independent methods for image rendering.
The following sections explain how the RealPlayer and a rendering plug-in use the RealSystem interfaces to render data. The sample files included with this SDK illustrate many of these features. You can use these sample files as a starting point for building your own plug-in. Refer to the RealSystem SDK header files for more information on function variables and return values.
When the RealSystem client starts up, it loads each rendering plug-in and performs the following actions:
The client calls RMACreateInstance to create a new instance of the renderer. The client calls this method at start-up and each time it receives a stream to be rendered by the plug-in.
Additional Information: See "Creating a Plug-In Instance" for more on this method.
The client calls the plug-in's IRMAPlugin::GetPluginInfo method, which returns descriptive information about the plug-in, including its copyright and "more information" URL. The bLoadMultiple attribute should be set to TRUE to enable the client to launch multiple instances of the plug-in in separate processes.
The client calls IRMARenderer::GetRendererInfo, which returns functional information about the renderer:
pStreamMimeTypes indicates which stream MIME types the renderer handles. The client uses this information when determining which renderer to use for a stream.
unInitialGranularity indicates how often the renderer wants to receive timeline synchronization information from the client. The minimum interval is 20 milliseconds.
Additional Information: See "Timing and Synchronization".
When the RealSystem client receives a stream, it identifies the appropriate rendering plug-in based on the stream's MIME type and the pStreamMimeTypes values returned by the rendering plug-ins during start-up. If two or more plug-ins handle the same MIME type, the client uses the first plug-in for that MIME type that it loaded during start-up. The following actions occur during rendering plug-in initialization:
The client calls the plug-in's IRMAPlugin::InitPlugin method, passing it a pointer to the system context. The plug-in can use this method to perform any necessary initialization procedures. It should also use the context pointer to store a reference to IRMACommonClassFactory so that it can later create the RealSystem objects used in rendering data.
The client calls IRMARenderer::StartStream to give the plug-in access to the client through IRMAPlayer and to the stream to be rendered through IRMAStream. Within IRMARenderer::StartStream, the renderer can perform any other initialization functions based on its supported features:
Additional Information: See "Chapter 12: Sites (Windowing)".
Set up IRMABackChannel communications to the file format plug-in.
Query the stream for IRMAASMStream to determine whether ASM is supported. If so, the renderer can use IRMAASMStream::AddStreamSink to set itself up to receive the client's rule subscription choices. If it receives RTP payloads, the renderer must do this to find out whether the RTP payload marker bit is on or off.
Additional Information: See "Chapter 11: Adaptive Stream Management" for information on ASM. The marker bit property is described in "RTP Marker Bit Property". For more on RTP payloads, see "Supporting Multiple Packet Formats".
The client calls IRMARenderer::OnHeader to pass the renderer a pointer to the stream header object created by the file format plug-in. The renderer uses IRMAValues methods to retrieve the header data and then releases the object.
The client calls IRMARenderer::GetDisplayType to get the renderer's preferred display type. The renderer returns RMA_DISPLAY_WINDOW if it renders data on the screen, or RMA_DISPLAY_NONE if it does not use a screen because it is, for example, an audio renderer.
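The renderer-selection rule described above (the first loaded plug-in whose MIME-type list matches wins) can be sketched as follows. The LoadedRenderer and SelectRenderer names are illustrative only, not SDK types — the real client keeps this mapping internally:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical record of one loaded rendering plug-in: a name plus the
// pStreamMimeTypes list it returned from IRMARenderer::GetRendererInfo.
struct LoadedRenderer {
    std::string name;
    std::vector<std::string> mimeTypes;
};

// Scan the plug-ins in load order and return the first one whose
// MIME-type list contains the stream's MIME type; empty if none match.
std::string SelectRenderer(const std::vector<LoadedRenderer>& loaded,
                           const std::string& streamMimeType)
{
    for (const LoadedRenderer& r : loaded) {
        for (const std::string& mime : r.mimeTypes) {
            if (mime == streamMimeType)
                return r.name;  // first loaded plug-in wins
        }
    }
    return "";
}
```

With two plug-ins registered for application/x-foobar, the one loaded first is always chosen, which is why a custom datatype should declare a MIME type that no other renderer claims.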
The client and rendering plug-in use the following methods to render a data stream after the renderer has been initialized:
The client calls the IRMARenderer::OnBegin method to inform the renderer that playback has begun or has resumed after a pause. It passes the renderer the stream's timeline value in milliseconds. This value is zero if the stream is just starting. When resuming after a pause, the client passes an integer value that indicates how many milliseconds into the stream timeline playback resumes.
The client calls IRMARenderer::OnPacket each time a packet is ready (or should be ready but has been lost), passing it a pointer to the IRMAPacket object to be rendered. The method includes the packet's time offset from the start of the stream.
The renderer calls the IRMAPacket::IsLost method to determine whether the packet has been lost. The rendering plug-in is responsible for taking the appropriate action to handle the packet loss.
The client calls IRMARenderer::OnBuffering to inform the renderer of data buffering. The method includes the reason for buffering and the percentage complete.
The client calls the IRMARenderer::OnTimeSync method periodically, at the interval set by the unInitialGranularity value the renderer returned at client start-up. The method passes the current playback time, which the renderer uses to synchronize playback of its stream with the presentation.
Additional Information: See "Timing and Synchronization".
Audio Services provides a device-independent, cross-platform interface that allows multiple renderers to share the audio device. It also allows access to the data sent to the audio device. This lets the plug-in add audio effects or perform other processing.
Additional Information: See "Chapter 13: Audio Services".
RealSystem site interfaces let the rendering plug-in perform windowing actions on multiple platforms through generic code.
Additional Information: See "Chapter 12: Sites (Windowing)".
The renderer can render data in IRMARenderer::OnPacket or store packets until IRMARenderer::OnTimeSync is called. (Because COM reference counting manages object lifetimes, storing packets does not require copying the data.) Renderers that use Audio Services, for example, can render in IRMARenderer::OnPacket and let Audio Services perform the timeline synchronization. Video renderers typically render in IRMARenderer::OnTimeSync.
If the renderer stores packets, it must call IUnknown::Release when finished with the data. The renderer provided with the SDK sample code, for example, keeps only the last packet.
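The store-until-OnTimeSync approach can be sketched with a simplified queue. The Packet struct and PacketQueue class below are illustrative stand-ins, not SDK types; a real renderer would hold AddRef'd IRMAPacket pointers rather than copies:

```cpp
#include <cassert>
#include <cstdint>
#include <deque>
#include <vector>

// Simplified stand-in for IRMAPacket: a timestamp plus a payload id.
struct Packet {
    std::uint32_t timeMs;
    int id;
};

// Minimal sketch of a video-style renderer that queues packets in
// OnPacket and renders the due ones when OnTimeSync arrives.
class PacketQueue {
public:
    void OnPacket(const Packet& p) { m_pending.push_back(p); }

    // Returns the ids of all packets whose timestamp is <= the current
    // presentation time, in arrival order; later packets stay queued.
    std::vector<int> OnTimeSync(std::uint32_t nowMs) {
        std::vector<int> rendered;
        while (!m_pending.empty() && m_pending.front().timeMs <= nowMs) {
            rendered.push_back(m_pending.front().id);
            m_pending.pop_front();
        }
        return rendered;
    }

private:
    std::deque<Packet> m_pending;
};
```

On each time sync the queue hands back exactly the packets that have come due, which is the behavior a renderer that defers drawing to OnTimeSync wants.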
During a presentation, the user may use the client's seek function to move the presentation to a new point in its timeline. In this case, the client performs the following actions:
The client calls IRMARenderer::OnPreSeek, passing the last stream time value before the seek and the new time value for when the seek completes.
The client calls IRMARenderer::OnPacket and IRMARenderer::OnBuffering to pass the renderer any buffered packets that post-date the seek action. Although the renderer should not render this data, it has access to the packets for any purpose. It can simply release the packets if it has no need for them.
The client calls IRMARenderer::OnPostSeek. The call includes the time in the stream timeline when the seek occurred and the new timeline value following the seek.
The client calls IRMARenderer::OnPacket, IRMARenderer::OnBuffering, and IRMARenderer::OnTimeSync to pass the renderer packets for data beginning at the new point in the presentation timeline and to provide synchronization information.
Additional Information: See "Timing and Synchronization".
The client performs the following actions if the user pauses the RealSystem presentation:
The client calls IRMARenderer::OnPause, providing the renderer with the stream's time value in milliseconds just before the pause.
The client calls IRMARenderer::OnBegin to inform the renderer that playback has resumed after the pause. It passes the renderer an integer value that indicates the stream's time value in milliseconds after the pause.
Hypernavigation occurs when a rendering plug-in directs the client to display a URL at a specified time in the stream. When the plug-in issues a hypernavigation request, the default Web browser opens. If the browser is open already, the target URL displays in the current window. The plug-in can also specify that the URL display in a specific frame of the current browser window.
A rendering plug-in hypernavigates with IRMAHyperNavigate::GoToURL. The function takes two parameters: a fully qualified URL and a frame target (NULL for no frame target). The following sample code shows a hypernavigation request that does not target a frame:
m_pHyperNavigate->GoToURL("http://www.real.com", NULL);
If the renderer's corresponding file format plug-in implements IRMABackChannel, the renderer can send the file format plug-in data in an IRMAPacket. This can be any feedback or control data that is opaque to the RealSystem architecture and is necessary for plug-in operation.
The rendering plug-in queries for IRMABackChannel on the IRMAStream interface. It then calls the IRMABackChannel::PacketReady method to pass the file format plug-in a pointer to the packet.
Additional Information: See "Using IRMAPacket to Create Stream Packets" for the basics of packet creation.
The following actions occur at the end of a presentation:
The client calls IRMARenderer::OnEndOfPackets. The user may still seek backwards through the stream, however, so the renderer should not deallocate resources at this point.
The client calls IRMARenderer::EndStream. The renderer then deallocates resources as necessary. At this point the renderer can no longer access the stream object, but it can still paint to a window and handle interaction such as mouse clicks.
A rendering plug-in has access to several objects that it can use to gather stream information and affect the presentation. During initialization, the client passes the rendering plug-in pointers to the player and stream objects. The renderer can then use IRMAPlayer to access client player functions (see "Chapter 20: Top-Level Client") and IRMAStream to get stream information (see below).
The IRMAStream interface gives the rendering plug-in access to the stream object for a stream it is rendering. The following methods let the plug-in get information about the stream as necessary:
IRMAStream::GetStreamNumber
Returns the number of the stream within its source.
IRMAStream::GetStreamType
Returns the stream's MIME type.
IRMAStream::GetHeader
Returns a pointer to the IRMAValues interface containing the stream header information. The plug-in also receives this pointer during initialization through IRMARenderer::OnHeader.
The following IRMAStream methods allow the renderer to report or modify the level of service:
IRMAStream::ReportQualityOfService
Because the renderer can best judge how problems such as packet loss are affecting the presentation, it can call this method to report a change in service quality. The function passes an integer value denoting the relative level of service, with 0 as the worst possible. When the problem has been eliminated, the renderer reports service quality of 100. The client may report this information in the user interface.
IRMAStream::ReportRebufferStatus
The renderer uses this method to report that available data has dropped critically low. The function takes two values, the number of packets needed to render the presentation smoothly and the number currently available. For example, it calls the function with "5,0" if it needs five packets and none are available. In this case it continues to pass its status ("5,1", "5,2" and so on) until it receives all the packets it needs and calls the function with "5,5".
IRMAStream::SetGranularity
During start-up, the plug-in returns a value for unInitialGranularity through IRMARenderer::GetRendererInfo. This value sets how frequently the plug-in wants to receive timeline synchronization information from the client. The plug-in can later use IRMAStream::SetGranularity to change this timing granularity.
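The rebuffering report sequence described for IRMAStream::ReportRebufferStatus above ("5,0", "5,1", and so on up to "5,5") can be sketched with a small helper. RebufferReports is a hypothetical name used only for illustration; the real method takes the two values as separate parameters on each call:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Produce the sequence of "needed,available" status strings a renderer
// would report while waiting for data, one report per packet received,
// ending when it has everything it needs.
std::vector<std::string> RebufferReports(unsigned needed, unsigned available)
{
    std::vector<std::string> reports;
    for (unsigned n = available; n <= needed; ++n)
        reports.push_back(std::to_string(needed) + "," + std::to_string(n));
    return reports;
}
```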
Note: The IRMAStream::GetRendererCount and IRMAStream::GetRenderer methods are currently not implemented because the RealSystem architecture does not yet support multiple renderers per stream.
Additional Information: See "Timing and Synchronization".
A rendering plug-in can call IRMAStream::GetSource to get a pointer to the stream source object, which is the file object from which the rendered stream or streams are generated. The renderer then has access to the following IRMAStreamSource methods:
IRMAStreamSource::IsLive
Determine if the stream is a live broadcast.
IRMAStreamSource::GetPlayer
Get the interface to the IRMAPlayer object. The plug-in also receives this pointer during initialization through IRMARenderer::StartStream.
IRMAStreamSource::GetURL
Get the requested URL for the stream source.
IRMAStreamSource::GetStreamCount
Return the number of streams supported by the source.
IRMAStreamSource::GetStream
Return a pointer to a stream.
For the RealSystem client to display images, a renderer must draw them on the client's target display area. The renderer can do this using platform-specific functions (making it a "windowed renderer"), or it can use platform-independent RealSystem functions (making it a "windowless renderer").
Additional Information: See also "Chapter 12: Sites (Windowing)".
RealSystem functions are much more convenient than platform-specific functions. To use platform-specific functions, for example, the renderer must handle the PAINT message appropriate to each platform. To use RealSystem functions, the renderer only needs to handle the cross-platform RMA_SURFACE_UPDATE message.
The RMA client core provides this device-independent windowing functionality to renderers. Each drawing area owned by the client is supplied as a "site." To send display data to the site, the renderer registers as a "site user."
The site's drawing area, or "video surface," is made available to the rendering plug-in through an IRMAVideoSurface interface. This interface supports a variety of image formats and is uniform across all platforms.
The renderer, in turn, implements the IRMASiteUser interface so that it can receive RMA event messages (RMA_SURFACE_UPDATE and others) from the client, and so that it can attach, monitor, and detach sites.
The client core also implements the IRMASite2 interface, and the renderer calls the IRMASite2::GetVideoSurface method to get an IRMAVideoSurface interface for the site. This IRMAVideoSurface interface, also implemented by the client core, is used by the renderer to set the image format (see rmavsurf.h for the supported formats) and to draw image data on the surface.
In addition, the renderer must implement the IRMASiteUser interface, since its methods are called by the client whenever the renderer must respond to events: either the RMA_SURFACE_UPDATE message (the surface must be redrawn) or mouse events. The client also uses the IRMASiteUser interface to attach sites to the renderer as needed for new image data streams, and to detach sites when rendering is complete.
Once the rendering plug-in has obtained the IRMAVideoSurface for the client site, it is ready to draw images. To do this, the renderer:
Calls IRMAVideoSurface::BeginOptimizedBlt to set the drawing format, such as the size, compression type, and so on.
Calls IRMAVideoSurface::OptimizedBlt on the video surface, passing the image data to be displayed. The RealSystem core draws the bitmap on the client's screen.
Handles the RMA_SURFACE_UPDATE event message, or any of the various RMA_MOUSE_EVENTS (see rmaevent.h).
Calls IRMAVideoSurface::EndOptimizedBlt when the drawing is complete.
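The Begin/Blt/End calling sequence above can be sketched with a mock surface. The real IRMAVideoSurface methods take bitmap format structures and pixel buffers, so the simplified signatures below are assumptions made purely to show the ordering:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Mock of the video-surface calls named above. Each call just records
// its name so the Begin/Blt/End ordering can be checked; the real
// interface passes format and bitmap data instead.
class MockVideoSurface {
public:
    void BeginOptimizedBlt() { m_calls.push_back("BeginOptimizedBlt"); }
    void OptimizedBlt(int frameId) {
        (void)frameId;  // image data omitted in this sketch
        m_calls.push_back("OptimizedBlt");
    }
    void EndOptimizedBlt() { m_calls.push_back("EndOptimizedBlt"); }
    const std::vector<std::string>& Calls() const { return m_calls; }

private:
    std::vector<std::string> m_calls;
};

// Drawing-loop shape: set the format once, blt each frame as it becomes
// ready, then tear down when the stream ends.
void DrawFrames(MockVideoSurface& surface, int frameCount)
{
    surface.BeginOptimizedBlt();
    for (int i = 0; i < frameCount; ++i)
        surface.OptimizedBlt(i);
    surface.EndOptimizedBlt();
}
```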
To get a better understanding of rendering to RealSystem's IRMAVideoSurface, study the sample rendering plug-in located at samples/intermed/exvsrndr/exvsrndr.cpp. This example clearly demonstrates the use of the video surface and its supporting interfaces. You will also want to refer to rmavsurf.h, rmawin.h, and rmasite2.h.
A RealSystem client can record all or part of a presentation for local playback on the client machine. To support this feature, the renderer must implement IRMAPacketHookHelper
and provide packets for recording to the client. In some cases this may simply mean handing the packets it receives to the IRMAPacketHookHelperResponse
interface. In most cases, though, the renderer must manipulate the data and create new packets. This is because the recording may start after the presentation has begun and hence have a different timeline.
Additional Information: See "Recording a Presentation".
The client calls IRMAPacketHookHelper::StartHook when the recording begins. This method passes the renderer a pointer to the response object, gives it the number of the stream being recorded, and provides the timeline offset.
The timeline offset is the number of milliseconds into the presentation timeline that the recording began. If the value is 1000, for example, the recording started one second after the presentation began. The renderer is responsible for adjusting the packet timing information to this offset so that the recorded presentation's timeline begins at the offset value.
The renderer creates IRMAPacket objects that contain the presentation data, as well as values for the stream number and delivery time. The renderer is responsible for sending any necessary data that arrived before the recording started. This may include, for example, video keyframes or vector layout information.
Additional Information: The file format plug-in section discusses packet creation. See also "Using IRMAPacket to Create Stream Packets" for the basics of packet creation.
The renderer calls IRMAPacketHookHelperResponse::OnPacket to pass the response object a pointer to each packet object.
When it has sent all packets, the renderer calls IRMAPacketHookHelperResponse::OnEndOfPackets.
The client calls IRMAPacketHookHelper::StopHook to notify the renderer that the user has stopped recording.
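The timestamp adjustment implied by the timeline offset in the recording steps above can be sketched as a one-line helper. The clamp-to-zero policy for data that arrived before the recording began (keyframes, layout information) is an assumption of this sketch, as is the function name:

```cpp
#include <cassert>
#include <cstdint>

// Restamp a packet for local recording so the recorded presentation's
// timeline starts at zero. offsetMs is the presentation time at which
// recording began (the StartHook timeline offset); packets from before
// that point are clamped to time zero in this sketch.
std::uint32_t AdjustRecordedTimestamp(std::uint32_t packetTimeMs,
                                      std::uint32_t offsetMs)
{
    return packetTimeMs > offsetMs ? packetTimeMs - offsetMs : 0;
}
```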
Note: You need RealPlayer Plus G2 to test this feature.
In RealSystem, the client sends time synchronization intervals to renderers on a per-renderer rather than per-stream basis, because RealSystem supports container datatypes in which multiple renderers read from the same source. At start-up, each renderer requests a time synchronization interval in milliseconds using the unInitialGranularity attribute. The client then calls IRMARenderer::OnTimeSync at each interval.
Note: Audio Services, when used, sets the interval to the shortest interval requested by any rendering plug-in. These services are explained in "Chapter 13: Audio Services".
The timestamps assigned to packets are relative to the start time of the stream. Packet times are adjusted by the stream start time and the preroll:
delivery time = packet time - stream preroll + stream start
For example, if stream 1 has a preroll of 3000 milliseconds and a start time of 180000 milliseconds (3 minutes), and the first packets are time-stamped 0, 500, and 1000 milliseconds, these packets are delivered to your rendering plug-in at 177000, 177500, and 178000 milliseconds.
How the rendering plug-in renders the data is a detail of your datatype. You can implement a scheme whereby you pass the render times down with the packets, or you can use some offset from the delivery time. Suppose that you have two distinct data items, item A and item B, in your presentation. Item A is to be displayed 5 seconds into the presentation and item B is to be displayed 10 seconds in. The information that item A is to be displayed at 5 seconds is something you should include in your datatype packet payload, just as you need to store "packetization" information such as "this packet is part N of M of item A." You have two options: you can use the "delivery time" values and set a preroll to give your plug-in computation time, or you can place rendering time information in your opaque data.
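The delivery-time arithmetic above can be checked with a small helper that mirrors the formula exactly (the function name is illustrative, not an SDK call):

```cpp
#include <cassert>
#include <cstdint>

// delivery time = packet time - stream preroll + stream start
// (all values in milliseconds, as in the formula above).
std::int64_t DeliveryTimeMs(std::int64_t packetTimeMs,
                            std::int64_t prerollMs,
                            std::int64_t startMs)
{
    return packetTimeMs - prerollMs + startMs;
}
```

Running it against the example values (preroll 3000 ms, start 180000 ms) reproduces the delivery times 177000, 177500, and 178000 ms for packets stamped 0, 500, and 1000 ms.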
"Duration" for non-time-based object streams, such as still images, is also up to your datatype. A timeline-oriented tool would allow the user to specify the image display time. A more powerful tool would allow the user to specify that an image display after a second image has been downloaded. You could, for example, support downloading several images while the first one is displayed, then quickly flip through the downloaded images. This is all specific to your datatype implementation.
The RealSystem SDK includes sample rendering plug-ins that you can use as a starting point for building your own plug-in:
/samples/intro/ff1rendr/ff1rendr.cpp
A basic rendering plug-in that renders data streamed from its corresponding, basic file format plug-in. Build these sample files to learn basic concepts about rendering RealSystem packets.
/samples/intermed/exrender/exrender.cpp
This intermediate sample file renders data from the sample file format plug-in /samples/intermed/exffplin/exffplin.cpp to a window. It is designed to support multiwindow rendering.
/samples/intermed/exwnrndr/exwnrndr.cpp
This intermediate sample file is designed to support single-window rendering without the use of the IRMAMultiInstanceSiteUserSupplier
object. It also illustrates how to create and manage child window controls such as buttons.
/samples/advanced/exrender/exrender.cpp
This advanced sample file performs the same functions as the intermediate sample renderer. It also shows how to create a new player to start a new timeline and play continuous background music, or open a different, unrelated URL, such as a stream for advertisement.
Note: The SDK also provides sample renderers that use the Audio Services interface. See "Modifying the Audio Rendering Sample Code".
Perform the following steps to change the intermediate or advanced sample renderer code. These steps assume your company name is "Foo Bar, Inc.", your file extension is .foo, and the MIME type of your data stream is application/x-foobar.
Replace all occurrences of CExampleRenderer with CFooRenderer, and rename the files foorendr.cpp and foorendr.h.
Change the values for zm_pDescription, zm_pCopyright, and zm_pMoreInfoURL. For the Foo example, you could change the values as follows:
char* CFooRenderer::zm_pDescription = "Foo Rendering Plug-in";
char* CFooRenderer::zm_pCopyright = "(c)1997 Foo Bar";
char* CFooRenderer::zm_pMoreInfoURL = "http://www.foobar.com";
Change the value for zm_pStreamMimeTypes. For the Foo example, you could change the value as follows:
char* CFooRenderer::zm_pStreamMimeTypes = "application/x-foobar";
Additional Information: See "Compiling a Plug-In".