RSX 3D
Here are the steps to take to add sound to your application. The methods discussed in this section are those most commonly used and most easily applied. To add sound to your application, you need to create a listener and at least one emitter; the steps below show you how.
To achieve even more realistic sound, you can add effects such as reverberation and the Doppler effect to your application; both are described later in this section.
Before using this library, you must call the Component Object Model (COM) initialization function CoInitialize. Typically, you do this during application initialization. When you terminate the application, you must call CoUninitialize, provided CoInitialize succeeded. To use these standard OLE services, you must link your application with the OLE32.LIB library.
NOTE. For clarity, the sample code shown here uses C++ syntax.
/*
// Initialize the COM library
*/
HRESULT hr;
hr = CoInitialize(NULL);    /* or use the OLE function OleInitialize */
. . .
if (SUCCEEDED(hr)) {
    /*
    // Tell the COM library we are all done
    */
    CoUninitialize();       /* or use the OLE function OleUninitialize */
} /* if */
Using the RSX 3D library requires creating the RSX object and requesting either a pointer to an IRSX2 interface or an IUnknown interface. You do this by calling CoCreateInstance, and specifying the class ID for RSX, CLSID_RSX20, and the interface ID for the IRSX2 interface, IID_IRSX2 or IID_IUnknown. You can find the definition for IRSX2 in rsx.h. To use IID_IUnknown make sure you link against uuid.lib.
/*
// Specify the class ID for RSX and the interface ID for the
// IRSX2 interface
*/
IUnknown* lpRSXUnk;
IRSX2* lpRSX;

hr = CoCreateInstance(CLSID_RSX20, NULL, CLSCTX_INPROC_SERVER,
                      IID_IRSX2, (void**)&lpRSX);
hr = CoCreateInstance(CLSID_RSX20, NULL, CLSCTX_INPROC_SERVER,
                      IID_IUnknown, (void**)&lpRSXUnk);
If you request the IID_IRSX2 interface, you can call the methods defined on that interface, such as SetReverb. The pointer returned from a request for the IID_IUnknown interface is required later: it is passed as a parameter to the Initialize function of listener and emitter objects.
If the initialization is successful and the call to CoCreateInstance returns a non-NULL interface pointer, the application may proceed to use the services of the RSX 3D library. The following section highlights the key features of the library for simple integration with 3D graphics libraries.
You can specify the environment for audio rendering with the IRSX2 interface's SetEnvironment method. To do this, use the dwFlags member of the RSXENVIRONMENT structure to indicate which features you would like to adjust.
NOTE. The parameter cbSize specifies the size in bytes of an RSX data structure. RSX structures all have a cbSize member that you must initialize when using these data structures. RSX functions return an E_INVALIDARG error code when called with an incorrect cbSize.
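The cbSize convention is easy to mimic, and seeing it outside the SDK makes clear why the check matters: it catches callers compiled against a different structure layout. The following standalone sketch is not RSX code; the structure, function, and result codes are invented stand-ins for the pattern the note describes.

```cpp
#include <cstdint>

// Invented stand-ins for the HRESULT codes the text mentions.
const long MY_E_INVALIDARG = 0x80070057L;
const long MY_S_OK         = 0L;

// Hypothetical RSX-style structure: cbSize comes first and must be
// initialized by the caller to sizeof(the structure).
struct MyEnvironment {
    std::uint32_t cbSize;
    std::uint32_t dwFlags;
};

// A function following the RSX pattern: reject any structure whose
// cbSize member was not initialized correctly by the caller.
long SetMyEnvironment(const MyEnvironment* env) {
    if (env == nullptr || env->cbSize != sizeof(MyEnvironment))
        return MY_E_INVALIDARG;
    return MY_S_OK;   // a real implementation would apply the settings here
}
```

If the caller forgets to set cbSize (or was compiled against a structure of a different size), the call fails cleanly with the invalid-argument code instead of reading past the end of the structure.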
The following example demonstrates selecting a right-handed coordinate system.
/*
// Environment
*/
RSXENVIRONMENT rsxEnv;

/*
// Set right-handed coordinate system.
*/
rsxEnv.cbSize = sizeof(RSXENVIRONMENT);
rsxEnv.dwFlags = RSXENVIRONMENT_COORDINATESYSTEM;
rsxEnv.bUseRightHand = TRUE;
lpRSX->SetEnvironment(&rsxEnv);
A listener defines the audio view of a scene. By moving the listener, you can control audio based on the location and orientation of the audio view with respect to the sound emitters. A direct listener object allows RSX to output to a standard audio interface. If you have DirectSound II (or later) installed and can provide a handle to the application's main window, RSX 3D uses the DirectSound II interface. If the user of your application does not have DirectSound II installed, RSX 3D uses the Wave API. RSX 3D limits the number of listeners you can create to one.
To create a direct listener, create an RSXDIRECTLISTENER object and request a pointer to the IID_IRSXDirectListener interface; creating a direct listener is described in more detail below.
Along with the ability to create a direct listener, RSX 3D lets you define a streaming listener. Streaming listeners provide access to processed buffers from RSX 3D.
The IRSXListener abstract base class provides methods that let you set the position and orientation of a listener object and let you get information about a listener. They apply to both direct and streaming listeners.
Table 1. Abstract Base Class Listener Methods
This method | Does the following |
GetOrientation | Retrieves the direction and up vectors of the listener. |
GetPosition | Retrieves the x,y,z coordinates of the listener. |
GetUserData | Retrieves the user-defined data for the listener. |
SetOrientation | Sets the direction and up vectors of the listener. |
SetPosition | Sets the x,y,z coordinates of the listener. |
The IRSXDirectListener interface provides methods that let you attach, detach, or reattach the listener to the output device resource. The output device resource can be either DirectSound II or Wave API. The direct listener methods are in addition to the IRSXListener abstract base class methods, which let you set the position and orientation of either a direct or streaming listener object.
Table 2. Direct Listener Methods
This method | Does the following |
Connect | Reattaches the listener to the output device resource. |
Disconnect | Detaches the listener from the output device resource. |
When you set the camera position and orientation for the graphics rendering, you should also update the audio listener position and orientation. You communicate this information to RSX 3D through a listener object.
To create a direct listener, create an RSXDIRECTLISTENER object and request a pointer to the IID_IRSXDirectListener interface.
Note that you need to provide the window handle for the application's main window when you create a direct listener. If access to the application's main window is not available, specify NULL and RSX will not attempt to use DirectSound. (DirectSound requires a valid main window handle. If DirectSound rejects the window handle provided, RSX 3D uses the higher latency Wave API). If you want your application to use the default audio format selected by RSX, specify NULL for the lpwf field in the RSXDIRECTLISTENERDESC data structure.
The example below shows one technique for getting the main window handle, accessing the direct listener interface, and specifying the default sample rate for an MFC-based application.
/*
// Create a direct listener and save the IRSXDirectListener interface
*/
HRESULT hr;
IUnknown* lpRSX;
RSXDIRECTLISTENERDESC dlDesc;
IRSXDirectListener* lpDL;

dlDesc.cbSize = sizeof(RSXDIRECTLISTENERDESC);
dlDesc.hMainWnd = AfxGetMainWnd()->GetSafeHwnd();
dlDesc.dwUser = 0;
dlDesc.lpwf = NULL;    // Use the default Wave format

// Create a direct listener object and get the IRSXDirectListener interface
hr = CoCreateInstance(CLSID_RSXDIRECTLISTENER, NULL, CLSCTX_INPROC_SERVER,
                      IID_IRSXDirectListener, (void**)&lpDL);
. . .
Once you have created a listener you must initialize it before you can call any other functions on the IRSXDirectListener interface. Initializing the listener will actually connect the listener to the system audio device and attach the listener to an IRSX2 environment. The initialize call may fail if another application is currently using the audio device.
The example below demonstrates the Initialize function.
//
// All emitter and listener objects have an Initialize function.
// This function MUST be called before calling any other
// functions on the interface, and it may be called
// ONCE and ONLY ONCE. Subsequent calls to Initialize
// will fail. Calls to any other function before initializing
// will also fail.
// The second parameter to the Initialize call is always
// an IUnknown* to an RSX20 object.
// The first parameter in the direct listener call is
// the address of an RSXDIRECTLISTENERDESC.
//
hr = lpDL->Initialize(&dlDesc, lpRSX);
Once you have initialized a listener, you can specify its position and orientation at any time by calling IRSXListener's SetPosition and SetOrientation methods. You can retrieve these properties with corresponding Get methods. In general, any time the graphical camera position or orientation changes, you should also update the listener position and/or orientation.
In the following example, which shows how to create and set
the position and orientation of a direct listener, calls to
GraphicsLibraryxxx represent calls to an imaginary graphics
library. (The desired graphic function, SetCameraPosition or
SetCameraOrientation, replaces the symbol xxx.)
RSXVECTOR3D pos, dir, up;

pos.x = 0.0f;
pos.y = 0.0f;
pos.z = 0.0f;
lpDL->SetPosition(&pos);
GraphicsLibrarySetCameraPosition(&pos);

dir.x = 0.0f;
dir.y = 0.0f;
dir.z = 1.0f;
up.x = 0.0f;
up.y = 1.0f;
up.z = 0.0f;
lpDL->SetOrientation(&dir, &up);
GraphicsLibrarySetCameraOrientation(&dir, &up);
. . .
Emitters are the audio objects that populate a scene. There are two types of emitters: a high-level, file-based emitter called the cached emitter, and a real-time, buffer-based emitter called the streaming emitter. Each cached emitter is a single file-based sound source. RSX does not limit the number of emitters you can create. You are limited only by your computer's processing power and memory resources.
To create a cached emitter, create an RSXCACHEDEMITTER object and request a pointer to the IID_IRSXCachedEmitter interface; creating a cached emitter is described in more detail below.
Along with the ability to create cached emitters, RSX 3D lets you define streaming emitters. Streaming emitters let you process real-time data.
The IRSXEmitter abstract base class provides methods that let you set the position, orientation, pitch, state, model, and processor-loading budget for an emitter object and let you get information about an emitter. They apply to both cached and streaming emitters.
Table 3. Abstract Base Class Emitter Methods
This method | Does the following |
GetCPUBudget | Retrieves the emitter's budget for processing. |
GetModel | Retrieves the emitter's model parameters. |
GetMuteState | Retrieves the emitter's mute state. |
GetOrientation | Retrieves the emitter's orientation 3D vector. |
GetPitch | Retrieves the emitter's pitch adjustment. |
GetPosition | Retrieves the emitter's x,y,z coordinates. |
GetUserData | Retrieves the user-defined data for the emitter. |
QueryMediaState | Queries the state of the emitter. |
SetCPUBudget | Sets the processing budget for an emitter. |
SetModel | Updates the emitter's model parameters. |
SetMuteState | Sets the emitter's mute state. |
SetOrientation | Sets the emitter's orientation 3D vector. |
SetPitch | Specifies the emitter's pitch. |
SetPosition | Sets the x,y,z coordinates of the emitter. |
The IRSXCachedEmitter interface provides methods that let you control the play, pause, resume and stop functions of a cached emitter, and set and retrieve a cached file's length and playback marks. These methods are in addition to the IRSXEmitter abstract base class methods, which let you set the position, orientation, pitch, state, model, and budget for processing on a cached emitter or a streaming emitter object.
Table 4. Cached Emitter Methods
This method | Does the following |
ControlMedia | Performs standard media-control functions on the sound emitter. |
GetCacheTime | Retrieves the pre-load length for the emitter's file. |
GetMarkPosition | Retrieves the begin and end playback marks. |
SetCacheTime | Sets the pre-load length for the emitter's file. |
SetMarkPosition | Sets the begin and end playback marks. |
When you set the position and orientation for graphical objects, you should also update each audio emitter's position and orientation.
Adding sound to an environment is a straightforward process. In the simplest case, you create an RSXCACHEDEMITTER object and request an IID_IRSXCachedEmitter interface. Next, as with the direct listener, you must initialize the cached emitter by calling the Initialize function. After successfully initializing the cached emitter you can proceed to adjust the position and orientation of the emitter object using IRSXCachedEmitter's SetPosition and SetOrientation methods. You can set additional properties such as the intensity and audio range using the SetModel method. Use the corresponding Get methods to retrieve current values for these properties.
When you create an emitter, you can specify several flags with the dwFlags member, including the option to preprocess emitter data for better run-time performance. Note that preprocessing is useful only on non-PCM data and requires overhead at creation time. During preprocessing, RSX converts non-PCM Wave files into PCM Wave files. Preprocessing adds to the time it takes to create an emitter, but uses less processing resources while the file is playing. The following code demonstrates creating a cached emitter.
/*
// Creating a cached emitter
*/
HRESULT hr;
IUnknown* lpRSX;
RSXCACHEDEMITTERDESC rsxCE;    // emitter description
RSXEMITTERMODEL rsxModel;
IRSXCachedEmitter* lpCE;
RSXVECTOR3D p;

hr = CoCreateInstance(CLSID_RSXCACHEDEMITTER, NULL, CLSCTX_INPROC_SERVER,
                      IID_IRSXCachedEmitter, (void**)&lpCE);
if (SUCCEEDED(hr)) {
    ZeroMemory(&rsxCE, sizeof(RSXCACHEDEMITTERDESC));
    rsxCE.cbSize = sizeof(RSXCACHEDEMITTERDESC);
    rsxCE.dwFlags = 0;
    rsxCE.dwUser = 50;

    // If you want to use a wave file embedded as a resource
    // instead of a file, replace "cppmin.wav" with "cppmin.exe RESOURCE_ID",
    // i.e. "cppmin.exe 15", or try this:
    // sprintf(rsxCE.szFilename, "cppmin.exe %d", IDR_WAVE1);
    //
    // OR, if you have installed Microsoft Internet Explorer 3.0
    // and it is configured correctly, you
    // can use a URL-based emitter like:
    // http://www.junk.com/mywaves/cppmin.wav
    // or
    // ftp://ftp.junk.com/waves/cppmin.wav
    //
    strcpy(rsxCE.szFilename, "cppmin.wav");

    // Initialize the cached emitter and 'attach' it to
    // the same RSX20 object as the direct listener;
    // otherwise we won't hear anything.
    hr = lpCE->Initialize(&rsxCE, lpRSX);

    // Set the position.
    p.x = 0.0f;
    p.y = 0.0f;
    p.z = 0.0f;
    lpCE->SetPosition(&p);

    // Set the orientation.
    p.x = 0.0f;
    p.y = 0.0f;
    p.z = 1.0f;
    lpCE->SetOrientation(&p);

    // Define the limits for the ambient and attenuation regions.
    rsxModel.cbSize = sizeof(RSXEMITTERMODEL);
    rsxModel.fMinBack = 2.0f;
    rsxModel.fMinFront = 2.0f;
    rsxModel.fMaxBack = 3.0f;
    rsxModel.fMaxFront = 10.0f;
    rsxModel.fIntensity = 0.5f;
    lpCE->SetModel(&rsxModel);
} // endif
To control audio playback, use the IRSXCachedEmitter interface's ControlMedia method. You can use this method to play, pause or restart an emitter's audio at any time during a cached emitter's lifetime.
A complementary method, QueryMediaState, provides real-time state information for the specified emitter, such as play state, seconds played, remaining loops, and total play time. The following code sample shows how to start playing an emitter's audio continuously from the beginning.
lpCE->ControlMedia(RSX_PLAY, 0, 0);
Additionally, you can specify an initial start position. For the first loop, play starts from this position, but the succeeding loops start from the beginning of the file. The following code demonstrates playing an emitter for five loops, starting the first loop ten seconds from the beginning.
lpCE->ControlMedia(RSX_PLAY, 5, 10.0f);
For greater flexibility, you can specify the start position and end position for playback by using the SetMarkPosition method. The default is to play the entire file. The following demonstrates looping five times over the range from 3.5 to 6.75 seconds.
lpCE->SetMarkPosition(3.5f, 6.75f);
lpCE->ControlMedia(RSX_PLAY, 5, 0);
You can combine the SetMarkPosition with a non-zero
initial play position to initiate play from the initial play
position, play until mark end position, and resume looping from
the mark start position. A corresponding GetMarkPosition returns
the currently defined start and end positions.
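The interaction between the begin/end marks and a non-zero initial play position can be spelled out as data. The helper below is not part of RSX; it is an illustrative function that computes, under the semantics just described, which time ranges of the file would actually be heard for a given number of loops.

```cpp
#include <utility>
#include <vector>

// Illustrative only (not an RSX API): compute the time ranges played for
// a given begin mark, end mark, initial start position, and loop count.
// First loop: from the initial position (or the begin mark, whichever is
// later) to the end mark. Remaining loops: begin mark to end mark.
std::vector<std::pair<float, float>>
PlaybackSegments(float markBegin, float markEnd, float initialPos, int loops) {
    std::vector<std::pair<float, float>> segments;
    if (loops <= 0)
        return segments;
    float firstStart = (initialPos > markBegin) ? initialPos : markBegin;
    segments.push_back({firstStart, markEnd});          // first pass
    for (int i = 1; i < loops; ++i)
        segments.push_back({markBegin, markEnd});       // subsequent loops
    return segments;
}
```

For example, marks at 3.5 and 6.75 seconds with an initial position of 4.0 seconds and three loops yield one segment from 4.0 to 6.75 followed by two segments from 3.5 to 6.75.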
NOTE. Set the start
position, end position, and cache times before playing the file.
If you use the SetMarkPosition and SetCacheTime
methods while a cached emitter file is playing, RSX will return
an error.
Reverberation adds realism to an audio environment by letting you model acoustics such as those experienced in rooms, tunnels, and halls. Reverberation is the echo, or persistence, of a sound that continues after the original impulse stops.
To model a reverberation effect, you adjust parameters that control audio decay time and intensity. All you need to do to specify a reverberation model is fill out an RSXREVERBMODEL data structure and pass it to the SetReverb method. The corresponding GetReverb method provides a description of the current reverberation model.
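To build intuition for what decay time and intensity control, reverberation can be modeled conceptually as a series of echoes that fall off in amplitude. The standalone sketch below is not RSX's algorithm; it is a single feedback delay line (the simplest possible reverberator), with the delay length and decay factor chosen arbitrarily for illustration.

```cpp
#include <cstddef>
#include <vector>

// Conceptual reverberation sketch (not RSX's implementation): a single
// feedback delay line. Each pass through the loop adds a copy of the
// signal from delaySamples earlier, scaled by the decay factor, so an
// impulse produces a train of progressively quieter echoes.
std::vector<float> AddEcho(const std::vector<float>& dry,
                           std::size_t delaySamples, float decay,
                           std::size_t tailSamples) {
    std::vector<float> wet(dry.size() + tailSamples, 0.0f);
    for (std::size_t i = 0; i < dry.size(); ++i)
        wet[i] = dry[i];                               // copy the dry signal
    for (std::size_t i = delaySamples; i < wet.size(); ++i)
        wet[i] += decay * wet[i - delaySamples];       // feed back the echo
    return wet;
}
```

A larger decay factor corresponds to a longer decay time (the tail takes longer to die away), and scaling the fed-back signal corresponds to the intensity of the effect.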
The following code demonstrates setting the audio environment to model a stage. See the definition of the RSXREVERBMODEL data structure for information about additional reverberation parameters you can use to model other audio environments.
RSXREVERBMODEL rvb;

rvb.cbSize = sizeof(RSXREVERBMODEL);
lpRSX->GetReverb(&rvb);
rvb.bUseReverb = TRUE;
rvb.fDecayTime = 1.2f;
rvb.fIntensity = 0.1f;
lpRSX->SetReverb(&rvb);
. . .
The Doppler effect is the apparent change in the frequency of a sound wave which results from the relative velocity between the listener and a given sound emitter. RSX calculates the Doppler effect automatically when rendering a listener, if the speed of sound is non-zero. RSX calculates velocity based on the change of position of the emitter from frame to frame and adjusts the pitch of the emitter to simulate the Doppler effect.
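RSX applies the effect as a pitch adjustment; the underlying physics is the classical Doppler relation. The sketch below is illustrative physics, not RSX's internal code, for the case of a moving emitter and a stationary listener.

```cpp
// Classical Doppler shift for a moving emitter and a stationary listener:
//   observed = emitted * speedOfSound / (speedOfSound - approachSpeed)
// where approachSpeed > 0 means the emitter is moving toward the listener
// (pitch rises) and approachSpeed < 0 means it is receding (pitch falls).
// Illustrative only; not taken from the RSX implementation.
float DopplerFrequency(float emittedHz, float speedOfSound, float approachSpeed) {
    return emittedHz * speedOfSound / (speedOfSound - approachSpeed);
}
```

Note that when the relative velocity is zero the observed frequency equals the emitted frequency, which matches RSX's behavior of applying no Doppler adjustment to stationary emitters.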
When you create an emitter, you can disable the Doppler calculation for that emitter by specifying the RSXEMITTERDESC_NODOPPLER flag.
RSX 3D provides two controls for adjusting the amplitude of a sound signal: spatialization and distance attenuation.
RSX 3D uses the term spatialization to mean adjusting a sound based on the location of the sound emitter in the scene. Is the emitter above, below, to the right or to the left of the listener? RSX 3D takes into account an emitter's placement in a scene and automatically calculates and applies an adjustment.
RSX 3D also makes an attenuation adjustment based on the emitter's distance from the listener. Adjusting for spatialization and distance attenuation is the default. RSX 3D provides flags for you to use to control these adjustments.
RSX 3D provides three levels of rendering an emitter. The default rendering provides both spatialization and distance attenuation. The second level renders an emitter without spatialization, and the third level renders an emitter without distance attenuation. When you turn off the distance attenuation flag, you must also turn off spatialization.
In some situations, you may want to play standard audio through RSX 3D without spatialization on some emitters, using only distance attenuation. You might do this when you are simulating distant sounds and the direction of the sound does not matter. Additionally, it may be desirable to play audio through RSX 3D with no distance attenuation (and no spatialization), effectively using RSX 3D as an audio mixer.
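RSX does not document its exact attenuation curve here, but the emitter model's ambient and attenuation regions (the fMin/fMax distances and fIntensity set via SetModel) suggest the general shape: full intensity inside the minimum distance, silence beyond the maximum. The standalone sketch below shows one common scheme with that shape, linear falloff between the two distances; it is an assumption for illustration, not RSX's actual formula.

```cpp
// Illustrative distance attenuation (assumed linear falloff; not RSX's
// documented curve): full intensity inside minDist (the ambient region),
// silence beyond maxDist, and a linear ramp in between.
float AttenuatedGain(float intensity, float dist, float minDist, float maxDist) {
    if (dist <= minDist)
        return intensity;                   // inside the ambient region
    if (dist >= maxDist)
        return 0.0f;                        // beyond audible range
    return intensity * (maxDist - dist) / (maxDist - minDist);
}
```

With the NOATTENUATE rendering level described above, a gain function like this is effectively bypassed and every emitter plays at its full intensity regardless of distance.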
When you create an emitter, you can indicate that you do not want spatialization by specifying the RSXEMITTERDESC_NOSPATIALIZE flag in dwFlags of the RSXCACHEDEMITTERDESC or RSXSTREAMINGEMITTERDESC data structure. In this case, RSX applies only distance attenuation when it renders the emitter.
If you are using multiple-channel audio for the emitter, RSX 3D preserves these channels if they match the channel format of the listener. The following code demonstrates the creation of an emitter without spatialization:
memset(&ceDesc, 0, sizeof(RSXCACHEDEMITTERDESC));
ceDesc.cbSize = sizeof(RSXCACHEDEMITTERDESC);
ceDesc.dwFlags = RSXEMITTERDESC_NOSPATIALIZE;
strcpy(ceDesc.szFilename, "tpot.wav");
ceDesc.dwUser = (DWORD)pGraphicalObject;
. . .
You can disable distance attenuation by specifying the RSXEMITTERDESC_NOATTENUATE flag in dwFlags of the RSXCACHEDEMITTERDESC or RSXSTREAMINGEMITTERDESC data structure. This flag is valid only if you also specify the corresponding RSXEMITTERDESC_NOSPATIALIZE flag.
When you turn off distance attenuation and spatialization, you can use RSX as a high-performance audio mixer. The following code demonstrates the creation of an emitter in this mode:
memset(&ceDesc, 0, sizeof(RSXCACHEDEMITTERDESC));
ceDesc.cbSize = sizeof(RSXCACHEDEMITTERDESC);
ceDesc.dwFlags = RSXEMITTERDESC_NOSPATIALIZE | RSXEMITTERDESC_NOATTENUATE;
strcpy(ceDesc.szFilename, "tpot.wav");
ceDesc.dwUser = (DWORD)pGraphicalObject;
. . .
When objects are no longer needed, you must properly destroy them to free their resources. You accomplish this by releasing the object, which allows the system to decrement the object's reference count. RSX 3D destroys objects only when their reference count reaches zero. For more information about reference counting, see the Win32 OLE Programmer's Guide and Reference.
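The reference-counting rule can be seen in miniature in the standalone sketch below. This is not OLE's implementation, just the standard COM lifetime pattern: creation yields one reference, AddRef adds references, and Release decrements the count, destroying the object only when it reaches zero.

```cpp
// Minimal sketch of COM-style reference counting (illustrative; not the
// OLE implementation). The private destructor forces clients to go
// through Release() rather than deleting the object directly.
class RefCounted {
public:
    RefCounted() : m_refs(1) {}                // creation yields one reference
    unsigned long AddRef() { return ++m_refs; }
    unsigned long Release() {
        unsigned long remaining = --m_refs;
        if (remaining == 0)
            delete this;                       // destroyed only at zero
        return remaining;
    }
private:
    ~RefCounted() {}                           // clients cannot delete directly
    unsigned long m_refs;
};
```

This is why calling Release on an RSX object does not necessarily destroy it immediately: other holders of the same interface pointer may still have outstanding references.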
To free an emitter object, call Release.
lpCE->Release();
Use the same technique to free the listener and RSX 3D
objects.
Copyright ©1996, 1997 Intel Corporation. All rights reserved