
Interfaces

The RSX 3D interfaces and abstract base classes contain sets of related methods that provide the tools for rendering 3D audio. These methods let you modify data structures, set parameters and flags, and retrieve audio information.

RSX 3D contains six interfaces and two abstract base classes. Applications cannot use the abstract base class methods directly; emitters and listeners expose the abstract base class functionality through their respective derived interfaces (a brief sketch follows the table below).

RSX 3D COM Interface/Abstract Base Class Descriptions

IRSX2
Primary interface for managing a 3D audio environment. (See IID_IRSX2 in rsx.h)
 
IRSX
Older but still supported interface for managing a 3D audio environment. (See IID_IRSX20 in rsx.h)
 
IRSXEmitter
Abstract base class that provides common services for both cached and streaming emitters. The IRSXEmitter methods control the media behavior, position, and orientation of an emitter in the audio environment.
 
IRSXCachedEmitter
High-level emitter interface using either file, memory, or application resources for its audio source.
 
IRSXStreamingEmitter
Low-level emitter interface that provides methods for handling real-time audio buffers.
 
IRSXListener
Abstract base class that provides common services for both a direct and a streaming listener. The IRSXListener methods control the position and orientation of a listener in the audio environment.
 
IRSXDirectListener
High-level listener interface that provides methods for connecting and disconnecting a direct listener from an output device.
 
IRSXStreamingListener
Low-level listener interface that provides methods to request the synchronous generation of audio output for streaming listeners.
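
As an illustration of this relationship, the brief sketch below positions a cached emitter by calling an inherited abstract base class method through its IRSXCachedEmitter pointer. The SetPosition method name and the RSXVECTOR3D type are assumptions here; verify both against rsx.h and the IRSXEmitter reference before use.

// Hedged sketch: SetPosition and RSXVECTOR3D are assumed names --
// check rsx.h and the IRSXEmitter reference for the actual definitions.
void PlaceEmitter(IRSXCachedEmitter* pCE)
{
    RSXVECTOR3D pos;        // assumed 3D vector type
    pos.x = 0.0f;
    pos.y = 0.0f;
    pos.z = 5.0f;           // five units in front of the origin

    // IRSXCachedEmitter inherits the IRSXEmitter methods, so the
    // abstract base class functionality is reached through the same
    // interface pointer the application already holds.
    pCE->SetPosition(&pos);
}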

Successful operation of a method is indicated by the standard S_OK return code. Generic COM error codes are prefixed with E_, while RSX 3D-defined error codes are prefixed with RSXERR_. For a detailed description of each error code, refer to Section 9.2.
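
For example, a minimal check might distinguish generic COM failures from RSX 3D-defined ones; the sketch below uses only the CreateCachedEmitter call and return codes documented later in this section.

// Minimal sketch of return-code handling. SUCCEEDED is the standard
// COM macro; RSXERR_FILENOTFOUND and E_OUTOFMEMORY are listed under
// CreateCachedEmitter below.
HRESULT hr = m_lpRSX->CreateCachedEmitter(&rsxCE, &m_lpCE, NULL);
if (SUCCEEDED(hr)) {
    // S_OK: the emitter was created successfully.
} else if (hr == RSXERR_FILENOTFOUND) {
    // RSX 3D-defined error: the sound file could not be located.
} else if (hr == E_OUTOFMEMORY) {
    // Generic COM error.
}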


IRSX2

This is the primary interface for managing a 3D audio environment. The methods for the IRSX2 interface are:

IRSX2 Interface Methods and Meaning

GetEnvironment
Retrieves the characteristics of the audio environment.
 
GetReverb
Retrieves the current reverberation characteristics of the audio environment.
 
SetEnvironment
Sets the characteristics of the audio environment.
 
SetReverb
Sets the reverberation characteristics of the audio environment.

IRSX

This is the older but still supported interface for managing a 3D audio environment. It is recommended that you use the IRSX2 interface instead to modify the audio environment. Instead of using the Create...() functions provided in the IRSX interface, use the COM function CoCreateInstance() to construct listeners and emitters (a sketch of this pattern follows the method table below; for more information on creating emitters and listeners, see the individual emitter and listener interface descriptions). The methods for the IRSX interface are:

IRSX Interface Methods and Meaning

CreateCachedEmitter
Adds a file-based sound emitter to the environment and returns an IRSXCachedEmitter interface.
 
CreateDirectListener
Adds a listener to the environment and returns an IRSXDirectListener interface.
 
CreateStreamingEmitter
Adds a streaming sound emitter to the environment and returns an IRSXStreamingEmitter interface.
 
CreateStreamingListener
Adds a streaming listener to the environment and returns an IRSXStreamingListener interface.
 
GetEnvironment
Retrieves the characteristics of the audio environment.
 
GetReverb
Retrieves the current reverberation characteristics of the audio environment.
 
SetEnvironment
Sets the characteristics of the audio environment.
 
SetReverb
Sets the reverberation characteristics of the audio environment.
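
As recommended under IRSX above, listeners and emitters (and the RSX object itself) can be constructed with CoCreateInstance. The sketch below is a minimal example of that pattern; CLSID_RSX20 is an assumed class identifier, so check rsx.h for the identifiers your version of RSX 3D actually defines.

// Hedged sketch of the recommended creation pattern. CLSID_RSX20 is an
// assumed name -- consult rsx.h for the real class identifiers.
#include <objbase.h>
#include <rsx.h>

IRSX2* lpRSX2 = NULL;

CoInitialize(NULL);
HRESULT hr = CoCreateInstance(CLSID_RSX20,           // assumed CLSID
                              NULL,
                              CLSCTX_INPROC_SERVER,
                              IID_IRSX2,
                              (void**)&lpRSX2);
// Emitters and listeners are constructed the same way, using their own
// class and interface identifiers from rsx.h.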

CreateCachedEmitter

Registers a new file-based sound in the 3D audio environment.

HRESULT	CreateCachedEmitter(lpCachedEmitterAttr, lplpCachedEmitterInterface, 
                            reserved) 

Parameters

LPRSXCACHEDEMITTERDESC lpCachedEmitterAttr
Pointer to an RSXCACHEDEMITTERDESC data structure that specifies the filename of the sound the emitter will play.
 
LPRSXCACHEDEMITTER FAR* lplpCachedEmitterInterface
Address that RSX fills with a pointer to an IRSXCachedEmitter interface if the call succeeds.
 
IUnknown FAR * reserved
Reserved and must be NULL.

Returns

S_OK
E_INVALIDARG
E_OUTOFMEMORY
E_FAIL
RSXERR_FILENOTFOUND
RSXERR_FILESHARINGVIOLATION
RSXERR_CORRUPTFILE

Description

Emitters define the objects in an audio environment. RSX does not limit the number of emitters you can create; only the computer's processing power and memory resources limit the number of emitters an application can handle. An audio application can have two types of emitters: cached or streaming.

A cached emitter adds a new file-based sound to the 3D audio environment and returns an IRSXCachedEmitter interface. Each cached emitter is a single file-based sound source: a .WAV or .MID file, or a wave resource embedded in an .EXE or .DLL. Use the IRSXEmitter abstract base class methods to control the media behavior, position, orientation, and audio properties of an emitter, and the IRSXCachedEmitter methods to handle file, memory, or application resources for an emitter.

NOTE: RSX can play only one .MID file at a time.

A streaming emitter lets your application supply real-time audio data to the 3D audio environment. For more information about streaming emitters, refer to the IRSXStreamingEmitter interface.

Usage Sample

/*
// Create a Cached Emitter
*/
    RSXCACHEDEMITTERDESC rsxCE;

    // Fill in the emitter description
    ZeroMemory(&rsxCE, sizeof(RSXCACHEDEMITTERDESC));
    rsxCE.cbSize = sizeof(RSXCACHEDEMITTERDESC);
    rsxCE.dwFlags = 0;

    // To use a wave file embedded as a resource instead of a file,
    // replace "cppmin.wav" with "cppmin.exe RESOURCE_ID",
    // e.g. "cppmin.exe 15", or:
    // sprintf(rsxCE.szFilename, "cppmin.exe %d", IDR_WAVE1);
    strcpy(rsxCE.szFilename, "cppmin.wav");
    rsxCE.dwUser = 50;

    // Create the cached emitter
    if( SUCCEEDED(m_lpRSX->CreateCachedEmitter(&rsxCE, &m_lpCE, NULL)) &&
        (m_lpCE) )
. . .

CreateDirectListener

Registers a new listener in the 3D audio environment, and returns an interface pointer for modifying the listener's state.

HRESULT CreateDirectListener(lpDirectListenerAttr, lplpDirectListenerInterface, reserved)

Parameters

LPRSXDIRECTLISTENERDESC lpDirectListenerAttr
Pointer to an RSXDIRECTLISTENERDESC that provides reference information about the direct listener object.
 
LPRSXDIRECTLISTENER FAR * lplpDirectListenerInterface
Address that RSX fills with a pointer to an IRSXDirectListener interface if the call succeeds.
 
IUnknown FAR * reserved
Reserved and must be NULL.

Returns

S_OK
E_FAIL
E_INVALIDARG
E_OUTOFMEMORY
RSXERR_BADFORMAT
RSXERR_ALLOCATED
RSXERR_NODRIVER

Description

A listener defines the audio view of the scene. An audio application can have only one listener: direct or streaming.

A direct listener allows RSX 3D to output to a standard audio interface. If DirectSound II is installed and you provide a handle to the application's main window, RSX 3D uses the DirectSound II (or later) interface; if DirectSound II is not installed, RSX 3D uses the Wave API. Use the IRSXListener abstract base class methods to set the position and orientation of a listener, and the IRSXDirectListener methods to connect or disconnect a direct listener from an output device.

A streaming listener lets your application receive processed buffers from RSX 3D. For more information about the streaming listener, refer to the IRSXStreamingListener interface.

Usage Sample

/*
// Create a Direct Listener
*/
    RSXDIRECTLISTENERDESC rsxDL;

    // Fill in the listener description
    ZeroMemory(&rsxDL, sizeof(RSXDIRECTLISTENERDESC));
    rsxDL.cbSize = sizeof(RSXDIRECTLISTENERDESC);
    rsxDL.hMainWnd = AfxGetMainWnd()->GetSafeHwnd();
    rsxDL.dwUser = 0;
    rsxDL.lpwf = NULL;

    // Create the direct listener
    if( SUCCEEDED(m_lpRSX->CreateDirectListener(&rsxDL, &m_lpDL, NULL)) &&
        (m_lpDL) ) {
. . .

CreateStreamingEmitter

Registers a new streaming sound in the 3D audio environment, and returns an interface pointer for controlling the emitter.

HRESULT CreateStreamingEmitter(lpStreamingEmitterAttr, lplpStreamingEmitterInterface, 
                               reserved); 

Parameters

LPRSXSTREAMINGEMITTERDESC lpStreamingEmitterAttr
Pointer to an RSXSTREAMINGEMITTERDESC that specifies the format of the buffers to be streamed by the emitter.
 
LPRSXSTREAMINGEMITTER FAR * lplpStreamingEmitterInterface
Address that RSX 3D fills with a pointer to an IRSXStreamingEmitter interface if the call succeeds.
 
IUnknown FAR * reserved
Reserved and must be NULL.

Returns

S_OK
E_FAIL
E_INVALIDARG
E_OUTOFMEMORY
RSXERR_BADFORMAT

Description

Emitters define the objects in an audio environment. RSX 3D does not limit the number of emitters you can create; only the computer's processing power and memory resources limit the number of emitters an application can handle. An audio application can have two types of emitters: cached or streaming.

A cached emitter adds a new file-based sound to the 3D audio environment. For more information about cached emitters, refer to the IRSXCachedEmitter interface.

A streaming emitter lets your application supply real-time audio data to the 3D audio environment. Creating a streaming emitter requires specifying a buffer format and approximate size. RSX 3D uses the PCM format internally; when buffers are submitted in a non-PCM format, RSX 3D converts them in real time using the appropriate ACM driver.

If the ACM driver's performance is not adequate, the application should decompress the data to PCM before submitting each buffer. RSX 3D signals the application when it has finished using a buffer of emitter data; the application can then refill and resubmit the same buffer, as the sketch following the usage sample below illustrates. An application can have any number of buffers submitted to RSX 3D at one time.

Use the IRSXEmitter abstract base class methods to control the media behavior, position, orientation, and audio properties of an emitter and the IRSXStreamingEmitter methods to specify the buffer format and approximate size.

Usage Sample

/*
// CreateStreamingEmitter Sample
*/

RSXSTREAMINGEMITTERDESC seDesc;
memset(&seDesc, 0, sizeof(RSXSTREAMINGEMITTERDESC));
seDesc.cbSize = sizeof(RSXSTREAMINGEMITTERDESC);

seDesc.lpwf = new WAVEFORMATEX;
if(!seDesc.lpwf) {
    return 0;
}

// fill in the waveformat with values
// for 22kHz, 16-bit, mono, PCM format
seDesc.lpwf->wFormatTag = WAVE_FORMAT_PCM;
seDesc.lpwf->nChannels = 1;
seDesc.lpwf->nSamplesPerSec = 22050;
seDesc.lpwf->nAvgBytesPerSec = 44100;
seDesc.lpwf->nBlockAlign = 2;
seDesc.lpwf->wBitsPerSample = 16;
seDesc.lpwf->cbSize = 0;

// create a streaming emitter 
HRESULT hr = m_lpRSX->CreateStreamingEmitter(&seDesc, &m_lpSE, NULL);
if(SUCCEEDED(hr)) {
. . .
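
The creation sample above does not show the buffer cycle described earlier. The sketch below illustrates that cycle under stated assumptions: the SubmitBuffer method, the RSXBUFFERHDR structure, and its fields are hypothetical names used for illustration only; take the actual method and structure definitions from the IRSXStreamingEmitter reference.

// Hedged sketch of the submit-and-reuse cycle for a streaming emitter.
// SubmitBuffer, RSXBUFFERHDR, and its fields are hypothetical names;
// see the IRSXStreamingEmitter reference for the real definitions.
BYTE   pcmData[4410];                     // ~100 ms of 22 kHz, 16-bit mono PCM
HANDLE hDone = CreateEvent(NULL, FALSE, FALSE, NULL);

RSXBUFFERHDR bufHdr;                      // hypothetical buffer header
ZeroMemory(&bufHdr, sizeof(bufHdr));
bufHdr.cbSize       = sizeof(bufHdr);
bufHdr.dwSize       = sizeof(pcmData);
bufHdr.lpData       = pcmData;
bufHdr.hEventSignal = hDone;              // signaled when RSX 3D is finished

for (;;) {
    FillNextPcmBlock(pcmData, sizeof(pcmData)); // application-defined source
    m_lpSE->SubmitBuffer(&bufHdr);              // hand the buffer to RSX 3D
    WaitForSingleObject(hDone, INFINITE);       // wait until RSX 3D releases it
}                                               // ...then reuse the same buffer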

CreateStreamingListener

Registers a new streaming listener in the 3D audio environment, and returns an interface pointer for modifying the listener's state.

HRESULT CreateStreamingListener(lpStreamingListenerAttr, 
                                lplpStreamingListenerInterface, reserved)

Parameters

LPRSXSTREAMINGLISTENERDESC lpStreamingListenerAttr
Pointer to an RSXSTREAMINGLISTENERDESC that provides reference information about the streaming listener object.
 
LPRSXSTREAMINGLISTENER FAR * lplpStreamingListenerInterface
Address that RSX fills with a pointer to an IRSXStreamingListener interface if the call succeeds.
 
IUnknown FAR * reserved
Reserved and must be NULL.

Returns

S_OK
E_FAIL
E_INVALIDARG
E_OUTOFMEMORY
RSXERR_BADFORMAT

Description

A listener defines the audio view of the scene. An audio application can have only one listener: direct or streaming.

A direct listener allows RSX 3D to output to a standard audio interface. For more information about the direct listener, refer to the IRSXDirectListener interface.

A streaming listener lets your application receive processed buffers from RSX. Creating a streaming listener requires specifying the buffer format and approximate size. The buffer format for streaming listeners must be PCM. Use the IRSXListener abstract base class methods to set the position and orientation of a listener and the IRSXStreamingListener methods to request the synchronous generation of audio output.

The streaming listener interface is synchronous: buffers are returned on demand. This non-blocking interface simplifies the connection to real-time devices and gives you explicit control over processor usage. In this model, the RSX 3D library acts as a transform filter within any real-time streaming environment; stream pacing is intended to be provided by the device to which the buffers are written (a sketch of this pull loop follows the usage sample below).

Usage Sample

/* 
// Create Streaming Listener
//
// Fill WAVEFORMATEX structure to specify output format
// Fill RSXSTREAMINGLISTENERDESC to specify listener creation settings
*/
m_wfx.wFormatTag = WAVE_FORMAT_PCM;
m_wfx.nChannels = 2;
m_wfx.nSamplesPerSec = 22050;
m_wfx.nAvgBytesPerSec = 88200;
m_wfx.nBlockAlign = 4;
m_wfx.wBitsPerSample = 16;
m_wfx.cbSize = 0;

DWORD dwListenerBufferSizeInMS = 200;   //Lets try 200ms buffers in 
                                        //this example
RSXSTREAMINGLISTENERDESC slDesc;	

slDesc.cbSize = sizeof(RSXSTREAMINGLISTENERDESC);
slDesc.dwRequestedBufferSize = 
                dwListenerBufferSizeInMS * m_wfx.nAvgBytesPerSec / 1000;	
slDesc.lpwf =   &m_wfx;
slDesc.dwUser = 0;

if(SUCCEEDED(m_lpRSX->CreateStreamingListener(&slDesc,&m_lpSL, NULL))) {
. . .
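
Following on from the pull model described above, an application typically drives RSX 3D from the output device's own timing. The sketch below is illustrative only: RequestBuffer is a hypothetical method name, and DeviceWantsMoreAudio and WriteToDevice stand in for application-defined device I/O; consult the IRSXStreamingListener reference for the actual method and signature.

// Hedged sketch of a synchronous pull loop paced by the output device.
// RequestBuffer is a hypothetical name -- see the IRSXStreamingListener
// reference for the real method; DeviceWantsMoreAudio and WriteToDevice
// are application-defined placeholders.
BYTE* pOut = new BYTE[slDesc.dwRequestedBufferSize];

while (DeviceWantsMoreAudio()) {
    // Ask RSX 3D to mix the next block of localized audio on demand.
    m_lpSL->RequestBuffer(pOut);    // hypothetical call
    WriteToDevice(pOut, slDesc.dwRequestedBufferSize);
}

delete [] pOut;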

GetEnvironment

Retrieves the current characteristics of the audio environment.

HRESULT GetEnvironment(lpEnvAttr)

Parameters

LPRSXENVIRONMENT lpEnvAttr
Pointer to an RSXENVIRONMENT data structure to be filled with the current environment settings of the audio environment.

Returns

S_OK
E_INVALIDARG

Description

The environment includes the coordinate system type (either right- or left-handed), the speed of sound, and the processing budget for audio localization. RSX 3D fills the data structure with all the environment values.

Usage Sample

/*
// Environment
*/
	RSXENVIRONMENT rsxEnv;
	
/*
// Check which coordinate system is selected.
*/
	rsxEnv.cbSize = sizeof(RSXENVIRONMENT);
	lpRSX->GetEnvironment(&rsxEnv);
	if (rsxEnv.bUseRightHand == TRUE){
	    //Right-hand coordinate system is being used
	} else {
	    //Left-hand coordinate system is being used
	} 

See Also

SetEnvironment


GetReverb

Retrieves the current reverberation characteristics of the audio environment.

HRESULT GetReverb(lpReverbModel)

Parameters

LPRSXREVERBMODEL lpReverbModel
Pointer to an RSXREVERBMODEL data structure to be filled with the current reverberation settings for the audio environment.

Returns

S_OK
E_INVALIDARG

Description

GetReverb fills the RSXREVERBMODEL data structure pointed to by lpReverbModel with the current reverberation model settings: decay time, intensity, and state (on or off). These settings let you model acoustics such as those experienced in rooms, tunnels, and halls.

Usage Sample

/*
// Get Reverberation characteristics
*/
    RSXREVERBMODEL rsxRvb;
    rsxRvb.cbSize = sizeof(RSXREVERBMODEL);
    m_lpRSX->GetReverb(&rsxRvb); 
. . .

See Also

SetReverb

SetEnvironment

Sets the characteristics of the audio environment.

HRESULT SetEnvironment(lpEnvAttr);

Parameters

LPRSXENVIRONMENT lpEnvAttr
Pointer to an RSXENVIRONMENT data structure that specifies the environment settings of the audio environment.

Returns

S_OK
E_INVALIDARG

Description

The environment includes parameters to specify the coordinate system type (either right- or left-handed). Default environment settings will be used if the lpEnvAttr pointer is NULL.

Usage Sample

/*
// Environment
*/
	RSXENVIRONMENT rsxEnv;
	
/*
// Set right-handed coordinate system.
*/
	rsxEnv.cbSize = sizeof(RSXENVIRONMENT);
	rsxEnv.dwFlags = RSXENVIRONMENT_COORDINATESYSTEM;
	rsxEnv.bUseRightHand = TRUE;
	lpRSX->SetEnvironment(&rsxEnv);
. . .
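
As noted in the description, the default environment settings are used when the lpEnvAttr pointer is NULL:

// Revert to the default audio environment settings
	lpRSX->SetEnvironment(NULL);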

See Also

GetEnvironment

SetReverb

Specifies the reverberation characteristics for the audio environment.

HRESULT SetReverb(lpReverbModel);

Parameters

LPRSXREVERBMODEL lpReverbModel
Pointer to an RSXREVERBMODEL data structure that specifies the reverberation settings for the audio environment.

Returns

S_OK
E_INVALIDARG

Description

Reverberation characteristics include the reverberation state (on/off), as well as the decay time and reverberation intensity.

Usage Sample

// Turn reverberation on
void CMainWindow::OnReverbOn() 
{
    RSXREVERBMODEL rsxRvb;
    rsxRvb.cbSize = sizeof(RSXREVERBMODEL);
    rsxRvb.bUseReverb = TRUE;
    rsxRvb.fDecayTime = 1.0f;
    rsxRvb.fIntensity = 0.2f;
    m_lpRSX->SetReverb(&rsxRvb);
}

See Also

GetReverb


Copyright ©1996, 1997 Intel Corporation. All rights reserved