<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:description xml:lang="eng">To test the LISHPh MOBRIR rendering method, open the MOBRIR_Rendering.pd file in Pure Data (http://puredata.info).
Make sure to obtain the zexy and iemlib externals, install them, and specify their paths
(e.g. by selecting &quot;help&quot; and &quot;find externals from the internet&quot; in Pure Data).

In the MOBRIR_Rendering patch you can listen to a dynamic (variable-orientation) headphone rendering of the center
loudspeaker of the IEM Production Studio, based on the 7 recorded/pre-rendered dummy-head orientations
-45°, -30°, -15°, 0°, 15°, 30°, 45°. In the absence of a head tracker, you can change the orientation with a slider
or with the auto-rotate toggle switch.
The audio signals are pre-rendered, and you can choose between pink noise and a music example.

Setting the crossover to phase switching to fc = 2000 Hz (the default) gives you a properly adjusted LISHPh method.
Setting it to fc = 20000 Hz essentially gives you broadband linear interpolation, which is clearly audible
with the pink-noise sample. The music example (&quot;musicvox&quot;: What&#39;s trumps by Rhythmus-Sportgruppe,
see https://zenodo.org/communities/dega-audiodatabase-for-virtual-acoustics/?page=1&amp;size=20) can be selected by clicking its message box.
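
As a rough illustrative sketch (not the patch's implementation, and using random stand-in data rather than real BRIRs), the difference between broadband linear interpolation and a LISHPh-style crossover could look as follows; the crossover behavior (complex-valued interpolation below fc, magnitude interpolation with the phase switched to the nearer measured orientation above fc) is an assumption for illustration:

```python
# Sketch only: contrasts broadband linear interpolation of two impulse
# responses (the fc = 20000 Hz setting) with an assumed LISHPh-style
# crossover at fc = 2000 Hz. All data and parameters are placeholders.
import numpy as np

fs = 44100                       # sample rate in Hz (assumed)
n = 512                          # impulse-response length (assumed)
rng = np.random.default_rng(0)
h_a = rng.standard_normal(n)     # stand-in for the BRIR at -15°
h_b = rng.standard_normal(n)     # stand-in for the BRIR at 0°
w = 0.3                          # interpolation weight toward h_b

# Broadband linear interpolation (what you hear with fc = 20000 Hz):
h_lin = (1 - w) * h_a + w * h_b

# Assumed LISHPh-style crossover: below fc, interpolate the complex
# spectra; above fc, interpolate magnitudes and switch the phase to the
# nearer of the two measured orientations.
fc = 2000.0
H_a, H_b = np.fft.rfft(h_a), np.fft.rfft(h_b)
freqs = np.fft.rfftfreq(n, d=1 / fs)
low = freqs < fc
H = np.empty_like(H_a)
H[low] = (1 - w) * H_a[low] + w * H_b[low]
mag = (1 - w) * np.abs(H_a[~low]) + w * np.abs(H_b[~low])
phase_src = H_b if w >= 0.5 else H_a    # phase of the nearest orientation
H[~low] = mag * np.exp(1j * np.angle(phase_src[~low]))
h_lishph = np.fft.irfft(H, n=n)
print(h_lin.shape, h_lishph.shape)
```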
</dc:description>
  <dc:type xml:lang="eng">annotation</dc:type>
  <dc:identifier>https://phaidra.kug.ac.at/o:97087</dc:identifier>
  <dc:title xml:lang="eng">MOBRIR Rendering in puredata</dc:title>
  <dc:rights>Public Domain Mark</dc:rights>
  <dc:rights>http://creativecommons.org/publicdomain/mark/1.0/</dc:rights>
  <dc:language>eng</dc:language>
  <dc:creator>Zaunschirm, Markus (IEM)</dc:creator>
  <dc:format>application/x-zip-compressed</dc:format>
</oai_dc:dc>