Takes a snapshot of the current sound wave and places it into the specified ByteArray object. The values are formatted as normalized floating-point values, in the range -1.0 to 1.0. The ByteArray object passed to the outputArray parameter is overwritten with the new values. The size of the ByteArray object created is fixed to 512 floating-point values, where the first 256 values represent the left channel, and the second 256 values represent the right channel.

Note: This method is subject to local file security restrictions and restrictions on cross-domain loading. If you are working with local files or sounds loaded from a server in a different domain than the calling content, you might need to address sandbox restrictions through a cross-domain policy file. For more information, see the Sound class description.

public static function computeSpectrum(outputArray: ByteArray, FFTMode: Boolean = false, stretchFactor: int = 0): void

Toggles the speakerphone when the device is in voice mode. By default, smartphones use the phone earpiece for audio output when audioPlaybackMode is set to AudioPlaybackMode.VOICE. The useSpeakerphoneForVoice property lets you override the default output so that you can implement a speakerphone button in a phone application. It has no effect in modes other than AudioPlaybackMode.VOICE.

Note: On iOS, if your application has set audioPlaybackMode=VOICE and another application is also playing in voice mode, you cannot set useSpeakerphoneForVoice=true.

Note: On Android, you must set the MODIFY_AUDIO_SETTINGS permission in the AIR application descriptor, or changing this value has no effect. Other applications running on the device can change the underlying device setting at any time.

Implementation
public static function get useSpeakerphoneForVoice(): Boolean
public static function set useSpeakerphoneForVoice(value: Boolean): void

The number of seconds to preload an embedded streaming sound into a buffer before it starts to stream.

The SoundMixer.bufferTime property only affects the buffer time for embedded streaming sounds in a SWF and is independent of dynamically created Sound objects (that is, Sound objects created in ActionScript). The value of SoundMixer.bufferTime cannot override or set the default of the buffer time specified in the SoundLoaderContext object that is passed to the Sound.load() method.

The data in a loaded sound, including its buffer time, cannot be accessed by code in a file that is in a different domain unless you implement a cross-domain policy file. For more information about security and sound, see the Sound class description. In an AIR application, code can access data in sound files from any source.

Implementation
public static function get bufferTime(): int
public static function set bufferTime(value: int): void
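As an illustrative sketch (not part of the original reference), the bufferTime and computeSpectrum() behaviors described above might be combined as follows. The class name, sound URL, and the ENTER_FRAME wiring are assumptions; note that SoundLoaderContext.bufferTime is specified in milliseconds, while SoundMixer.bufferTime is in seconds.

```actionscript
package {
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.media.Sound;
    import flash.media.SoundLoaderContext;
    import flash.media.SoundMixer;
    import flash.net.URLRequest;
    import flash.utils.ByteArray;

    public class SpectrumDemo extends Sprite {
        public function SpectrumDemo() {
            // Buffer 5 seconds (5000 ms) before playback starts; this per-load
            // value takes precedence over SoundMixer.bufferTime.
            var context:SoundLoaderContext = new SoundLoaderContext(5000);
            var sound:Sound = new Sound();
            sound.load(new URLRequest("track.mp3"), context); // hypothetical URL
            sound.play();
            addEventListener(Event.ENTER_FRAME, onEnterFrame);
        }

        private function onEnterFrame(e:Event):void {
            var bytes:ByteArray = new ByteArray();
            // FFTMode = true returns a frequency spectrum instead of the raw wave.
            SoundMixer.computeSpectrum(bytes, true, 0);
            // 512 floats total: 256 left-channel values, then 256 right-channel.
            var peakLeft:Number = 0;
            for (var i:int = 0; i < 256; i++) {
                peakLeft = Math.max(peakLeft, bytes.readFloat());
            }
            trace("left-channel peak:", peakLeft);
        }
    }
}
```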
Specifies the audio playback mode of all Sound objects. This property sets sound priorities and defaults according to platform idioms. In desktop and TV environments, no functional difference exists between audio playback modes. Valid values for this property are defined in the AudioPlaybackMode class.

Note: On iOS, if one application sets audioPlaybackMode=AudioPlaybackMode.VOICE, other applications cannot change it to AudioPlaybackMode.MEDIA. Make minimal use of AudioPlaybackMode.VOICE mode, and switch to AudioPlaybackMode.MEDIA mode as soon as you can after the voice call ends to allow other applications to play in media mode. When you change the audio playback mode on iOS, native apps playing music pause briefly.

The default value is AudioPlaybackMode.MEDIA.

Implementation
public static function get audioPlaybackMode(): String
public static function set audioPlaybackMode(value: String): void

Throws
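A sketch of how audioPlaybackMode and useSpeakerphoneForVoice might work together in a phone-style application; the call-flow handler names are hypothetical, and this is not an implementation from the original reference.

```actionscript
import flash.media.AudioPlaybackMode;
import flash.media.SoundMixer;

// Enter voice mode for the duration of a call.
function onCallStarted():void {
    SoundMixer.audioPlaybackMode = AudioPlaybackMode.VOICE;
    SoundMixer.useSpeakerphoneForVoice = false; // route audio to the earpiece
}

// Speakerphone button handler: only meaningful while in VOICE mode.
function onSpeakerButtonPressed():void {
    SoundMixer.useSpeakerphoneForVoice = !SoundMixer.useSpeakerphoneForVoice;
}

// Return to MEDIA mode promptly after the call so other
// applications can play in media mode again.
function onCallEnded():void {
    SoundMixer.audioPlaybackMode = AudioPlaybackMode.MEDIA;
}
```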