I’ve tried to use SampleDataEvent several times over the years, but never got it to work with anything other than Flash. I just tried with the newest version, 5.1.3, but there is still no sound on Windows or HTML5, only in Flash.
I’m not able to try Neko; I always get:
Exception : Neko_error(The bootable executable file was not found : neko.exe)
and Neko’s folder is in the PATH.
I suspect you’re right about SampleDataEvent, but I’m more worried about your Neko woes. Have you tried a fresh installation to resolve them? I find Neko invaluable as a very fast testing environment for many projects, so it would be nice if yours worked out of the box.
That’s true, it’s implemented for native, but I forgot that it’s not implemented for HTML5 at the moment (it would probably need some custom Web Audio code).
Hi Wiering, how would loadPCMFromByteArray replace SampleDataEvent?
For example, in this:
package;

import openfl.display.Sprite;
import openfl.media.Sound;
import openfl.events.SampleDataEvent;

class Main extends Sprite
{
    var SampleRate = 44100;
    var ToneHertz = 256;
    var ToneVolume = 16000;
    var RunningSampleIndex = 0;
    var SquareWavePeriod = 0;
    var HalfSquareWavePeriod = 0;

    public function new()
    {
        super();
        SquareWavePeriod = Std.int(SampleRate / ToneHertz);
        HalfSquareWavePeriod = Std.int(SquareWavePeriod / 2);
        var sound:Sound = new Sound();
        sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
        sound.play();
    }

    private function onSampleData(event:SampleDataEvent):Void
    {
        for (i in 0...8192)
        {
            var SampleValue = (Std.int(RunningSampleIndex++ / HalfSquareWavePeriod) % 2 == 0) ? ToneVolume : -ToneVolume;
            event.data.writeFloat(SampleValue);
            event.data.writeFloat(SampleValue);
        }
    }
}
Sorry, I wasn’t able to get it to work. I eventually decided to create sound samples in all the needed pitches in Audacity and include them all in the game; not really a nice solution, but it works.
Hey, I just want to shed some light on this.
You can indeed substitute SampleDataEvent with loadPCMFromByteArray, even though the two aren’t exactly equivalent.
Here’s an example using SampleDataEvent, which generates a one-second-long square wave:
package;

import openfl.display.Sprite;
import openfl.media.Sound;
import openfl.events.SampleDataEvent;

class Main extends Sprite
{
    private var SampleRate = 44100;
    private var ToneHertz = 256;
    private var ToneVolume = 16000;
    private var RunningSampleIndex = 0;
    private var SquareWavePeriod = 0;
    private var HalfSquareWavePeriod = 0;
    private var sound:Sound = new Sound();

    public function new()
    {
        super();
        SquareWavePeriod = Std.int(SampleRate / ToneHertz);
        HalfSquareWavePeriod = Std.int(SquareWavePeriod / 2);
        sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
        sound.play();
    }

    private function onSampleData(event:SampleDataEvent):Void
    {
        for (i in 0...8192)
        {
            var SampleValue = (Std.int(RunningSampleIndex++ / HalfSquareWavePeriod) % 2 == 0) ? ToneVolume : -ToneVolume;
            event.data.writeFloat(SampleValue);
            event.data.writeFloat(SampleValue);
            if (RunningSampleIndex == 44100)
            {
                sound.removeEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
                break;
            }
        }
    }
}
If you take a look at the function onSampleData, you’ll find a for loop which counts up to 8192. This is our buffer of samples: at a playback rate of 44100 Hz, 8192 samples amount to roughly 186 ms of audio (8192 / 44100 * 1000), which is written to a ByteArray.
Whenever playback runs out of samples, onSampleData is called again, and at that point we can make changes to the audio.
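The buffer and period arithmetic above can be checked with a short sketch. This is Python rather than Haxe, purely so the numbers are easy to run and verify; the constants mirror the Haxe example:

```python
SAMPLE_RATE = 44100
BUFFER_SAMPLES = 8192

# Duration of one buffer: 8192 / 44100 * 1000 ~= 186 ms
duration_ms = BUFFER_SAMPLES / SAMPLE_RATE * 1000
print(round(duration_ms))  # 186

# Square-wave period, mirroring SquareWavePeriod / HalfSquareWavePeriod
TONE_HERTZ = 256
TONE_VOLUME = 16000
half_period = (SAMPLE_RATE // TONE_HERTZ) // 2
print(half_period)  # 86 samples per half wave

# One buffer's worth of interleaved stereo samples, like onSampleData writes
samples = []
for i in range(BUFFER_SAMPLES):
    value = TONE_VOLUME if (i // half_period) % 2 == 0 else -TONE_VOLUME
    samples.append(value)  # left channel
    samples.append(value)  # right channel
print(len(samples))  # 16384 floats = 8192 stereo frames
```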
Now, if we only want to generate a one-second-long square wave, we don’t really need SampleDataEvent at all, because we can generate the complete ByteArray up front and use loadPCMFromByteArray.
Here’s another example:
package;

import openfl.display.Sprite;
import openfl.media.Sound;
import openfl.utils.ByteArray;

class Main extends Sprite
{
    private var SampleRate = 44100;
    private var ToneHertz = 256;
    private var ToneVolume = 16000;
    private var RunningSampleIndex = 0;
    private var SquareWavePeriod = 0;
    private var HalfSquareWavePeriod = 0;
    private var sound:Sound = new Sound();
    private var bytes:ByteArray = new ByteArray();

    public function new()
    {
        super();
        SquareWavePeriod = Std.int(SampleRate / ToneHertz);
        HalfSquareWavePeriod = Std.int(SquareWavePeriod / 2);
        for (i in 0...44100)
        {
            var SampleValue = (Std.int(RunningSampleIndex++ / HalfSquareWavePeriod) % 2 == 0) ? ToneVolume : -ToneVolume;
            bytes.writeFloat(SampleValue);
            bytes.writeFloat(SampleValue);
        }
        bytes.position = 0;
        sound.loadPCMFromByteArray(bytes, Std.int(bytes.length / 4 / 2), "float", true, 44100);
        sound.play();
    }
}
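A note on the second argument of loadPCMFromByteArray in the example above: it is the number of sample frames, hence bytes.length / 4 / 2, because each sample is a 4-byte float and the data is stereo (two floats per frame). The same arithmetic, checked in Python (illustration only; the constant 0.5 sample value is just a placeholder):

```python
import struct

SAMPLE_RATE = 44100
frames = SAMPLE_RATE  # one second of audio, as in the example above

# Build an interleaved stereo buffer of 4-byte floats, the same size layout
# ByteArray.writeFloat produces (byte order is irrelevant for the length math).
data = b"".join(struct.pack("<ff", 0.5, 0.5) for _ in range(frames))

byte_length = len(data)
print(byte_length)            # 352800 bytes (44100 frames * 2 channels * 4 bytes)
print(byte_length // 4 // 2)  # 44100 -> the frame count passed to loadPCMFromByteArray
```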
Thanks! This is working now on the Flash, Neko, Mac, and Windows targets, and it doesn’t have any delay! lol
OpenFL 5.1.5
Lime 5.2.1
Main.hx
package;

import openfl.display.Sprite;
import openfl.Lib;

class Main extends Sprite
{
    public function new()
    {
        super();
        var player = new PlayerNC();
    }
}
PlayerNC.hx
package;

import haxe.io.Float32Array;

class PlayerNC extends NativeChannel
{
    private var SampleRate = 44100;
    private var ToneHertz = 256;
    private var ToneVolume = 16000;
    private var RunningSampleIndex = 0;
    private var SquareWavePeriod = 0;
    private var HalfSquareWavePeriod = 0;

    public function new()
    {
        super(8192);
        SquareWavePeriod = Std.int(SampleRate / ToneHertz);
        HalfSquareWavePeriod = Std.int(SquareWavePeriod / 2);
    }

    override function onSample(out:haxe.io.Float32Array)
    {
        for (i in 0...out.length)
        {
            var SampleValue = (Std.int(RunningSampleIndex++ / HalfSquareWavePeriod) % 2 == 0) ? ToneVolume : -ToneVolume;
            out[i] = SampleValue;
        }
    }
}
NativeChannel.hx
package;

#if lime_openal
import lime.media.openal.AL;
import lime.media.openal.ALBuffer;
import lime.media.openal.ALSource;

private class ALChannel {
    var native : NativeChannel;
    var samples : Int;
    var buffers : Array<ALBuffer>;
    var src : ALSource;
    var fbuf : haxe.io.Bytes;
    var ibuf : haxe.io.Bytes;
    var iview : lime.utils.ArrayBufferView;

    public function new(samples, native) {
        this.native = native;
        this.samples = samples;
        buffers = AL.genBuffers(2);
        src = AL.genSource();
        AL.sourcef(src, AL.PITCH, 1.0);
        AL.sourcef(src, AL.GAIN, 1.0);
        fbuf = haxe.io.Bytes.alloc(samples << 3); // stereo Float32: samples * 2 channels * 4 bytes
        ibuf = haxe.io.Bytes.alloc(samples << 2); // stereo Int16: samples * 2 channels * 2 bytes
        iview = new lime.utils.Int16Array(ibuf);
        for (b in buffers)
            onSample(b);
        forcePlay();
        lime.app.Application.current.onUpdate.add(onUpdate);
    }

    public function stop() {
        if (src != null) {
            lime.app.Application.current.onUpdate.remove(onUpdate);
            AL.sourceStop(src);
            AL.deleteSource(src);
            AL.deleteBuffers(buffers);
            src = null;
            buffers = null;
        }
    }

    @:noDebug function onSample(buf : ALBuffer) {
        @:privateAccess native.onSample(haxe.io.Float32Array.fromBytes(fbuf));
        // Convert Float32 to Int16
        #if cpp
        var fb = fbuf.getData();
        var ib = ibuf.getData();
        for (i in 0...samples << 1)
            untyped __global__.__hxcpp_memory_set_i16(ib, i << 1, __global__.__int__(__global__.__hxcpp_memory_get_float(fb, i << 2) * 0x7FFF));
        #else
        for (i in 0...samples << 1) {
            var v = Std.int(fbuf.getFloat(i << 2) * 0x7FFF);
            ibuf.set(i << 1, v);
            ibuf.set((i << 1) + 1, v >>> 8);
        }
        #end
        AL.bufferData(buf, AL.FORMAT_STEREO16, iview, ibuf.length, 44100);
        AL.sourceQueueBuffers(src, 1, [buf]);
    }

    inline function forcePlay() {
        if (AL.getSourcei(src, AL.SOURCE_STATE) != AL.PLAYING)
            AL.sourcePlay(src);
    }

    function onUpdate(i : Int) {
        var r = AL.getSourcei(src, AL.BUFFERS_PROCESSED);
        if (r > 0) {
            for (b in AL.sourceUnqueueBuffers(src, r))
                onSample(b);
            forcePlay();
        }
    }
}
#end

class NativeChannel {
    #if flash
    var snd : flash.media.Sound;
    var channel : flash.media.SoundChannel;
    #elseif js
    static var ctx : js.html.audio.AudioContext;
    static function getContext() {
        if (ctx == null) {
            try {
                ctx = new js.html.audio.AudioContext();
            } catch (e : Dynamic) try {
                ctx = untyped __js__('new window.webkitAudioContext()');
            } catch (e : Dynamic) {
                ctx = null;
            }
        }
        return ctx;
    }
    var sproc : js.html.audio.ScriptProcessorNode;
    var tmpBuffer : haxe.io.Float32Array;
    #elseif lime_openal
    var channel : ALChannel;
    #end

    public var bufferSamples(default, null) : Int;

    public function new(bufferSamples : Int) {
        this.bufferSamples = bufferSamples;
        #if flash
        snd = new flash.media.Sound();
        snd.addEventListener(flash.events.SampleDataEvent.SAMPLE_DATA, onFlashSample);
        channel = snd.play(0, 0x7FFFFFFF);
        #elseif js
        var ctx = getContext();
        if (ctx == null) return;
        sproc = ctx.createScriptProcessor(bufferSamples, 2, 2);
        tmpBuffer = new haxe.io.Float32Array(bufferSamples * 2);
        sproc.connect(ctx.destination);
        sproc.onaudioprocess = onJsSample;
        #elseif lime_openal
        channel = new ALChannel(bufferSamples, this);
        #end
    }

    #if flash
    function onFlashSample(event : flash.events.SampleDataEvent) {
        var buf = event.data;
        buf.length = bufferSamples * 2 * 4;
        buf.position = 0;
        onSample(haxe.io.Float32Array.fromBytes(haxe.io.Bytes.ofData(buf)));
        buf.position = bufferSamples * 2 * 4;
    }
    #end

    #if js
    function onJsSample(event : js.html.audio.AudioProcessingEvent) {
        onSample(tmpBuffer);
        // split the channels and copy to output
        var r = 0;
        var left = event.outputBuffer.getChannelData(0);
        var right = event.outputBuffer.getChannelData(1);
        for (i in 0...bufferSamples) {
            left[i] = tmpBuffer[r++];
            right[i] = tmpBuffer[r++];
        }
    }
    #end

    function onSample(out : haxe.io.Float32Array) {
    }

    public function stop() {
        #if flash
        if (channel != null) {
            channel.stop();
            channel = null;
        }
        #elseif js
        if (sproc != null) {
            sproc.disconnect();
            sproc = null;
        }
        #elseif lime_openal
        if (channel != null) {
            channel.stop();
            channel = null;
        }
        #end
    }
}
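For reference, the non-cpp branch of ALChannel.onSample converts each Float32 sample to a little-endian Int16 by scaling by 0x7FFF and writing the low byte, then the high byte. A Python sketch of that conversion (illustration only, mirroring the Haxe fallback; note that, like the original, it doesn't clamp out-of-range input):

```python
import struct

def float32_to_int16_bytes(samples):
    """Scale floats in [-1, 1] to Int16 and pack them little-endian,
    like the #else branch of ALChannel.onSample."""
    out = bytearray()
    for f in samples:
        v = int(f * 0x7FFF)          # scale to the 16-bit range, truncating like Std.int
        out.append(v & 0xFF)         # low byte (ibuf.set(i << 1, v))
        out.append((v >> 8) & 0xFF)  # high byte (v >>> 8 in the Haxe code)
    return bytes(out)

packed = float32_to_int16_bytes([0.0, 0.5, -0.5, 1.0])
print(struct.unpack("<4h", packed))  # (0, 16383, -16383, 32767)
```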
Not included, but now that things are quieting down a little, I hope there will be more time. Ideally, we’d come up with an API that makes sense on the Lime side, then consume that from OpenFL (rather than making platform-specific calls), but either way, I understand it’s better to have it working than not.
Just a little reminder that I made a pull request some time ago.
I understand that it would surely fit better in the Lime layer, but, as you already implied, it’s better than nothing.
Well, it worked flawlessly back then, but since I don’t know what has changed in OpenFL in the meantime, I’m not too sure whether it still works.