Is SampleDataEvent implemented for Windows?

Hey, I just want to shed some light on this.
You can substitute loadPCMFromByteArray for SampleDataEvent, even though
it isn't exactly the same thing.
Here's an example using SampleDataEvent, which generates a one-second-long
square wave:

package;

import openfl.display.Sprite;
import openfl.media.Sound;
import openfl.events.SampleDataEvent;

class Main extends Sprite 
{
    private var SampleRate = 44100;       // samples per second
    private var ToneHertz = 256;          // frequency of the square wave
    private var ToneVolume = 16000;       // amplitude of the square wave
    private var RunningSampleIndex = 0;
    private var SquareWavePeriod = 0;     // samples per full wave cycle
    private var HalfSquareWavePeriod = 0;
    private var sound:Sound = new Sound();
    
    public function new() 
    {
        super();
        SquareWavePeriod = Std.int(SampleRate / ToneHertz);
        HalfSquareWavePeriod = Std.int(SquareWavePeriod / 2);
        
        sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
        sound.play();
    }
    
    private function onSampleData(event:SampleDataEvent):Void
    {
        for (i in 0...8192) 
        { 
            var SampleValue = (Std.int(RunningSampleIndex++ / HalfSquareWavePeriod) % 2 == 0) ? ToneVolume : -ToneVolume;
            event.data.writeFloat(SampleValue); 
            event.data.writeFloat(SampleValue); 
            if (RunningSampleIndex == 44100)
            {
                sound.removeEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
                break;
            }
        }
    }
}

If you take a look at the function onSampleData you'll find
a for loop that counts up to 8192. This is actually our buffer of samples.
At a playback rate of 44100 Hz this means we're generating roughly 186 ms of audio
(8192 / 44100 * 1000), which is written to a ByteArray.
So whenever we run out of samples, onSampleData is called again and we
can make changes to the audio.
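To make that buffer math concrete, here is a quick back-of-the-envelope sketch (plain Haxe, nothing OpenFL-specific; the class and variable names are mine):

```haxe
class BufferMath {
	static function main() {
		var bufferSamples = 8192; // sample frames written per SAMPLE_DATA callback in the example above
		var sampleRate = 44100;   // playback rate in Hz
		// duration of one buffer in milliseconds
		var ms = bufferSamples / sampleRate * 1000;
		trace(ms); // roughly 185.76 ms
	}
}
```

So the callback fires a little over five times per second, which is why per-callback changes to the audio feel nearly immediate.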

Now, if we only want to generate a one-second square wave, we don't really need
SampleDataEvent, because we can generate the complete ByteArray upfront and use
loadPCMFromByteArray.

Here’s another example:

package;

import openfl.display.Sprite;
import openfl.media.Sound;
import openfl.utils.ByteArray;

class Main extends Sprite 
{
    private var SampleRate = 44100;       // samples per second
    private var ToneHertz = 256;          // frequency of the square wave
    private var ToneVolume = 16000;       // amplitude of the square wave
    private var RunningSampleIndex = 0;
    private var SquareWavePeriod = 0;     // samples per full wave cycle
    private var HalfSquareWavePeriod = 0;
    private var sound:Sound = new Sound();
    private var bytes:ByteArray = new ByteArray();
    
    public function new() 
    {
        super();
        SquareWavePeriod = Std.int(SampleRate / ToneHertz);
        HalfSquareWavePeriod = Std.int(SquareWavePeriod / 2);
        
        for (i in 0...44100) 
        { 
            var SampleValue = (Std.int(RunningSampleIndex++ / HalfSquareWavePeriod) % 2 == 0) ? ToneVolume : -ToneVolume;
            bytes.writeFloat(SampleValue); 
            bytes.writeFloat(SampleValue); 
        }        
        
        bytes.position = 0;
        // second argument: number of sample frames = byte length / 4 bytes per float / 2 channels
        sound.loadPCMFromByteArray(bytes, Std.int(bytes.length / 4 / 2), "float", true, 44100);
        sound.play();
    }
}
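A quick note on the frame-count argument passed to loadPCMFromByteArray: each sample frame in the ByteArray is two channels times four bytes (one 32-bit float per channel), so the arithmetic works out like this (a small standalone sketch, names are mine):

```haxe
class FrameCount {
	static function main() {
		var seconds = 1;
		var sampleRate = 44100;
		var bytesPerFloat = 4; // writeFloat() stores a 32-bit float
		var channels = 2;      // each sample is written twice: left and right
		var byteLength = seconds * sampleRate * channels * bytesPerFloat;
		// mirrors the Std.int(bytes.length / 4 / 2) argument in the example above
		var frames = Std.int(byteLength / bytesPerFloat / channels);
		trace(frames); // 44100 frames, i.e. one second of stereo audio at 44100 Hz
	}
}
```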

I personally use this:

Support JS / Flash / Lime


@Starburst
That looks interesting. Maybe you can share a sample of how to use this library to generate
a one-second square wave from an OpenFL project?

You don't need the whole lib, just that file. Then extend it, override the onSample method, and write your samples into the array. Pretty simple.


Can you give a sample of how to do it? Because SampleDataEvent and loadPCMFromByteArray only work for Flash, and not for Mac and Windows…

Part of the fun is figuring it out. Just follow my instructions above, it's easy.

Thanks! This is working now on the Flash, Neko, Mac, and Windows targets, and it doesn't have any delay! lol
OpenFL 5.1.5
Lime 5.2.1

Main.hx

package;

import openfl.display.Sprite;
import openfl.Lib;


class Main extends Sprite 
{

	public function new() 
	{
		super();
		
		var player = new PlayerNC();
	}

}

PlayerNC.hx

package;
import haxe.io.Float32Array;

class PlayerNC extends NativeChannel
{
	private var SampleRate = 44100;
	private var ToneHertz = 256;
	private var ToneVolume = 16000 / 32767; // NativeChannel expects float samples in the -1..1 range
	private var RunningSampleIndex = 0;
	private var SquareWavePeriod = 0;
	private var HalfSquareWavePeriod = 0;

	public function new() 
	{
		super(8192);
		
		SquareWavePeriod = Std.int(SampleRate / ToneHertz);
		HalfSquareWavePeriod = Std.int(SquareWavePeriod / 2);
	}
	
	override function onSample(out:haxe.io.Float32Array) 
	{
		for (i in 0...out.length)
		{
			var SampleValue = (Std.int(RunningSampleIndex++ / HalfSquareWavePeriod) % 2 == 0) ? ToneVolume : -ToneVolume;
			out[i] = SampleValue;
		}
	}
	
}

NativeChannel.hx

package;

#if lime_openal
import lime.media.openal.AL;
import lime.media.openal.ALBuffer;
import lime.media.openal.ALSource;

private class ALChannel {
	var native : NativeChannel;
	var samples : Int;

	var buffers : Array<ALBuffer>;
	var src : ALSource;

	var fbuf : haxe.io.Bytes;
	var ibuf : haxe.io.Bytes;
	var iview : lime.utils.ArrayBufferView;

	public function new(samples, native){
		this.native = native;
		this.samples = samples;
		buffers = AL.genBuffers(2);
		src = AL.genSource();
		AL.sourcef(src,AL.PITCH,1.0);
		AL.sourcef(src,AL.GAIN,1.0);
		fbuf = haxe.io.Bytes.alloc( samples<<3 );
		ibuf = haxe.io.Bytes.alloc( samples<<2 );
		iview = new lime.utils.Int16Array(ibuf);

		for ( b in buffers )
			onSample(b);
		forcePlay();
		lime.app.Application.current.onUpdate.add( onUpdate );
	}

	public function stop() {
		if ( src != null ){
			lime.app.Application.current.onUpdate.remove( onUpdate );

			AL.sourceStop(src);
			AL.deleteSource(src);
			AL.deleteBuffers(buffers);
			src = null;
			buffers = null;
		}
	}

	@:noDebug function onSample( buf : ALBuffer ) {
		@:privateAccess native.onSample(haxe.io.Float32Array.fromBytes(fbuf));

		// Convert Float32 to Int16
		#if cpp
		var fb = fbuf.getData();
		var ib = ibuf.getData();
		for( i in 0...samples<<1 )
			untyped __global__.__hxcpp_memory_set_i16( ib, i<<1, __global__.__int__(__global__.__hxcpp_memory_get_float( fb, i<<2 ) * 0x7FFF) );
		#else
		for ( i in 0...samples << 1 ) {
			var v = Std.int(fbuf.getFloat(i << 2) * 0x7FFF);
			ibuf.set( i<<1, v );
			ibuf.set( (i<<1) + 1, v>>>8 );
		}
		#end

		AL.bufferData(buf, AL.FORMAT_STEREO16, iview, ibuf.length, 44100);
		AL.sourceQueueBuffers(src, 1, [buf]);
	}

	inline function forcePlay() {
		if( AL.getSourcei(src,AL.SOURCE_STATE) != AL.PLAYING )
			AL.sourcePlay(src);
	}

	function onUpdate( i : Int ){
		var r = AL.getSourcei(src,AL.BUFFERS_PROCESSED);
		if( r > 0 ){
			for( b in AL.sourceUnqueueBuffers(src,r) )
				onSample(b);
			forcePlay();
		}
	}
}
#end

class NativeChannel {

	#if flash
	var snd : flash.media.Sound;
	var channel : flash.media.SoundChannel;
	#elseif js
	static var ctx : js.html.audio.AudioContext;
	static function getContext() {
		if( ctx == null ) {
			try {
				ctx = new js.html.audio.AudioContext();
			} catch( e : Dynamic ) try {
				ctx = untyped __js__('new window.webkitAudioContext()');
			} catch( e : Dynamic ) {
				ctx = null;
			}
		}
		return ctx;
	}
	var sproc : js.html.audio.ScriptProcessorNode;
	var tmpBuffer : haxe.io.Float32Array;
	#elseif lime_openal
	var channel : ALChannel;
	#end
	public var bufferSamples(default, null) : Int;

	public function new( bufferSamples : Int ) {
		this.bufferSamples = bufferSamples;
		#if flash
		snd = new flash.media.Sound();
		snd.addEventListener(flash.events.SampleDataEvent.SAMPLE_DATA, onFlashSample);
		channel = snd.play(0, 0x7FFFFFFF);
		#elseif js
		var ctx = getContext();
		if( ctx == null ) return;
		sproc = ctx.createScriptProcessor(bufferSamples, 2, 2);
		tmpBuffer = new haxe.io.Float32Array(bufferSamples * 2);
		sproc.connect(ctx.destination);
		sproc.onaudioprocess = onJsSample;
		#elseif lime_openal
		channel = new ALChannel(bufferSamples, this);
		#end
	}

	#if flash
	function onFlashSample( event : flash.events.SampleDataEvent ) {
		var buf = event.data;
		buf.length = bufferSamples * 2 * 4;
		buf.position = 0;
		onSample(haxe.io.Float32Array.fromBytes(haxe.io.Bytes.ofData(buf)));
		buf.position = bufferSamples * 2 * 4;
	}
	#end

	#if js
	function onJsSample( event : js.html.audio.AudioProcessingEvent ) {
		onSample(tmpBuffer);
		// split the channels and copy to output
		var r = 0;
		var left = event.outputBuffer.getChannelData(0);
		var right = event.outputBuffer.getChannelData(1);
		for( i in 0...bufferSamples ) {
			left[i] = tmpBuffer[r++];
			right[i] = tmpBuffer[r++];
		}
	}
	#end

	function onSample( out : haxe.io.Float32Array ) {
	}

	public function stop() {
		#if flash
		if( channel != null ) {
			channel.stop();
			channel = null;
		}
		#elseif js
		if( sproc != null ) {
			sproc.disconnect();
			sproc = null;
		}
		#elseif lime_openal
		if( channel != null ) {
			channel.stop();
			channel = null;
		}
		#end
	}

}

We do support loadPCMFromByteArray nowadays, but let me know if anyone has problems with it.

Any update on SampleDataEvent in OpenFL 8?

Not included, but now that things are quieting down a little, I hope there will be more time. Ideally, we'd come up with an API that makes sense on the Lime side, then consume that from OpenFL (rather than making platform-specific calls), but either way, I understand it's better to have it working than not :slight_smile:


Just a little reminder that I made a pull request some time ago.
I understand that it would surely fit better in the Lime layer but, as you already implied, it's better than nothing. :grinning:
Well, it worked flawlessly back then, but since I don't know what has changed in OpenFL in the meantime, I'm not too sure whether it still works.


In which versions of Lime and OpenFL does it work? I'm working with an old version, maybe it will work for me?

This isn't ideal, because SampleDataEvent should of course be implemented eventually; it would help me a lot if it were. But here's an example I made for one of my libs, demonstrating a synth working on top of OpenAL using NativeChannel from Heaps:

The key file is here:

Alas, I can't remember what version it was. :wink:
But I have good news: it still works with OpenFL 8.0.1! There's just one line that needs to be changed:

We need to change it to this:

if (SoundMixer.__soundChannels.length >= SoundMixer.MAX_ACTIVE_CHANNELS) {

The reason is simple: if we just use the Sound object to listen for a SampleDataEvent, __buffer will be null upon initialization and the events won't ever get fired.


So this pattern is the music we hear? How do you learn to do that? I mean, how do you compose the music in that array?

		patterns.push(["g-2|g-4", "", "", "a-4", "a#4", "", "d-2|a-4", "", "g-4", "", "f-4", "", "g-2|g-4", "", "d-4", "", "", "", "", "", "", "", "d-4", "", "g-2|g-4", "", "", "a-4", "a#4", "", "c-2|c-5", "", "a#4", "", "c-5", "", "d-2|d-5", "", "", "", "", "", "", "", "", "", "d-5", ""]);

I finally had time to test your changes, and it's working just fine!

It's a bit abstract, isn't it? :wink:
I've been using arrays to create music since the glory days of my TI-99/4A, so I'm quite
used to it. However, since the whole sequencer in this example is heavily based on
the .mod format, I used a tracker to compose the melody and merely wrote the notes down
into the array afterwards.


"a#4" is a note, that much I understand.
"" is silence, maybe?
"g-2|g-4" I don't understand.

Where does the tempo go?

I've also tested the changes proposed by obscure; they seem to be working really well. Any chance of integrating them into the release?


This is working really great now!

video

Does anyone know how to make an envelope for the white noise / percussion?