openfl.media.Video

@Retard Sorry for the slow reply!

I wondered whether you’d made more progress. If I were to add this to the latest Lime, what would I need to transfer across?

@Retard every time after I run a video, the renderer becomes quite scrambled. Is there a way to unload the video after running it and safely return to the application? Thanks

Hi! I’ve managed to make some progress on video (I have working video with sound almost in sync), but the problem is, as you pointed out, a renderer error after playback. It seems the renderer is damaged after the first call to RenderCopy when the texture is in YUV format, and I’ve tried to tackle it in various ways without any success (creating new renderers, switching renderers, etc.)… Which is weird, because in a plain C++ app the SDL renderer keeps working: the movie ends and the app is still able to show textures in other pixel formats :confused:

Here are the sources; maybe someone could fix this error ^^ If not, there are the OGV .cpp and .h files, which could be used in other apps with SDL2 and OpenAL:
–link is in next post–

Here is a working app in Neko (far quicker to compile, and it displays the movie and shows the problem -> the lime texture should be visible after the movie ends):

And here is just the C++ version of the movie player sources, with a compiled exe, to check that after the movie ends the bitmap is displayed correctly in C++:

#edit
Well, it is not working on Linux :confused: rendering is broken from the start, and the app crashes after the movie finishes, but I do not have a clue where the error could be.

Generally the drawing functions in SDL need a pointer to a texture and a renderer. I get the renderer pointer from a function I added to lime/graphics/Renderer.hx:

	public function getBackend ():Dynamic {
		
		return backend.handle;
		
	}

I assume it is a pointer to the SDLRenderer object from lime.ndll.
It is passed to Lime as Dynamic, cast to void*, and passed to the OGV constructor; finally, pointers to SDL_Renderer and SDL_Window are extracted from it (SDL_Window is needed for setting proper scale boundaries). The window dimensions are read properly, so I think the pointers are correct.
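To make the pointer hand-off concrete, here is a minimal C++ sketch of that unpacking step. The struct layout below is purely hypothetical (the real one lives in Lime’s SDL backend), and the forward declarations stand in for the SDL2 headers; treat it as an illustration of the cast chain, not the actual Lime code:

```cpp
// Opaque SDL types; forward declarations are enough for passing pointers
// around, so this sketch compiles without the SDL2 headers.
struct SDL_Window;
struct SDL_Renderer;

// Hypothetical layout of the object behind backend.handle; the actual
// Lime backend struct may name and order its fields differently.
struct SDLRendererBackend {
    SDL_Window*   sdlWindow;
    SDL_Renderer* sdlRenderer;
};

// What an OGV-style constructor would do with the void* received from Haxe.
static SDL_Renderer* extractRenderer (void* handle) {
    return static_cast<SDLRendererBackend*> (handle)->sdlRenderer;
}

static SDL_Window* extractWindow (void* handle) {
    return static_cast<SDLRendererBackend*> (handle)->sdlWindow;
}
```

A quick sanity check on the real handle is to read the window size back (e.g. via SDL_GetWindowSize) and compare it with what Haxe reports, which is essentially what the post above describes.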
In ExternalInterface I added two functions: lime_video_load and lime_video_load_frame.
If somebody could check it or help, I would appreciate it :wink:
Cheers :)

Hi!

That’s great you’ve been making progress on this! I’m still very interested in this as well so I’m pleased that you are having some success.

I found that after I returned from a video the pixels were all offset, so with some help I was able to reset the GL state:

				renderSession.gl.depthRange(0, 1);
				renderSession.gl.clearDepth(1);
				renderSession.gl.depthMask(true);
				renderSession.gl.colorMask(true,true,true,true);
				renderSession.gl.disable(renderSession.gl.DEPTH_TEST);
				renderSession.gl.frontFace(renderSession.gl.CCW);
				renderSession.gl.enable( renderSession.gl.BLEND );
				renderSession.gl.blendFunc(renderSession.gl.SRC_ALPHA, renderSession.gl.ONE_MINUS_SRC_ALPHA);
				renderSession.gl.disable(renderSession.gl.CULL_FACE);
				renderSession.gl.disable(renderSession.gl.SCISSOR_TEST);
				renderSession.gl.disable(renderSession.gl.SAMPLE_ALPHA_TO_COVERAGE);
				if ( renderSession.gl.MAX_TEXTURE_IMAGE_UNITS == 0) renderSession.gl.MAX_TEXTURE_IMAGE_UNITS = renderSession.gl.getParameter(renderSession.gl.MAX_TEXTURE_IMAGE_UNITS);
				
				for (i in 0...renderSession.gl.MAX_TEXTURE_IMAGE_UNITS) {
					renderSession.gl.activeTexture(renderSession.gl.TEXTURE0 + i);
					renderSession.gl.bindTexture(renderSession.gl.TEXTURE_CUBE_MAP, null);
					renderSession.gl.bindTexture(renderSession.gl.TEXTURE_2D, null);
				}
				renderSession.gl.bindBuffer(renderSession.gl.ARRAY_BUFFER, null);
				renderSession.gl.bindBuffer(renderSession.gl.ELEMENT_ARRAY_BUFFER, null);
				renderSession.gl.clear(1);
				renderSession.gl.clearColor(1, 1, 1, 1);
				
				renderSession.gl.pixelStorei(renderSession.gl.UNPACK_ALIGNMENT, 4); // reset to the GL default (0 is not a valid alignment)
				renderSession.gl.pixelStorei(0x0CF2, 0); // 0x0CF2 = GL_UNPACK_ROW_LENGTH; 0 is its default
				
				
				renderSession.gl.useProgram(null);

Very messy I know! But it solved the problem and I was able to return to game rendering after that.

Is that helpful for your issue?

I don’t have a Linux environment yet so can’t offer anything useful in regards to that at this stage.

Hi!
I’ve been able to get a working Linux version, and it seems that you have found a solution to the graphics mess-up.
On the Lime side, adding just

renderSession.gl.pixelStorei(0x0CF2, 0);

after the movie fixes the graphics (in the SimpleImage sample, where OpenGL is being set up). For reference, 0x0CF2 is GL_UNPACK_ROW_LENGTH, so this resets the unpack row stride to its default of 0. Could you check this? Here are new Lime sources with modified SimpleImage sources (it shows the video, and then the lime.png file).

Generally I’m still working on this, but for now the changes are as follows:

  1. added a few functions in project/src/backend/sdl/sdlRenderer.cpp and sdlRenderer.h (added a new SDLTexture to the class, plus methods to init/deinit/update it)
  2. added 2 functions in project/src/ExternalInterface.cpp (exposing functions from project/src/video/OGV.cpp)
  3. added the folder project/src/video and the files inside (C++ Theora decoding functions)
  4. added a few Theora-related lines in project/build.xml (the location of OGV.cpp and the Theora decoder lib files)
  5. added the folder lime/video and the file inside (the Haxe connection with lime.ndll)
  6. edited lime/graphics/Renderer.hx: added a getBackend function (the OGV class needs this pointer)
  7. added project/lib/theora with its files (the Theora decoder lib)

Sorry about the delay! When testing this on Mac, I received the following error when doing lime rebuild mac

Do you know what might be causing this?

It seems that I’m linking against my own C++ SDL (and probably OpenAL) headers instead of those in the project folder. But I’ve run into a problem I cannot solve: it seems that OpenFL doesn’t work with SDL textures (I couldn’t find any linkage with the lime.ndll functions that operate on SDL textures and update the SDLRenderer :confused: Generally I hardly understand anything about the graphics part of OpenFL :smiley: )
I think I could extract pointers to the YUV planes to implement the shader functionality, but I don’t know shaders yet. So for now I’ll leave it again :confused:

And for the error -> try cutting the SDL2/ prefix, leaving just <SDL.h> in OGV.h

On the other hand, do you know a fast method of swapping the RGBA pixels of an OpenFL Sprite? Could this be done easily (or at all)?

Hey, just checking in to say I’m still here, and very excited about progress.

Please keep us up to date, and thank you very much for your efforts!


Yeah, swapping pixels on a sprite can be done, and it seems to be reasonable in terms of performance. This post should give you an idea of how to do it. Up till now this has been working fine for me for playing back videos.
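If the “swapping” in question turns out to be channel order rather than wholesale pixel upload (decoders often emit BGRA while OpenFL-side code expects RGBA), the CPU-side fix is a simple byte swap per 32-bit pixel. A hedged sketch, with names of my own invention:

```cpp
#include <cstdint>
#include <cstddef>

// Swap the red and blue channels of 32-bit pixels in place,
// turning BGRA data into RGBA (or vice versa).
static void swapRedBlue (std::uint8_t* pixels, std::size_t pixelCount) {
    for (std::size_t i = 0; i < pixelCount; ++i) {
        std::uint8_t* p = pixels + i * 4;
        std::uint8_t tmp = p[0];
        p[0] = p[2];
        p[2] = tmp;
    }
}
```

On large frames this loop is worth vectorizing or pushing into the native side, which is presumably why a fast path matters for playback performance.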

Hi, sorry for not responding for so long. I think I’m starting to nail the problem down :wink:
Here is a link to the zip file (zipped under Linux, so on Windows you should probably open it with 7-Zip, not the system zip tool):


–edited:
here was a link to a version in which openfl was in an openfl/3.6.1/3.6.1 dir (double 3.6.1); fixed in the new link
–end

in the lime folder there is my full Lime source
openfl - the full OpenFL source
pong folder -> the first few lessons from the haxecoder Pong intro, with video usage.

Generally the video works (at least on my machine), so if you’re able to set up the folders correctly (so haxelib can find the lime and openfl folders from this zip), compile Lime (it works on my machine, but I also fixed a missing <SDL/sdl.h> file), and compile the test app, it should play Big Buck Bunny.

Generally there is plenty of stuff still to do, for example:

  1. switch opening the movie from FILE* to lime::file in OGV.cpp (which maps to SDL_RWops*, so it could work on Android as well)
  2. remove the duplicated music code (I’m using vorbisfile to manage the video’s music -> it uses intrinsics where possible, so it’s faster than manually switching floats to chars while filling the OpenAL audio buffers)
  3. handle the decoupled code (playing video needs to know about SDL_Renderer, music needs to know about OpenAL, etc.)
  4. link the C code with Haxe.
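For reference, the “manually switching floats to chars” step mentioned above is just scaling and clamping each float sample into the 16-bit range OpenAL expects for its 16-bit buffer formats. A minimal sketch (the helper name is mine, not from the sources):

```cpp
#include <cstdint>
#include <vector>

// Convert float samples in [-1.0, 1.0] to signed 16-bit PCM,
// clamping out-of-range values instead of letting them wrap.
static std::vector<std::int16_t> floatToPcm16 (const std::vector<float>& in) {
    std::vector<std::int16_t> out;
    out.reserve(in.size());
    for (float s : in) {
        if (s > 1.0f)  s = 1.0f;
        if (s < -1.0f) s = -1.0f;
        out.push_back(static_cast<std::int16_t>(s * 32767.0f));
    }
    return out;
}
```

A vectorized decoder path (like vorbisfile’s) does essentially this per buffer fill, just with SIMD instead of a scalar loop.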

I avoided copying data into a sprite by extending Sprite itself and overriding the __renderGL method -> if the video is added to the stage, it should be handled automatically. But there should be nothing else on the stage; if there is, the video will be hidden (?!).

If you have trouble compiling this, I’ll try to help :slight_smile:


Hmm, has anyone tried to compile Lime? Any errors with the include files?

Paging @JayMaxArmstrong

Hmm… by this do you mean that you have the same problem as @JayMaxArmstrong with the SDL include file?

Ah, and in the zip there are Lime 2.9.1 and OpenFL 3.6.1.

No no, I just can’t test this right now; I was calling him to the thread as he’s interested too :slight_smile:

I’ve started a git repo for this.


It could be easier to check the code this way…

The version on git has the renderer and audio decoupled from the video decoder -> but at the cost of only decoding video for now, without frame-time checking and without sound :confused:


I’m sorry to bring this up again so much later, but did you get WebM working with OpenFL next? I’ve managed good quality/memory/performance by removing the sound code from the C++ interface (which contained a memory leak, AFAIK), but on legacy, and I can’t get it to work properly on -Dnext (using OpenFL 3.6.1 because of Flixel :confused: ).

(Also paging @singmajesty because this seems to be reaching a level of maturity that might interest him)

For anyone googling this thread, this is my fork of the lib.


I’ve updated the repo; the movie in the test project should now play with correct timing and proper scaling, but without sound :confused:

I managed to test it on Windows and got a green screen; it seems it only works on Linux for now (maybe Mac as well).
And probably the only option would be to do the OpenGL shader stuff, because SDL’s draw-texture path doesn’t work on Windows :confused:

#####edit:

@MikeEvmm there is libyuv in the WebM package; maybe you could use it in your library tests, as it could speed up the conversion to RGB (but as far as I remember I got poor performance from that library, poorer than having SDL draw the YUV texture directly). The other option is, as Joshua suggested, to get the YUV planes and strides into Haxe and draw them with a shader (but that could also be a time eater: for a 1000x1000 px frame there will be ~1.5 MB of data passed to the GPU every frame :confused: and generally, in e.g. the VLC player, the YUV->RGB conversion code runs on the CPU).
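The ~1.5 MB per frame figure follows from the YUV 4:2:0 layout: one full-resolution Y plane plus two chroma planes subsampled 2x2, i.e. 1.5 bytes per pixel. A quick sanity check of the arithmetic (the helper name is mine):

```cpp
#include <cstddef>

// Bytes needed for one YUV 4:2:0 frame: Y at full resolution,
// U and V each subsampled 2x2 (a quarter of the pixels each).
static std::size_t yuv420FrameBytes (std::size_t width, std::size_t height) {
    std::size_t y  = width * height;
    std::size_t uv = (width / 2) * (height / 2); // per chroma plane
    return y + 2 * uv;
}
```

At 30 fps that is roughly 45 MB/s of uploads for a 1000x1000 video, which is why both the shader route and the SDL YUV-texture route try to avoid an extra CPU-side RGB conversion.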
And, on the other hand, how do you handle audio? AFAIK the container for WebM is a simplified version of Matroska, and every lib I could find for handling it was GPL’ed :confused:
I saw you are including Ogg headers, so you are decoding sound. Could you share how you are extracting the music from a WebM video file?

And I got the Windows version working. I had errors in the tools.n file (the “lime manager” was uploading bad libraries into the bin directory). I messed up my project .hx files :smirk: and it will take time to clean them up (now I don’t know what’s happening there :frowning:).


I’m not decoding sound at all! In fact, I know very little about video decoding and C++ extensions; I just removed all the code related to sound in the webm library @soywiz made, to stop a memory leak that was hiding somewhere. (Audio decoding was not working anyway.)

Regarding YUV to RGB conversion, I see no reason to convert to RGB; writing YUV directly using SDL seems like the best option.

Ah yes, you wrote that before :slight_smile:
Okay, in the original lib there’s a header for the MKV decoder with proper links; thank you ;)

#####edit:
I made changes to the Windows version, so hopefully it should work. Unfortunately, for now, the first thing after cloning should be to run the lime/tools.hxml file.

#####edit:
Windows VS2010 now works -> I compiled Lime and the test app with Visual Studio EE 2010, 32-bit, using SDL_Textures. The movie is shown but, as you can see, while changing the window size the OpenGL context is frozen, and it looks crappy.
MinGW sometimes works (but I think it depends on the MinGW version being used -> some versions broke the debugger even on a simple int main(){printf("%d",1);return 0;}, so it’s a lottery I guess). Android “should” open the movie file, but it does not render (logcat shows the movie values, width and height, correctly).
Linux works for both 64-bit and 32-bit.
