Messages - tjm

#1
Paul,

I just emailed you a working solution for streaming sources, then realized the methods and variables are misnamed! Calculations are returned as seconds, not milliseconds.

It hasn't been put through rigorous testing, but initial tests show it returns elapsed play time with an accuracy of about 10ms for short audio files (10 seconds or less), and within a few milliseconds for long audio files (around 5 minutes). The inaccuracy is probably due to rounding errors.
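[Editor's note: the kind of cumulative rounding error described here is easy to reproduce: summing many small per-buffer float durations drifts away from computing the duration once from the total byte count. A minimal sketch of the effect; the class and method names are illustrative, not SoundSystem's.]

```java
public class DriftDemo {
    // Duration of one buffer in seconds: bytes / bytesPerSample / sampleRate.
    static float bufferSeconds(int bytes, int bytesPerSample, int sampleRate) {
        return (bytes / (float) bytesPerSample) / sampleRate;
    }

    public static void main(String[] args) {
        int sampleRate = 44100;
        int bytesPerSample = 2;   // 16-bit mono
        int bufferBytes = 4410;   // 50 ms of audio per buffer
        int buffers = 6000;       // about 5 minutes of audio

        // Accumulate per-buffer durations in float, as a streaming channel would:
        float accumulated = 0f;
        for (int i = 0; i < buffers; i++)
            accumulated += bufferSeconds(bufferBytes, bytesPerSample, sampleRate);

        // Compute the same duration once, in double, from the total byte count:
        double exact = ((double) bufferBytes * buffers / bytesPerSample) / sampleRate;

        System.out.printf("accumulated=%.6f exact=%.6f drift=%.6f%n",
                accumulated, exact, exact - accumulated);
    }
}
```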

--Tim.

#2
Quote1) the buffers appear to be continuously queued and unqueued even if a source is not playing
QuoteBut doesn't OpenAL mark all streaming buffers as 'processed' regardless of the actual processed state? Pretty sure I read that it does in the OpenAL programmer's guide.
QuotePossibly, but the result of that would be for OpenAL to report that it is further ahead in the stream than it actually is. It wouldn't explain buffers being continuously queued and unqueued whenever a source is not playing. When a streaming source is paused, stopped, or hasn't been played yet, there should not be any data being read in or buffers being created. I haven't had time to look at this yet, but it sounds like a bug to me.

This one is solved ... it's user error :-[

I have two versions of the same WAV file, one padded with silence and one without silence. I was loading from the package that contains the silence-padded file.

--Tim.
#3
Quote1) the buffers appear to be continuously queued and unqueued even if a source is not playing,
Yes, a SoundSystem streaming source.

If my understanding of SoundSystem is correct, once OpenAL processes buffers for a SoundSystem streaming source, those OpenAL buffers are unqueued, refilled, then requeued by the SoundSystem Channel.

But doesn't OpenAL mark all streaming buffers as 'processed' regardless of the actual processed state? Pretty sure I read that it does in the OpenAL programmer's guide.

QuoteBuffers are only trimmed once per clip (when the end of the data is reached)
I was thinking more along the lines of the WavCodec trimming its notion of a buffer, which isn't necessarily the same size as the OpenAL buffer. The result would be that the OpenAL buffer isn't 100% full, so this call

float bufSize = (float)AL10.alGetBufferi(intBuffer.get(0), AL10.AL_SIZE);

doesn't accurately reflect the audio data in the buffer, hence the cumulative errors I'm seeing.
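[Editor's note: for reference, AL_SIZE is reported in bytes, so converting it to a duration needs the bytes per sample frame for the format, not the bits. A sketch of that conversion; the class and method names are illustrative, and the format constants mirror the AL10 values so the snippet stands alone.]

```java
public class BufferMillis {
    // OpenAL format enums (values from al.h; normally AL10.AL_FORMAT_*):
    static final int AL_FORMAT_MONO8 = 0x1100, AL_FORMAT_MONO16 = 0x1101,
                     AL_FORMAT_STEREO8 = 0x1102, AL_FORMAT_STEREO16 = 0x1103;

    // AL_SIZE is in bytes, so divide by bytes per sample frame, not bits:
    static int bytesPerFrame(int alFormat) {
        switch (alFormat) {
            case AL_FORMAT_MONO8:    return 1;
            case AL_FORMAT_MONO16:   return 2;
            case AL_FORMAT_STEREO8:  return 2;
            case AL_FORMAT_STEREO16: return 4;
            default: throw new IllegalArgumentException("unknown AL format");
        }
    }

    // Duration in milliseconds of 'sizeInBytes' of audio data:
    static float millisForBytes(int sizeInBytes, int alFormat, int sampleRate) {
        float frames = sizeInBytes / (float) bytesPerFrame(alFormat);
        return frames / sampleRate * 1000f;
    }

    public static void main(String[] args) {
        // 44100 Hz, 16-bit stereo: 176400 bytes is exactly one second.
        System.out.println(millisForBytes(176400, AL_FORMAT_STEREO16, 44100)); // 1000.0
    }
}
```

A buffer that is only partially filled with trimmed data would make this per-buffer figure too large, which is consistent with the cumulative error described above.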

But I'm just speculating 'cause I really don't know.

Quotethere's a fair bit of latency between the source starting to play and getting a reference to the source object via the library
QuoteThis is the result of two things.  Yes, one is multi-threading
Thanks for that ..... good to know. Not sure if that's a show-stopper for me yet.

All in all, I think the SoundSystem is a pretty good framework. I'm just tracking method calls and making guesses, trying to get this to work ..... and really have no clear idea what I'm doing.

--Tim.
#4
Thanks Paul. I have something sort of working but there are problems, which could largely be a result of me not really understanding SoundSystem or OpenAL.

Anyway, the major problems encountered so far are 1) the buffers appear to be continuously queued and unqueued even if a source is not playing, 2) my 'played millis' calculations are faster than actual playback by around 30% (are the buffers 100% full? Is this an artifact of the WAV codec trimming buffers? Maybe just bad math?), and finally 3) there's a fair bit of latency between the source starting to play and getting a reference to the source object via the library (around 300-500 millis .... an artifact of the multi-threading?).

I'll keep plugging away at it, but my feeling at this point is that my current approach may not work.

--Tim.
#5
Think I have a workable solution coded, but not tested. There are 2 parts to it.

First part .... modified method ChannelLWJGLOpenAL.queueBuffer

    public boolean queueBuffer( byte[] buffer )
    {
        // Stream buffers can only be queued for streaming sources:
        if( errorCheck( channelType != SoundSystemConfig.TYPE_STREAMING,
                        "Buffers may only be queued for streaming sources." ) )
            return false;

        ByteBuffer byteBuffer = ByteBuffer.wrap( buffer, 0, buffer.length );

        IntBuffer intBuffer = BufferUtils.createIntBuffer( 1 );

        AL10.alSourceUnqueueBuffers( ALSource.get( 0 ), intBuffer );
        if( checkALError() )
            return false;

// TJM -- based on concepts from: http://kcat.strangesoft.net/alffmpeg.c
// Get the size of the unqueued buffer (AL_SIZE is in bytes),
// convert it to a sample count using the bytes-per-sample of ALformat,
// then to a duration in millis using sampleRate,
// and add that to the running played-time total.
//    where ALformat is AL10.AL_FORMAT_MONO8 | AL10.AL_FORMAT_MONO16 |
//                      AL10.AL_FORMAT_STEREO8 | AL10.AL_FORMAT_STEREO16

        float bufSize = (float) AL10.alGetBufferi( intBuffer.get( 0 ),
                                                   AL10.AL_SIZE );

        switch( ALformat )
        {
            case AL10.AL_FORMAT_MONO8:     // 1 byte per sample
                millisPlayed += bufSize / (float) sampleRate * 1000f;
                break;
            case AL10.AL_FORMAT_MONO16:    // 2 bytes per sample
                millisPlayed += ( bufSize / 2f ) / (float) sampleRate * 1000f;
                break;
            case AL10.AL_FORMAT_STEREO8:   // 2 bytes per sample frame
                millisPlayed += ( bufSize / 2f ) / (float) sampleRate * 1000f;
                break;
            case AL10.AL_FORMAT_STEREO16:  // 4 bytes per sample frame
                millisPlayed += ( bufSize / 4f ) / (float) sampleRate * 1000f;
                break;
            default:
                break;
        }

        AL10.alBufferData( intBuffer.get(0), ALformat, byteBuffer, sampleRate );
        if( checkALError() )
            return false;

        AL10.alSourceQueueBuffers( ALSource.get( 0 ), intBuffer );
        if( checkALError() )
            return false;

        return true;
    }



then the Second part .... a new method in Source

    public int getMillisPlayed()
    {
        return -1;
    }


and an overridden version of the method in SourceLWJGLOpenAL

    public int getMillisPlayed()
    {
        if( !toStream )
            return -1;

        // Number of samples played so far in the current buffer:
        int offset = AL11.alGetSourcei( channelOpenAL.ALSource.get( 0 ),
                                        AL11.AL_SAMPLE_OFFSET );

        // Convert samples to millis (multiply before dividing, to avoid
        // int division truncating to whole seconds):
        offset = (int) ( offset * 1000L / channelOpenAL.sampleRate );

        // Add the Channel's millisPlayed total:
        offset += channelOpenAL.millisPlayed;

        // Return millis played:
        return offset;
    }
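[Editor's note: one detail worth isolating from the snippet above: dividing a sample offset by the sample rate in int arithmetic truncates to whole seconds, so the multiply-by-1000 has to happen first. A small sketch; the class and method names are illustrative.]

```java
public class OffsetMath {
    // Convert an AL_SAMPLE_OFFSET value to milliseconds.
    // Multiply before dividing, in long, so the result isn't
    // truncated to whole seconds:
    static int sampleOffsetToMillis(int sampleOffset, int sampleRate) {
        return (int) (sampleOffset * 1000L / sampleRate);
    }

    public static void main(String[] args) {
        // 22050 samples at 44100 Hz is half a second:
        System.out.println(sampleOffsetToMillis(22050, 44100)); // prints 500
        // Plain int division would lose everything below a second:
        System.out.println(22050 / 44100); // prints 0
    }
}
```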



Think that'll do the trick ..... hopefully will try it out tomorrow.

--Tim.
#6
Paul,

I've been poking around the SoundSystem code this morning, doing a bit of OpenAL research, and it looks like calculating playback time for a streaming source should be fairly straightforward.

My initial thoughts are to add a method to the ChannelLWJGLOpenAL class, maybe something like 'playTime', that would rely on AL11.alGetSourcei(source, AL11.AL_SAMPLE_OFFSET) to return actual playback time, calculated from the number of samples played and the sample rate.

Assuming the application can get a reference to the channel object, the application could simply poll the channel from the game loop.
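[Editor's note: the polling idea could look roughly like this. The interface and method names below are hypothetical stand-ins for whatever the channel would expose, not actual SoundSystem API.]

```java
public class PollSketch {
    // Hypothetical stand-in for an object exposing the proposed play-time query:
    interface PlayTimeSource {
        int getMillisPlayed();
    }

    // Each frame, compare audio time to animation time and report the drift
    // the animation (or the source's pitch) would need to absorb:
    static int driftMillis(PlayTimeSource source, int animationMillis) {
        return source.getMillisPlayed() - animationMillis;
    }

    public static void main(String[] args) {
        PlayTimeSource fake = () -> 1530;   // pretend the audio is at 1.53 s
        int animationMillis = 1500;         // the animation thinks it is at 1.5 s
        System.out.println(driftMillis(fake, animationMillis)); // prints 30
    }
}
```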

Sound reasonable?

--Tim.
#7
QuoteI'll look into expanding on this idea to make it so events are sent out each time a buffer is added to the stream.  This information could be used to determine where you are in the stream periodically, and then use that information to calculate slight alterations to either the animation speed or the streaming source's pitch to realign the two.

Paul, that would be great!!  Could you post a URL for a 'preview' if the release isn't ready? I can poke around, look at the messaging you've implemented, and possibly add something for messages on buffer additions.

--Tim.
#8
Support / how to synch wav audio and 3D animation?
April 10, 2010, 02:20:54 PM
Hi all.

This is really a 'any ideas?' sort of post .....

What would be the best approach for synching an audio track to an animation? A use case here would be lip-synching an avatar to dialog.

The audio is WAV format, being played with Paul's SoundSystemJPCT; the game loop and timing are derived from the Slick2D libraries and use LWJGL's Sys class to generate deltas in milliseconds. It all works OK, but various latencies and system loads mean the audio and animation can get out of synch very easily.

If it helps, I'm using Acid Pro 7 to sequence and create the audio.

My thoughts at the moment are to create time markers, then

  • at runtime calculate the number of bytes to be played until the next marker,
  • count bytes
  • call a listener method

or to do the audio as MIDI, which would entail

  • implementing a custom MIDI sequencer,
  • intercepting 'Wire' protocol data,
  • calling listeners for 'note on' and 'note off' messages
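[Editor's note: for the marker approach in the first list, the byte count to a time marker falls straight out of the PCM WAV parameters. A sketch of that calculation; the class and method names are illustrative, not tested against SoundSystem.]

```java
public class MarkerBytes {
    // Number of audio-data bytes from the start of the clip to a time marker.
    // For PCM WAV data, each sample frame is channels * (bitsPerSample / 8) bytes:
    static long bytesUntilMarker(long markerMillis, int sampleRate,
                                 int channels, int bitsPerSample) {
        long frames = markerMillis * sampleRate / 1000;
        return frames * channels * (bitsPerSample / 8);
    }

    public static void main(String[] args) {
        // A marker at 2.5 s in 44100 Hz, 16-bit stereo audio:
        System.out.println(bytesUntilMarker(2500, 44100, 2, 16)); // prints 441000
    }
}
```

At runtime the player would count bytes as they are handed to the audio layer and fire the listener once this threshold is crossed.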

Does anyone have any other ideas?


On a slightly different topic, anyone interested in using a Wiimote for game input? If there's interest I'll post my sample code and details of setting it all up.

--Tim.
#9
Thanks for the help.  I was rotating the object, not the mesh. Clearly that has been causing the unexpected object behaviors.

Sorry about the delay in replying .... it's Spring break here and my 2 kids (ages 6 and 3) are home all week.
Not getting much done at all right now.

--Tim.
#10
This has been driving me crazy all weekend and all day!!

If an object is rotated 90 degrees around its X axis (so the object's Y axis is now aligned with the world's Z axis), when it's translated along the Z axis it actually moves along the world's Y axis ..... is this correct?

And as a follow-on: after rotating an object, how can it be re-aligned with the world's axes so that the rotation is not undone?

TIA,
Tim.
#11
Support / Re: units in world space?
March 26, 2010, 12:01:40 AM
Thanks. It sort of makes sense .... I'm just used to 2D coding where 1px = 1 unit.

Time to go play :)

--Tim.
#12
Support / units in world space?
March 25, 2010, 11:03:50 PM
A few more questions for the day ....


  • What are the units in world space?
  • What are they related to?
  • Is there a notion of scale or size .... how many units would be considered 'close,' 'far,' 'small,' and so on?

I downloaded and started using jPCT only yesterday morning, and this is my first foray into 3D, so please forgive all the beginner's questions in a row.

TIA.

--Tim.
#13
Support / Re: transparency - wrong color
March 25, 2010, 10:35:02 PM
Thanks Paul and Egon .... fixed it ;D

Using a PNG and removing the TRANSPARENCY_MODE_ADD did the trick. The result is a bit ugly, but I'm prototyping core interactions and it will do nicely.

BTW, the code (above) is from one of Paul's posts. Thanks ... it would have taken me ages to figure out how to put a simple square into a world.

Here's the obligatory screen shot. Not so great, but shows it worked:



--Tim.
#14
Support / transparency - wrong color
March 25, 2010, 07:38:50 PM
Unfortunately for me, my hardware doesn't support shadows, so I'm trying to implement pseudo-shadows. It's going OK, but the transparency has got me a bit stumped. My texture is an all-black JPG (made in Photoshop) with the opacity set to 50%.

When the pseudo shadow is rendered, it's white, not black. This is probably something to do with Lights, but I can't figure it out. Any help appreciated!!

A screen grab and code are below. BTW, the program itself is basically a hacked-up version of HelloWorld from the Wiki, with the World ambient light set to 255,255,255. The code below is almost 100% taken from a forum post (sorry, can't remember who posted it).

--Tim.





    private Object3D makePseudoShadow( float transZ )
    {
        Object3D obj = new Object3D( 2 );
        float offset = 50;  // width / height of the billboard (assumes it is square)
        obj.addTriangle( new SimpleVector( -offset, -offset, 0 ), 0, 0,
                         new SimpleVector( -offset, offset, 0 ), 0, 1,
                         new SimpleVector( offset, offset, 0 ), 1, 1,
                         TextureManager.getInstance().getTextureID( "shadow" ) );
        obj.addTriangle( new SimpleVector( offset, offset, 0 ), 1, 1,
                         new SimpleVector( offset, -offset, 0 ), 1, 0,
                         new SimpleVector( -offset, -offset, 0 ), 0, 0,
                         TextureManager.getInstance().getTextureID( "shadow" ) );

        obj.setTransparencyMode( Object3D.TRANSPARENCY_MODE_ADD );
        obj.setTransparency( 100 );
        obj.setLighting( Object3D.LIGHTING_NO_LIGHTS );
        obj.setAdditionalColor( java.awt.Color.BLACK );

        obj.build();

        obj.rotateX( -80 );
        obj.translate( 0, 0, transZ );

        return obj;
    }

#15
Support / Re: object not translating as expected?
March 25, 2010, 05:40:26 PM
Thanks Egon.

Quoteif you do a translate(0,0,0.1f); or similar each frame, you should get the desired movement without modifying the translation matrix yourself

That's exactly what I did! After slowing down the loop it became apparent that modifying the translation matrix was the wrong approach .... studying your demos got me sorted out.

--Tim.