Messages - MichaelJPCT

#166
I found that it's not too much trouble.
#167
http://www.jpct.net/forum2/index.php/topic,3965.msg27866.html#msg27866

I searched the forum and found the thread above; it may be the same question as mine.
I want to use one shader instance for multiple objects, but their uniform parameters are different.
Is the solution in that thread the only one?
Is it more troublesome than using one shader per object? Is it worth the trouble?
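
Here is a rough, untested sketch of what I have in mind - one shared GLSLShader subclass that pushes a per-object value into the shader just before each object is drawn. It assumes the engine calls setCurrentObject3D on the shader for every object that uses it (which is how I understand the thread above); SharedShader, setTint and the "tint" uniform are only example names of mine, not jpct API.

import com.threed.jpct.GLSLShader;
import com.threed.jpct.Object3D;
import java.util.HashMap;
import java.util.Map;

// one shader instance shared by many objects; per-object uniform values are
// stored in a map and pushed into the shader right before each object renders.
public class SharedShader extends GLSLShader {
    private final Map<Object3D, Float> tints = new HashMap<Object3D, Float>();

    public SharedShader(String vertexSrc, String fragmentSrc) {
        super(vertexSrc, fragmentSrc);
    }

    public void setTint(Object3D obj, float value) {
        tints.put(obj, value);
    }

    @Override
    public void setCurrentObject3D(Object3D obj) {
        super.setCurrentObject3D(obj);
        Float tint = tints.get(obj);
        if (tint != null) {
            setUniform("tint", tint.floatValue()); // per-object value, same shader instance
        }
    }
}

// usage: objA.setShader(shader); objB.setShader(shader); shader.setTint(objA, 0.3f); shader.setTint(objB, 0.7f);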
#168
Support / Is the result of setBoundingBox permanent?
June 05, 2017, 07:31:54 AM
I want to enlarge a bounding box, because the shader for the Object3D enlarges the object.
After I call setBoundingBox, is there anything in the engine that may revert the bounding box to the smaller one?
I ask this question because another engine I used would recalculate the bounding box automatically.
#169
Support / Re: How to do "Fog" in particular axis?
June 05, 2017, 06:55:14 AM
I guess this is called "layered fog". It requires a shader and quite a lot of calculation.
But sometimes you can do fake fog - use a transparent plane with a foggy texture.
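
For example, something like this (untested sketch; it assumes a foggy texture named "fog" has already been added to the TextureManager, world is your existing World, and all the numbers are placeholders):

Object3D fogPlane = Primitives.getPlane(4, 200f);    // 4x4 quads, 200 units per quad
fogPlane.setTexture("fog");                          // a texture with a foggy, semi-transparent look
fogPlane.rotateX((float) Math.PI / 2f);              // lay the plane flat
fogPlane.translate(0f, -10f, 0f);                    // move it to the height of the fog layer
fogPlane.setTransparency(3);                         // render it transparent; a higher value is more opaque
fogPlane.setLighting(Object3D.LIGHTING_NO_LIGHTS);   // keep the fog color independent of scene lights
fogPlane.build();
world.addObject(fogPlane);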
#170
Never mind this. I will use multiple Object3Ds sharing the same Mesh for my purpose.
#171
The docs say an Object3D should be added to one World only.
I wonder what happens if an Object3D is added to two Worlds and both are rendered - this is for rendering purposes only.
One of the Worlds is used for shadows.
Because, by my design, the Object3D should be isolated in the shadow World, I have to use another World, or else call setVisibility many times.
#172
I think using a shader is a good way, thanks.
#173
Support / Add/remove texture stages at runtime?
May 19, 2017, 06:20:06 AM
I have a terrain with 3 texture stages, each with different UV scaling; it was created with the TextureInfo/PolygonManager technique.
I found that texture stages affect render speed - the more stages, the slower - on a PowerVR 544 GPU.
I wonder if I could remove/add a texture stage (a noise texture in my case) depending on camera distance.
Is there a way to do so?
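
To make the intent clearer, this is roughly what I would like to do (untested sketch; fullTexInfo and simpleTexInfo are hypothetical arrays of per-polygon TextureInfos built at terrain-creation time - one set with the noise stage, one without - and the only jpct call used is the same PolygonManager.setPolygonTexture my terrain code already uses):

void setDetailStage(Object3D terrain, TextureInfo[] fullTexInfo,
                    TextureInfo[] simpleTexInfo, boolean withNoise) {
    PolygonManager pm = terrain.getPolygonManager();
    TextureInfo[] src = withNoise ? fullTexInfo : simpleTexInfo;   // pick the 3-stage or 2-stage set
    for (int p = 0; p < src.length; p++) {
        pm.setPolygonTexture(p, src[p]);   // one TextureInfo per polygon, as at build time
    }
}

Whether re-assigning the TextureInfos like this after build() actually takes effect is exactly what I am unsure about.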
#174
Support / Re: setProjector for JPCT-AE ?
May 18, 2017, 09:37:59 AM
OK, thanks. I guess I can do it.
#175
Support / setProjector for JPCT-AE ?
May 18, 2017, 07:26:30 AM
I see the Projector class in the jpct-ae javadoc, but there is no setProjector method in the Texture class like the one in desktop jPCT. Will it be added?
#176
I think the Light in jPCT is an omni light (a point light radiating in all directions).
If you want to simulate sunlight shining on the earth, place a light 100000 units away and make its intensity not fade with distance.
If you want multiple lights with different directions in one scene, I don't know how.
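
For example (untested sketch; world is your World, the numbers are placeholders, and I am assuming a negative attenuation value is what disables the distance fading):

Light sun = new Light(world);
SimpleVector pos = new SimpleVector(0.5f, -1f, 0.3f);   // rough direction from the scene towards the "sun"
pos.scalarMul(100000f);                                 // place the light 100000 units away
sun.setPosition(pos);
sun.setIntensity(255f, 255f, 240f);                     // slightly warm white
sun.setAttenuation(-1f);                                // negative value: intensity does not fade with distance
sun.setDiscardDistance(-1f);                            // never drop the light just because it is far away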
#177
Usage:
1) Python and pyglet must be installed; the tested versions are Python 2.7 and pyglet 1.2.4.
2) In the Java main module, call these functions at the right time: Snd.init() -> Snd.upd() / Snd.pause() -> Snd.shutdown() (see the sketch after this list).
3) If an object is to have a SndGrp, call that SndGrp's upd() after the object has been updated (once per graphics frame).
4) In the example, bettyGrp is not attached to any object, so Snd.upd() should be called, say, every graphics frame.
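
A rough illustration of item 2 (gameRunning, updateGame() and renderFrame() are placeholders for whatever your own game loop looks like):

Snd.init();        // once at startup: launches the python server and connects the socket
while (gameRunning) {
    updateGame();  // move objects etc.
    renderFrame();
    Snd.upd();     // once per graphics frame: updates bettyGrp (and any per-object SndGrp)
}
Snd.pause();       // call whenever the game gets paused: silences all playing channels
Snd.shutdown();    // once at exit: closes the socket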
#178
Here is all the code.
Note that in my own code I didn't write comments; these comments were added in about two hours as a quick explanation.
The posted code is a bit different from my own, as mine has dependencies that other people don't have, but it should work with little modification.
And I am not a professional programmer, so some of the code may be clumsy.

JAVA SIDE

// NOTE: this is a prototype and lacks error checking etc.

// sck=socket buf=buffer snd=sound grp=group
// upd=update serv=server fn=function

// short2string is a function that converts a short value to 2 characters,
// store in 2nd arg, 3rd arg is position offset

import java.io.*; import java.net.Socket;

public class Snd {
static byte fps; // used for timing, in this case framerate is constant
static float[] updResult=new float[2]; // pass data from game to this module
static byte[] outBuf=new byte[8]; // pass data from this module to socket
static Socket sck;
static boolean servOn=true, muteSnd;
static boolean gamePaused; // set by the game loop; one of the external dependencies mentioned above
static long currentFrameID; // graphics-frame counter, advanced by the game loop every frame
static int servPort, maxVol=64000, maxPitch=64000;
static OutputStream outStream;
static String servFolder, sndFolder;
static SndType[] sndType;
static SndChannel[] sndChannel=new SndChannel[512];
static SndGrp bettyGrp; // example sound group

static short[] sndUpdFn={0,1,2};
static short[] sndTypeID={0,1,2};

static short[] sndGrpUpdFn={0,1,2};
static short[][] sndGrpItem={{0},{1},{2}};

static byte exampleFn1(SndChannel a) { updResult[0]=0f; return 1; }
static byte exampleFn2(SndChannel a) { updResult[0]=0.9f; return 1; }
static byte exampleFn3(SndChannel a) { updResult[0]=0.6f; updResult[1]=0.8f; return 2; }

interface GrpEnable { boolean enable(SndGrp a); }
interface UpdChannel { byte upd(SndChannel a); }

static GrpEnable[] grpEnable=new GrpEnable[] {
new GrpEnable() { public boolean enable(SndGrp a) { return true; } },
new GrpEnable() { public boolean enable(SndGrp a) { return true; } },
new GrpEnable() { public boolean enable(SndGrp a) { return true; } },
};

static UpdChannel[] updChannel=new UpdChannel[] {
new UpdChannel() { public byte upd(SndChannel a) { return exampleFn1(a); } },
new UpdChannel() { public byte upd(SndChannel a) { return exampleFn2(a); } },
new UpdChannel() { public byte upd(SndChannel a) { return exampleFn3(a); } },
};

static void shutdown() { if (!servOn) return;
try { outStream.close(); sck.close(); }
catch (Exception e) {} }

static void init() {
servFolder="sndServFolder/";
sndFolder="sounds/";
fps=30;
try { Runtime.getRuntime().exec(servFolder+"pysound.exe",null,new File(servFolder)); }
// in case python server app is not compiled as exe,
// run "python.exe folder/filename.py"

catch (Exception e) { servOn=false; return; }

readCFG();
readSndType();
if (!servOn) return;

for (byte m=50;m>0;m--) {
try {
Thread.sleep(300);
sck=new Socket("localhost",servPort); // only local machine, as example
outStream=sck.getOutputStream();
m=0; }
catch (Exception e) { if (m>1) continue; servOn=false; return; } }

outBuf[0]=-128; // one byte 0x80
bettyGrp=new SndGrp(0,null);
}

// setting.txt specifies socket port, same as python side.
// must be 5 digits in this simple function.
static void readCFG() {
short e=1024;
char[] u=new char[e];
int i;
String s,w="";
try { FileReader r=new FileReader(servFolder+"setting.txt"); r.read(u,0,e); r.close(); }
catch (Exception x) { servOn=false; return; }

s=new String(u); i=s.indexOf("PORT=");
if (i<0) { servOn=false; return; }

for (byte q=0;q<5;q++) { char h=u[5+i+q]; if ((h>47)&&(h<58)) w+=h; }
try { servPort=Integer.parseInt(w); }
catch (Exception x) { servOn=false; return; }
}

// an example: provide a binary file that stores sound type attributes
// the binary file is also used by the python side
static void readSndType() {
int h; byte[] u,q=new byte[8];
try { FileInputStream d=new FileInputStream(sndFolder+"snd.dat");
h=d.available(); u=new byte[h]; d.read(u); d.close(); }
catch (Exception x) { servOn=false; return; }
sndType=new SndType[h/8];
for (h=0;h<sndType.length;h++)
{ System.arraycopy(u,h*8,q,0,8); sndType[h]=new SndType(h,q); }
}

static void upd() { if (gamePaused||muteSnd||(!servOn)) return; bettyGrp.upd(); }

static void pause() { if (!servOn) return;
send(20);
SndChannel s;
for (short z=0;z<sndChannel.length;z++)
{ s=sndChannel[z]; if ((s!=null)&&s.playing) { s.playing=false; s.endFrm=0; } }
}

static void send(int a) { outBuf[1]=(byte)a;
try { outStream.write(outBuf); } catch (Exception i) {} }

static void delChannel(int a) { if (sndChannel[a]==null) return;
sndChannel[a].play(0f,0f); sndChannel[a]=null; }

static int findEmptyChannel() {
for (short n=0;n<sndChannel.length;n++)
{ if (sndChannel[n]==null) return n; } return -1; }

static class SndType { int typeID; float duration; boolean loop, single;
public SndType(int a,byte[] b) { typeID=a;
duration=b[3]*0.2f; single=(b[5]>0); loop=(b[3]==0); } }

static class SndChannel { boolean playing; SndType sndType;
byte[] chID=new byte[2]; short updFnID,channelID,servGrp; Object hostObj;
float vol=1f,pitch=1f; long endFrm;
// endFrm is frameID for timing in case framerate is constant
// servGrp means sound group in server, not used

// constructor: this sound channel is stored in java and sent to the python side
public SndChannel(SndType a,int b,int c,short d,Object e)
{ sndType=a; channelID=(short)b; servGrp=(short)c; updFnID=d; hostObj=e;
short2string_unsigned(b,chID,0); outBuf[4]=chID[0]; outBuf[5]=chID[1];
short2string_unsigned(a.typeID,outBuf,2);
short2string(c,outBuf,6); send(80); }

void play(float a,float b) {
if (a>0) { boolean s=false;
if (Math.abs(a-vol)>0.003f) { s=true; vol=a; }
if ((b>0)&&(Math.abs(b-pitch)>0.003f)) { s=true; pitch=b; }
if ((!playing)||(!sndType.loop)) { s=true; playing=true; }
if (!s) return;
short2string_unsigned(Math.min(maxVol,(int)(vol*16000)),outBuf,4);
short2string_unsigned(Math.min(maxPitch,(int)(pitch*16000)),outBuf,6); }

else if (!playing) return;
else { outBuf[4]=0; outBuf[5]=0; playing=false; endFrm=0; }

outBuf[2]=chID[0]; outBuf[3]=chID[1]; send(81); }

void upd() { if (updFnID<0) return;
byte r=updChannel[updFnID].upd(this);
if (!sndType.loop)
{ if (updResult[0]!=0) { endFrm=currentFrameID+(int)(sndType.duration*fps);
if (r==1) play(updResult[0],0);
else play(updResult[0],updResult[1]); } }
else if (updResult[0]!=0)
{ if (r==1) play(updResult[0],0);
else play(updResult[0],updResult[1]); }
else if (playing) play(0f,0f); }
}

static class SndGrp { boolean on; SndChannel[] item;
short grpTypeID,updFnID; Object hostObj;

public SndGrp(int a,Object b) { grpTypeID=(short)a; hostObj=b; short x;
updFnID=sndGrpUpdFn[a];
short[] k=sndGrpItem[a];
item=new SndChannel[k.length];
for (byte m=0;m<k.length;m++)
{ x=k[m]; int i=findEmptyChannel();
item[m]=new SndChannel(sndType[sndTypeID[x]],i,-1,sndUpdFn[x],b);
sndChannel[i]=item[m]; }
}

void upd() { boolean v=on;
if (updFnID>=0) on=grpEnable[updFnID].enable(this);
if (on) { for (byte j=0;j<item.length;j++) item[j].upd(); }
else if (v) { for (byte j=0;j<item.length;j++) item[j].play(0f,0f); }
}
}
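
// short2string / short2string_unsigned were only described in the comment at the top,
// so here is a minimal version of them, guessed from how the python side decodes the
// bytes (string2short_unsigned reads the low byte first, then the high byte).
static void short2string_unsigned(int a,byte[] buf,int offset) {
buf[offset]=(byte)(a&0xFF);        // low byte
buf[offset+1]=(byte)((a>>8)&0xFF); // high byte
}

static void short2string(int a,byte[] buf,int offset) {
if (a<0) a+=65536;                 // two's complement wrap, matches string2short on the python side
short2string_unsigned(a,buf,offset); }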


}



PYTHON SIDE

import os,sys,time,socket; import pyglet.media
r=os.getcwd(); folder=[r+'\\',''];
folder[1]=folder[0][:-14]+'data\\snd\\'
# folder[0] is the sound program folder, folder[1] is the sound data folder; modify to suit your case

sndFileType=('.wav','.ogg','.mp3')
sndEOS=('pause','loop')
sndType=[]
sndChannel=[0 for r in range(512)]
sndGrp=[0 for r in range(256)]
netFn=[0 for r in range(256)]

def f020(a):
    for h in xrange(len(sndChannel)):
        o=sndChannel[h]
        if o and o.playing:
            o.player.pause(); o.playing=0
            if o.sndType.duration: o.player.seek(0)

def f032(a): setGrp(string2short_unsigned(a[2:4]))

def f033(a):
    k=sndGrp[string2short_unsigned(a[2:4])]
    if k: k.play()

def f034(a):
    k=sndGrp[string2short_unsigned(a[2:4])]
    if k: k.pause()

def f080(a):
    setChannel(string2short_unsigned(a[2:4]),string2short_unsigned(a[4:6]),string2short(a[6:8]))

def f081(a):
    j=sndChannel[string2short_unsigned(a[2:4])]
    if j: j.play(string2short_unsigned(a[4:6])*0.0000625,string2short_unsigned(a[6:8])*0.0000625)
# 0.0000625=1/16000, max 64000

def string2short_unsigned(a): return ord(a[0])+ord(a[1])*256

def string2short(a): b=ord(a[0])+ord(a[1])*256; return b-65536*bool(b>32767)

def setGrp(a):
    g=sndGrp[a]; sndGrp[a]=SNDGRP(a)
    if g:
        for m in xrange(g.qty): sndChannel[g.item[m].channelID]=0

# arg: typeID, channelID, grpID
# sound group in server is not used yet
def setChannel(a,b,c):
    n=sndChannel[b]; e=sndType[a]; sndChannel[b]=0
    if n and n.grpID+1: sndGrp[n.grpID].subItem(n)
    if e.src==0: e.loadSrc()
    if e.src<0: return
    sndChannel[b]=SNDCHANNEL(e,b)
    if c+1 and sndGrp[c]: sndGrp[c].addItem(sndChannel[b])

# in this example, all sound files are named like "00000000.wav", "00000001.wav" etc.
# sound type definition binary file format:
# each 8 bytes define one sound type, where:
# byte 0: file extension type - wav / ogg / mp3
# byte 1/2: volume multiplier / pitch multiplier
# byte 3: sound duration (in 0.2s units); should cover the whole sound file; 0 means looping
# byte 4: preload or not
# byte 5: only allow a single instance
class SNDTYPE(object):
    def __init__(s,a,b):
        s.typeID=a; s.fileName=('0'*8+str(a))[-8:]+sndFileType[ord(b[0])]; s.src=0
        s.volMul=ord(b[1])*0.01; s.pitchMul=ord(b[2])*0.01; s.duration=ord(b[3])*0.2
        s.preload=bool(ord(b[4])); s.single=bool(ord(b[5])); s.loop=bool(s.duration==0)
        if s.preload: s.loadSrc()
    def loadSrc(s):
        if s.src: return
        try: s.src=pyglet.media.load(folder[1]+s.fileName,streaming=False)
        except (IOError,pyglet.media.MediaException): s.src=-1; return  # MediaException lives in pyglet.media

class SNDCHANNEL(object):
    def __init__(s,a,b):
        s.sndType=a; p=s.player=pyglet.media.Player(); p.queue(a.src); p.eos_action=sndEOS[a.loop]
        s.pitchMul=p.pitch=a.pitchMul; s.volMul=p.volume=a.volMul; s.playing=s.tmpStop=0; s.vol=s.pitch=1
        s.grpID=-1; s.channelID=b
    def play(s,a,b):
        if a>0:
            if a-s.vol: s.player.volume=a*s.volMul; s.vol=a
            if b and b-s.pitch: s.player.pitch=b*s.pitchMul; s.pitch=b
            if s.playing-1: s.player.play(); s.playing=1
            elif s.sndType.duration: s.player.seek(0); s.player.play()
        elif s.playing:
            s.player.pause(); s.playing=0
            if s.sndType.duration: s.player.seek(0)

class SNDGRP(object):
    def __init__(s,a): s.qty=0; s.item=[]; s.grpID=a
    def addItem(s,a): s.item.append(a); s.qty+=1; a.grpID=s.grpID
    def subItem(s,a): s.item.remove(a); s.qty-=1; a.grpID=-1
    def play(s):
        for h in xrange(s.qty):
            i=s.item[h]
            if i.tmpStop: i.player.play(); i.tmpStop=0; i.playing=1
    def pause(s):
        for h in xrange(s.qty):
            i=s.item[h]
            if i.playing: i.player.pause(); i.tmpStop=i.sndType.loop; i.playing=0

def initSnd():
    try: w=file(folder[1]+'snd.dat',mode='rb'); r=w.read(); w.close()
    except IOError: sys.exit()
    for y in xrange(len(r)/8): sndType.append(SNDTYPE(y,r[y*8:y*8+8]))
    for y in xrange(256):
        try: netFn[y]=eval('f'+('000'+str(y))[-3:])
        except NameError: continue

class CFG(object):
    def __init__(s):
        try:
            i=file(folder[0]+'setting.txt'); g=i.read(); i.close()
            p,r=g.find('PORT='),g.find('ACCEPT_REMOTE=')
            if p<0 or r<0: sys.exit()
            s.servPort,s.acceptRemote=int(g[p+5:p+10]),int(g[r+14:r+15])
        except (IOError,ValueError): sys.exit()

class SERVERSOCKET(object):
    def __init__(s,a):
        s.port,s.remote=a.servPort,a.acceptRemote; s.running=1
        k=s.sck=socket.socket(socket.AF_INET, socket.SOCK_STREAM); k.bind(('localhost'*(1-s.remote),s.port))
        k.listen(1); k.setblocking(1); s.conn,s.connAddr=k.accept()
    def shutdown(s): s.conn.close(); s.sck.close(); s.running=0
    def run(s):
        while s.running:
            try:
                n=s.conn.recv(8)
                if not n: s.shutdown(); break
                if n[0]=='\x80':
                    u=ord(n[1])
                    if netFn[u]: netFn[u](n)
            except socket.error: s.shutdown()

cfg=CFG(); initSnd(); servSck=SERVERSOCKET(cfg); servSck.run()

#179
Feedback / Would anyone want a sound engine for JPCT?
December 06, 2016, 11:58:16 PM
Since JPCT doesn't provide a sound engine, and OpenAL looks too complicated for me, I searched for a sound engine that can play sound with volume and pitch control like this: setVolume(0.6), setPitch(0.8).
I ended up using a socket and a sound server built with a Python module.
A socket was chosen for IPC because it's easy to program, and either another machine or the same machine can be used as the server.
The Python module pyglet is used as the server app because it's simple to use and it fulfils my need to control volume and pitch directly.
I have used the engine to play several sounds simultaneously and it looked reliable on the local machine.
I can post all the source code here - would anyone be interested in this?
#180
Feedback / Developing a JPCT-AE program on an Android device
September 16, 2016, 02:18:34 PM

Most people develop Android apps on a PC, then test them in the Android emulator, then do further testing on a real Android device. But sometimes you don't need a PC to develop Android apps - you can write code, compile, and test run on an Android device.
I'm introducing the basic usage of AIDE - an Android app which can compile another Android app - together with JPCT-AE.
For this to work well, you need these:
1) the latest (2016) version of AIDE; the free version will do
2) an Android device that is fast enough, such as one with a quad-core A53/A72 CPU
3) a real keyboard
4) a not-so-small screen, say 10 inches or more
5) jpct-ae.jar

When AIDE is launched and no project is open, it lets you choose whether to follow lessons or to code; just choose "for experts".
Then, if no project has been created yet, choose to create a new Gradle project.
By default, AIDE creates the project folder inside /mnt/sdcard/AppProjects/.
After the project is created, create a folder named "libs" in /YourProjectFolder/app/, then copy jpct-ae.jar into the "libs" folder.
Then you are ready to write code.

You can also copy a Gradle or Eclipse project folder from your PC to the Android device and continue with the old project.

Here are some basic operations you need to know when working with AIDE:
1) To handle files, choose the VIEW menu, then FILES, then tap a file to open it. At the top of the screen there are file tabs; tapping one brings up a menu with a CLOSE FILE option.
2) To build, just tap the triangle-shaped button in the top right corner of the screen.
3) In the VIEW menu there is LOGCAT, which is important for debugging.
4) In the PROJECTS menu there is REFRESH BUILD; you need it sometimes - when your app behaves as if it hasn't been updated.

This way, you can spend most of your development time on the Android device, writing code. You only need a traditional PC to process images and sound or to build 3D models.