All the code provided here is free to use, and I'm not responsible for what you use it for.
I received a question that I found interesting enough to share with everybody.
There is very little information about this subject on Android and about how to set up a correctly working layout.
So in this topic, I'll answer a simple question: how to use the Android camera and JPCT-AE as a renderer that overlays the camera preview
(the augmented reality concept).
== THE SOURCES PROVIDED IN THE CODE QUOTES ARE NOT COMPLETE! ==
You have to code your own engine around them to get everything fully functional.
First, we need to set up an XML layout.
Our minimum requirement is a GLSurfaceView, which is where we will draw the 3D scene (the JPCT engine),
and a SurfaceView to draw the camera preview.
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent">

    <android.opengl.GLSurfaceView
        android:id="@+id/glsurfaceview"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" />

    <SurfaceView
        android:id="@+id/surface_camera"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:keepScreenOn="true" />

</FrameLayout>
This is how to initialize the window and the glSurfaceView.
// This speaks for itself; please refer to the Android developer documentation.
getWindow().setFormat(PixelFormat.TRANSLUCENT);
// Fullscreen is not necessary... it's up to you.
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.THE_XML_LAYOUT_CREATED_BEFORE);
// attach our glSurfaceView to the one in the XML file.
glSurfaceView = (GLSurfaceView) findViewById(R.id.glsurfaceview);
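Note that this snippet is meant to live in your Activity's onCreate(), and the GLSurfaceView needs the usual lifecycle forwarding. Here is a minimal skeleton (the class name is mine):

import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;

public class AROverlayActivity extends Activity {

    private GLSurfaceView glSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // ... the window format, fullscreen flags, setContentView() and
        // findViewById() calls from the snippet above go here ...
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Standard GLSurfaceView call; without it the GL thread keeps running.
        glSurfaceView.onPause();
        // The CameraView shown below also exposes onStop()/onResume(),
        // if you want to release and re-open the camera here as well.
    }

    @Override
    protected void onResume() {
        super.onResume();
        glSurfaceView.onResume();
    }
}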
Now let's create the camera and the engine.
This is an example from my own code, so it may not fit your needs exactly,
but you can take inspiration from it.
The following code is pretty easy to understand:
I create a new camera view, give a renderer to my glSurfaceView,
and of course set the translucent (8888) pixel format and a depth buffer on it.
(Without that, your glSurfaceView will not support an alpha channel and you will not see the camera layer.)
So basically:
1) Create the camera view.
2) Set up the glSurfaceView.
3) Set a Renderer on the glSurfaceView.
4) Set the correct pixel format on the glSurfaceView holder.
try {
    cameraView = new CameraView(this.getApplicationContext(),
            (SurfaceView) findViewById(R.id.surface_camera), imageCaptureCallback);
} catch (Exception e) {
    e.printStackTrace();
}
// Translucent window (8888) pixel format and depth buffer
glSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
// GLEngine is a class I designed to interact with JPCT; it holds all the basic
// functions needed: create a world, render it, the onDrawFrame event, etc.
// (a rough sketch of such a renderer follows after this snippet).
glEngine = new GLEngine(getResources());
glSurfaceView.setRenderer(glEngine);
game = new Game(glEngine, (ImageView) findViewById(R.id.animation_screen),
        getResources(), this.getBaseContext());
// Use a surface format with an alpha channel:
glSurfaceView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
// Start the game
game.start();
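GLEngine and Game themselves are not part of this post. To give you an idea of what such a renderer can look like, here is a minimal sketch of a GLSurfaceView.Renderer built on JPCT-AE; everything in it (the Resources-only constructor, the fully transparent clear color and the placeholder cube) is my own assumption, not the original engine. The key point for the overlay is clearing the FrameBuffer with an alpha of 0 so the camera preview stays visible behind the 3D scene.

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.content.res.Resources;
import android.opengl.GLSurfaceView;

import com.threed.jpct.Camera;
import com.threed.jpct.FrameBuffer;
import com.threed.jpct.Object3D;
import com.threed.jpct.Primitives;
import com.threed.jpct.RGBColor;
import com.threed.jpct.World;

public class GLEngine implements GLSurfaceView.Renderer {

    private final Resources resources;
    private World world;
    private FrameBuffer frameBuffer;
    // Alpha 0 keeps the GL layer transparent so the camera layer shows through.
    private final RGBColor background = new RGBColor(0, 0, 0, 0);

    public GLEngine(Resources resources) {
        // Kept around for loading textures/models later on.
        this.resources = resources;
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        world = new World();
        world.setAmbientLight(150, 150, 150);

        // Placeholder content: a simple cube in front of the camera.
        Object3D cube = Primitives.getCube(2);
        cube.build();
        world.addObject(cube);

        world.getCamera().moveCamera(Camera.CAMERA_MOVEOUT, 15);
        world.getCamera().lookAt(cube.getTransformedCenter());
    }

    public void onSurfaceChanged(GL10 gl, int width, int height) {
        if (frameBuffer != null) {
            frameBuffer.dispose();
        }
        // OpenGL ES 1.x FrameBuffer; JPCT-AE also offers a GLES 2.0 variant.
        frameBuffer = new FrameBuffer(gl, width, height);
    }

    public void onDrawFrame(GL10 gl) {
        frameBuffer.clear(background);
        world.renderScene(frameBuffer);
        world.draw(frameBuffer);
        frameBuffer.display();
    }
}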
Here is my CameraView class:
package com.dlcideas.ARescue.Camera;
import java.io.IOException;
import com.threed.jpct.Logger;
import android.content.Context;
import android.hardware.Camera;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
public class CameraView extends SurfaceView implements SurfaceHolder.Callback {
    /**
     * Creates the CameraView and registers for surface lifecycle callbacks.
     *
     * @param context the application context
     * @param surfaceView the SurfaceView that will display the camera preview
     * @param imageCaptureCallback the preview callback (may still be null here)
     */
    public CameraView(Context context, SurfaceView surfaceView,
            ImageCaptureCallback imageCaptureCallback) {
        super(context);
        // Install a SurfaceHolder.Callback so we get notified when the
        // underlying surface is created and destroyed.
        previewHolder = surfaceView.getHolder();
        previewHolder.addCallback(this);
        // Required on pre-3.0 devices; deprecated (and ignored) on newer Android versions.
        previewHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        // Hold the reference to the captureCallback (still null here, it will be
        // replaced in surfaceChanged).
        this.imageCaptureCallback = imageCaptureCallback;
    }

    /**
     * Initializes the hardware camera once the surface exists.
     *
     * @param holder the holder of the camera preview surface
     */
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
        try {
            camera.setPreviewDisplay(holder);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * Stops the preview and releases the camera when the surface goes away.
     */
    public void surfaceDestroyed(SurfaceHolder holder) {
        this.onStop();
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        if (previewRunning)
            camera.stopPreview();
        Camera.Parameters p = camera.getParameters();
        p.setPreviewSize(width, height);
        // Note: the preview size is only applied if you call setParameters(),
        // and width/height must match a size the camera actually supports.
        // camera.setParameters(p);
        try {
            camera.setPreviewDisplay(holder);
        } catch (IOException e) {
            e.printStackTrace();
        }
        previewRunning = true;
        Logger.log("CameraView.surfaceChanged: starting preview", Logger.MESSAGE);
        camera.startPreview();
        imageCaptureCallback = new ImageCaptureCallback(camera, width, height);
    }

    public void onStop() {
        // The surface will be destroyed when we return, so stop the preview.
        // Because the CameraDevice object is not a shared resource, it's very
        // important to release it when the activity is paused.
        if (imageCaptureCallback != null)
            imageCaptureCallback.stopImageProcessing();
        camera.setPreviewCallback(null);
        camera.stopPreview();
        previewRunning = false;
        camera.release();
    }

    public void onResume() {
        camera = Camera.open();
        camera.setPreviewCallback(imageCaptureCallback);
        previewRunning = true;
    }

    private Camera camera;
    private SurfaceHolder previewHolder;
    private boolean previewRunning;
    private ImageCaptureCallback imageCaptureCallback;
}
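ImageCaptureCallback is my own class as well and is not reproduced in this post. Based only on how it is used above (constructed with the camera and the preview size, registered through setPreviewCallback(), and stopped with stopImageProcessing()), a minimal placeholder with the same shape could look like the sketch below; what you do with each preview frame is entirely up to your own engine.

package com.dlcideas.ARescue.Camera;

import android.hardware.Camera;

public class ImageCaptureCallback implements Camera.PreviewCallback {

    private final Camera camera;
    private final int width;
    private final int height;
    // Flag checked on every frame so processing can be switched off from onStop().
    private volatile boolean processing = true;

    public ImageCaptureCallback(Camera camera, int width, int height) {
        this.camera = camera;
        this.width = width;
        this.height = height;
    }

    public void onPreviewFrame(byte[] data, Camera camera) {
        if (!processing) {
            return;
        }
        // "data" holds one preview frame (NV21 by default on most devices);
        // this is where your image processing / tracking would go.
    }

    public void stopImageProcessing() {
        processing = false;
    }
}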