NDK Programmer's Guide
Teapot

This sample uses the OpenGL ES library to render the iconic Utah teapot. In particular, it showcases ndk_helper, a collection of native helper classes and functions useful for implementing games and similar applications as native applications. These helpers provide:

  • an abstraction layer that handles certain NDK-specific behaviors (e.g., GLContext).
  • some helper functions that are useful but absent from the NDK itself (e.g., tap detection).
  • wrappers for JNI calls for certain platform features (e.g., texture loading).

AndroidManifest.xml

The activity declared here is not NativeActivity itself, but a subclass of it: TeapotNativeActivity.

    <activity
            android:name="com.sample.teapot.TeapotNativeActivity"
            android:label="@string/app_name"
            android:configChanges="orientation|keyboardHidden">

The name of the .so file is libTeapotNativeActivity.so; the lib prefix and .so suffix are stripped from that name in the value assigned to android:value.

        <meta-data android:name="android.app.lib_name"
                android:value="TeapotNativeActivity" />
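This naming convention comes from the build system: ndk-build takes the module name and produces lib<module name>.so. A hypothetical Android.mk fragment illustrating the relationship (the module name matches this sample; your own project's name will differ):

```makefile
# ndk-build turns this module into libTeapotNativeActivity.so.
# android.app.lib_name in AndroidManifest.xml must match the module
# name exactly, without the "lib" prefix and ".so" suffix.
LOCAL_MODULE := TeapotNativeActivity
```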

Application.mk

Define the minimum Android API level supported.

APP_PLATFORM := android-9

Build for all supported architectures.

APP_ABI := all

Specify the C++ runtime support library to use.

APP_STL := stlport_static

Java-side implementation: TeapotNativeActivity.java

This file handles activity lifecycle events, as well as displaying text on the screen.

// Our popup window; you will call it from your C/C++ code later


void setImmersiveSticky() {
    View decorView = getWindow().getDecorView();
    decorView.setSystemUiVisibility(View.SYSTEM_UI_FLAG_FULLSCREEN
            | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
            | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY
            | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
            | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
            | View.SYSTEM_UI_FLAG_LAYOUT_STABLE);
}

Native-side implementation: TeapotRenderer.h/.cpp

This code does the actual rendering of the teapot. It uses ndk_helper for matrix calculation, and to reposition the camera based on where the user taps:

ndk_helper::Mat4 mat_projection_;
ndk_helper::Mat4 mat_view_;
ndk_helper::Mat4 mat_model_;


ndk_helper::TapCamera* camera_;

Native-side implementation: TeapotNativeActivity.cpp

Include ndk_helper in your native source file, and define the helper-class name:

#include "NDKHelper.h"


//-------------------------------------------------------------------------
//Preprocessor
//-------------------------------------------------------------------------
#define HELPER_CLASS_NAME "com/sample/helper/NDKHelper" //Class name of helper function

The first use of ndk_helper is to handle the EGL-related lifecycle, associating EGL context states (created/lost) with Android lifecycle events. This enables the application to preserve context information so that a destroyed activity can be restored. That is useful, for example, when the device is rotated (the activity is destroyed, then immediately recreated in the new orientation), or when the lock screen appears.

ndk_helper::GLContext* gl_context_; // handles EGL-related lifecycle.

Next, ndk_helper provides touch control.

ndk_helper::DoubletapDetector doubletap_detector_;
ndk_helper::PinchDetector pinch_detector_;
ndk_helper::DragDetector drag_detector_;
ndk_helper::PerfMonitor monitor_;

And camera control (the OpenGL view frustum).

ndk_helper::TapCamera tap_camera_;

As in the native-activity sample, the application prepares to use the sensors, using the native APIs provided in the NDK.

ASensorManager* sensor_manager_;
const ASensor* accelerometer_sensor_;
ASensorEventQueue* sensor_event_queue_;

The following functions are called in response to various Android lifecycle events and EGL context state changes, using various functionalities provided by ndk_helper via the Engine class.

void LoadResources();
void UnloadResources();
void DrawFrame();
void TermDisplay();
void TrimMemory();
bool IsReady();

This function calls back to the Java side to update the UI display.

void Engine::ShowUI()
{
    JNIEnv *jni;
    app_->activity->vm->AttachCurrentThread( &jni, NULL );


    //Default class retrieval
    jclass clazz = jni->GetObjectClass( app_->activity->clazz );
    jmethodID methodID = jni->GetMethodID( clazz, "showUI", "()V" );
    jni->CallVoidMethod( app_->activity->clazz, methodID );


    app_->activity->vm->DetachCurrentThread();
    return;
}

And this one calls back to the Java side to draw a text box, showing the frame count, superimposed on the natively rendered screen.

void Engine::UpdateFPS( float fFPS )
{
    JNIEnv *jni;
    app_->activity->vm->AttachCurrentThread( &jni, NULL );


    //Default class retrieval
    jclass clazz = jni->GetObjectClass( app_->activity->clazz );
    jmethodID methodID = jni->GetMethodID( clazz, "updateFPS", "(F)V" );
    jni->CallVoidMethod( app_->activity->clazz, methodID, fFPS );


    app_->activity->vm->DetachCurrentThread();
    return;
}

The application reads the system clock and supplies it to the renderer so that animation is driven by real time rather than by frame count: for example, when calculating momentum, where speed declines as a function of elapsed time.

renderer_.Update( monitor_.GetCurrentTime() );

Having earlier been set up to preserve context information, the application now checks whether the GL context is still valid when swapping buffers. If the context has been lost, ndk_helper reinstantiates it, and Swap() returns an error code so that the application can reload its GL resources.

if( EGL_SUCCESS != gl_context_->Swap() )  // swaps buffer.

The program passes touch-motion events to the gesture detector defined in the ndk_helper class. The gesture detector tracks multitouch gestures, such as pinch-and-drag, and sends a notification when triggered by any of these events.

if( AInputEvent_getType( event ) == AINPUT_EVENT_TYPE_MOTION )
{
    ndk_helper::GESTURE_STATE doubleTapState = eng->doubletap_detector_.Detect( event );
    ndk_helper::GESTURE_STATE dragState = eng->drag_detector_.Detect( event );
    ndk_helper::GESTURE_STATE pinchState = eng->pinch_detector_.Detect( event );


    //Double tap detector has a priority over other detectors
    if( doubleTapState == ndk_helper::GESTURE_STATE_ACTION )
    {
        //Detect double tap
        eng->tap_camera_.Reset( true );
    }
    else
    {
        //Handle pinch state
        if( pinchState & ndk_helper::GESTURE_STATE_START )
        {
            //Start new pinch
            ndk_helper::Vec2 v1;
            ndk_helper::Vec2 v2;
            eng->pinch_detector_.GetPointers( v1, v2 );

ndk_helper also provides access to a vector-math library (vecmath.h), using it here to transform touch coordinates.

void Engine::TransformPosition( ndk_helper::Vec2& vec )
{
    vec = ndk_helper::Vec2( 2.0f, 2.0f ) * vec
            / ndk_helper::Vec2( gl_context_->GetScreenWidth(), gl_context_->GetScreenHeight() )
            - ndk_helper::Vec2( 1.f, 1.f );
}

HandleCmd() handles commands posted from the android_native_app_glue library. For more information about what the messages mean, refer to the comments in the android_native_app_glue.h and .c source files.

void Engine::HandleCmd( struct android_app* app, int32_t cmd )
{
    Engine* eng = (Engine*) app->userData;
    switch( cmd )
    {
    case APP_CMD_SAVE_STATE:
        break;
    case APP_CMD_INIT_WINDOW:
        // The window is being shown, get it ready.
        if( app->window != NULL )

The android_native_app_glue library posts APP_CMD_INIT_WINDOW when it receives an onNativeWindowCreated() callback from the system. Applications normally perform window initialization here, such as EGL initialization; this happens outside of the activity lifecycle callbacks, since at that earlier point the activity is not yet ready.

ndk_helper::JNIHelper::Init( state->activity, HELPER_CLASS_NAME );


state->userData = &g_engine;
state->onAppCmd = Engine::HandleCmd;
state->onInputEvent = Engine::HandleInput;