Android: Tutorials on OpenGL ES, AR, VR, and GPGPU

This is a collection of tutorials on various Android topics. Our focus is on developing the core code in C++ while leveraging the benefits of Java in the app layer. This allows us to reuse the native code across both Android and iOS and considerably simplifies development effort.

  1. Create an OpenGL ES context in Java and render with native C++: Since this is the first tutorial, it describes our approach and prerequisites in detail. We propose an alternative to Android's NativeActivity for developing apps based on native C++ code (a minimal JNI-bridge sketch follows this list).
  2. Load shaders from native and display a colored triangle: Use Android's asset manager in native code to load GLSL ES shaders (see the asset-loading sketch after this list). Use a vertex buffer object (VBO) to define an object in GLES.
  3. From touch gestures to model-view-projection (MVP) matrix with GLM: Introduces the ubiquitous MVP matrix in GLES. Describes how to recognize touch gestures in Java and convert them into an MVP matrix using the GLM library in native code (see the GLM sketch after this list).
  4. Use Assimp to load a 3D model: Shows how to load an OBJ model with Assimp in native code (see the Assimp sketch after this list). Also uses OpenCV to load textures for the OBJ model.
  5. Virtual reality (VR) app using accelerometer and gyroscope sensors: Build a simple VR app that allows the user to explore a virtual environment by pointing the phone in different directions.
  6. Detect feature points and display camera images with OpenGL ES: Get a live image from the device camera using Java APIs, determine feature points in the image using OpenCV’s ORB in native code (see the ORB sketch after this list), and render the image with GLES shaders.
  7. Augmented reality (AR) app that overlays 3D object in the scene: Build a simple AR app that allows the user to point at the floor, double-tap, and create a reference marker. The marker is continuously tracked and a 3D model is rendered at the reference location.
  8. Introduction to GPGPU — Load a texture with floats and read it back: Initialize a floating-point texture in GLES and load it with data from an OpenCV Mat. Read back the contents of the texture into another OpenCV Mat (see the float-texture sketch after this list).
  9. GPGPU — Perform computations in GPU and render output to texture: In progress.
  10. Feedback in GPGPU — Create a feedback loop where previously rendered output is used as input for next rendering pass: In progress.
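As a companion to tutorial 1, here is a minimal sketch of the native side of the approach: a Java GLSurfaceView.Renderer forwards its lifecycle callbacks to C++ over JNI. The package and class name (com.example.gles.GLESJNILib) are placeholders, not the names used in the tutorial code.

```cpp
// Sketch: JNI entry points that a Java GLSurfaceView.Renderer can call from
// onSurfaceCreated / onSurfaceChanged / onDrawFrame.
#include <jni.h>
#include <GLES3/gl3.h>

extern "C" {

JNIEXPORT void JNICALL
Java_com_example_gles_GLESJNILib_surfaceCreated(JNIEnv*, jclass) {
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);  // one-time GL state setup
}

JNIEXPORT void JNICALL
Java_com_example_gles_GLESJNILib_surfaceChanged(JNIEnv*, jclass, jint width, jint height) {
    glViewport(0, 0, width, height);  // match the viewport to the new surface size
}

JNIEXPORT void JNICALL
Java_com_example_gles_GLESJNILib_drawFrame(JNIEnv*, jclass) {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... render the scene here ...
}

}  // extern "C"
```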
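For tutorial 2, the shader sources live in the APK's assets and are read in native code. A rough sketch, assuming the Java side hands its AssetManager to native code and it has already been converted to an AAssetManager* with AAssetManager_fromJava; the shader file name is illustrative.

```cpp
// Sketch: read a GLSL ES shader source file from the APK's assets in native code.
#include <android/asset_manager.h>
#include <string>

std::string LoadShaderFromAssets(AAssetManager* assetManager, const char* filename) {
    AAsset* asset = AAssetManager_open(assetManager, filename, AASSET_MODE_BUFFER);
    if (asset == nullptr) {
        return std::string();  // caller should check for an empty result
    }
    // Copy the whole asset into a std::string holding the GLSL source.
    size_t length = AAsset_getLength(asset);
    std::string source(length, '\0');
    AAsset_read(asset, &source[0], length);
    AAsset_close(asset);
    return source;  // e.g. LoadShaderFromAssets(mgr, "shaders/triangle.vsh")
}
```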
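For tutorial 3, the gist is combining model, view, and projection matrices with GLM. A minimal sketch, assuming the rotation angles arrive from the Java gesture handlers over JNI; the camera placement, field of view, and clip planes here are arbitrary choices for illustration.

```cpp
// Sketch: build an MVP matrix with GLM from drag-derived rotation angles.
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

glm::mat4 ComputeMvp(float dragAngleX, float dragAngleY, float aspectRatio) {
    // Model: rotate the object according to the accumulated drag gesture.
    glm::mat4 model = glm::rotate(glm::mat4(1.0f), dragAngleY, glm::vec3(1, 0, 0));
    model = glm::rotate(model, dragAngleX, glm::vec3(0, 1, 0));

    // View: camera a few units back on the Z axis, looking at the origin.
    glm::mat4 view = glm::lookAt(glm::vec3(0, 0, 4),   // eye
                                 glm::vec3(0, 0, 0),   // center
                                 glm::vec3(0, 1, 0));  // up

    // Projection: 45-degree vertical field of view, arbitrary near/far planes.
    glm::mat4 projection = glm::perspective(glm::radians(45.0f), aspectRatio, 0.1f, 100.0f);

    return projection * view * model;  // applied right-to-left in the vertex shader
}
```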
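For tutorial 4, a rough sketch of loading a model with Assimp: the vertex positions of the first mesh are copied into a flat array suitable for a VBO. The post-processing flags shown are plausible defaults, not necessarily what the tutorial uses.

```cpp
// Sketch: load an OBJ file with Assimp and extract the first mesh's positions.
#include <assimp/Importer.hpp>
#include <assimp/scene.h>
#include <assimp/postprocess.h>
#include <string>
#include <vector>

bool LoadObjVertices(const std::string& path, std::vector<float>& vertices) {
    Assimp::Importer importer;
    const aiScene* scene = importer.ReadFile(
        path, aiProcess_Triangulate | aiProcess_GenSmoothNormals);
    if (scene == nullptr || scene->mNumMeshes == 0) {
        return false;  // importer.GetErrorString() describes the failure
    }
    const aiMesh* mesh = scene->mMeshes[0];
    vertices.reserve(mesh->mNumVertices * 3);
    for (unsigned int i = 0; i < mesh->mNumVertices; ++i) {
        vertices.push_back(mesh->mVertices[i].x);
        vertices.push_back(mesh->mVertices[i].y);
        vertices.push_back(mesh->mVertices[i].z);
    }
    return true;
}
```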
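For tutorial 6, the native feature-detection step looks roughly like the following. It assumes the camera frame has already been handed to native code and converted to a grayscale cv::Mat; the feature count of 500 is OpenCV's default, shown explicitly.

```cpp
// Sketch: detect ORB keypoints in a grayscale camera frame with OpenCV.
#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>
#include <vector>

std::vector<cv::KeyPoint> DetectOrbKeypoints(const cv::Mat& grayFrame) {
    static cv::Ptr<cv::ORB> orb = cv::ORB::create(500);  // create the detector once
    std::vector<cv::KeyPoint> keypoints;
    orb->detect(grayFrame, keypoints);
    return keypoints;
}
```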
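For tutorial 8, a minimal sketch of the float-texture round trip, assuming an OpenGL ES 3.0 context and support for the EXT_color_buffer_float extension (which makes GL_RGBA32F color-renderable so it can be read back through a framebuffer object).

```cpp
// Sketch: upload a 4-channel float OpenCV Mat into a GLES floating-point
// texture, then read it back into another Mat via an FBO.
#include <GLES3/gl3.h>
#include <opencv2/core.hpp>

void RoundTripFloatTexture(const cv::Mat& input, cv::Mat& output) {
    // input is expected to be CV_32FC4 (4-channel float) and continuous in memory.
    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, input.cols, input.rows, 0,
                 GL_RGBA, GL_FLOAT, input.ptr<float>());

    // Attach the texture to an FBO so its contents can be read with glReadPixels.
    GLuint fbo;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, texture, 0);

    output.create(input.rows, input.cols, CV_32FC4);
    glReadPixels(0, 0, input.cols, input.rows, GL_RGBA, GL_FLOAT,
                 output.ptr<float>());

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glDeleteFramebuffers(1, &fbo);
    glDeleteTextures(1, &texture);
}
```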

We will add iOS counterparts of these tutorials soon to demonstrate how the native code in a project can be reused across both Android and iOS.