Pages

Thursday 11 September 2014

Rooting + CyanogenMod 11 (Android KitKat 4.4.4) on LG G2 L01F DoCoMo

I have an Android LG G2 L01F contracted with DoCoMo in Japan. Could you add any more bloatware, DoCoMo?? Geez, there is a lot of unnecessary crap that you can't uninstall. And don't get me started on their Phone app; it's like a prepubescent kid who spent 3 hours at Android kiddy school developed it. It's buggy and the UI/UX is horrible. If I tap the call button I expect it to call, not react a minute later!

Anyway, for the reasons above and more I decided to take the brick risk and ROM my phone. And it worked. And it was rather simple. I did a temporary root, then upgraded from the ugly DoCoMo 4.2.2 to KitKat 4.4.4 with the CyanogenMod 11 ROM.


  1. Download the USB serial driver for LG from here. Install it. I did it on Windows 7 64bit.
  2. Make sure your phone is in Developer Mode (Google it if unsure; hint: tap Build Number 7 times under Settings -> About phone) and USB Debugging is enabled.
  3. Download the latest nightly build of CM11 from here.
  4. Download the latest gapps-kk-* from goo.im/gapps if you want Google Play, Gmail etc., i.e. all the Google apps (I used this).
  5. Unzip CM11 and change all "l01f" to "L01F" in the following files (I did this because of some silly case-sensitive model-number checking during the CM11 installation; see the shell sketch after this list):
    META-INF/com/android/metadata
    META-INF/com/google/android/updater-script
  6. Re-zip the file with the above modifications.
  7. adb push <CM11 zip file> /sdcard/   (if you don't know what adb is, look up setting up an Android Development environment, or find another way to copy a file to your phone's sdcard)
  8. Also copy gapps to the same place.
  9. Download the rootkit here, unzip it, plug in your phone and run Rootkit.bat. This rootkit seems to use an exploit to gain temporary root access. When prompted, press a key/enter to finish the batch script's installation. Your phone will reboot into CWM. If you reboot your phone normally you'll lose your root access and need to run Rootkit.bat again.
  10. If all went well you'll be booted into CWM, ClockworkMod Recovery. You can use this to back up your current system and install a new one. DO A BACKUP NOW!
  11. Then choose wipe data/factory reset.
  12. Then choose install from zip and select the CM11 zip file you copied to the sdcard. If all goes well it should install.
  13. Repeat the same for gapps.
  14. Reboot and voilà, you should have CM11 (Android 4.4.4) on your Japanese LG G2 L01F!
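For reference, here are steps 5-8 condensed into shell commands (a rough sketch for a Linux/OS X shell; the zip file names are placeholders):

  # Steps 5-6: fix the case-sensitive model-number check and re-zip
  unzip <CM11 zip file> -d cm11
  sed -i 's/l01f/L01F/g' cm11/META-INF/com/android/metadata \
      cm11/META-INF/com/google/android/updater-script
  (cd cm11 && zip -r ../cm11-L01F.zip .)
  # Steps 7-8: copy the ROM and gapps to the phone
  adb push cm11-L01F.zip /sdcard/
  adb push <gapps zip file> /sdcard/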

Sunday 16 March 2014

Running ARM apps on Android x86

Well a year's passed since my last post. It's been a busy one at that! Feel free to skip to the bottom if you just want the instructions on how to get libhoudini running on Android x86...

These past few months I've been playing with a customized flavour of Android x86 and virtualization (I'm loving OpenStack!). For those of you who don't know, Android x86 is a project to get Android running on various Intel x86 based platforms, built from the vanilla Android Open Source Project from Google. Yes, that's right: you can run native Android on your PC or Mac! No more horrendously slow Android Emulator.

Well, almost.

Android x86 runs super fast on most Intel hardware, and the majority of apps will function fine as well, at least if they're 100% written in Java. But a lot of apps also have native libraries built into them (compiled using the NDK and usually written in C or C++). When developers prepare an app for release they can choose which architecture(s) to compile their native library binaries for. Most will at least compile for ARM, but in the interest of cutting down the size of the APK a lot skip x86 and others (MIPS etc.). Google Play does apparently allow you to upload a separate APK for each architecture, so ARM users needn't carry a few useless-to-them MBs of x86 code in the APK they install. Many developers may be unaware of this feature, and I'm sure some are just not aware that x86 based Android devices are emerging more and more.

So where does that leave us? Emulating the ARM architecture on your x86 hardware. From what little research I've done into the Android Emulator itself, I believe the entire emulated OS is ARM based, so there's a tonne of constant emulation to be done, hence the major performance cost. With Android x86, most of the code executes natively (i.e. critical/normal programs run fast) and only the code that is available solely as ARM-compiled binaries is translated in real time into instructions your x86 CPU can understand.

This is done with a little library called Houdini. libhoudini is a binary translator (it translates the ARM ABI to the x86 ABI) developed by Intel, and it is closed source. Information on the internet is really sparse and there are a lot of broken links, so I'm hoping this blog article will stay around for a while and be helpful to some. From what I gather, the original libhoudini libraries were ripped from a Lenovo K900, a device that runs x86 hardware; Intel must've provided their binaries to Lenovo for release. That work was done by the BuilDroid project, which seems to have changed names to AndroVM and now Genymotion (a great Android emulator that lets you choose from a range of device types, resolutions, Android versions etc.).

So on to the fun stuff: the instructions for getting libhoudini installed and configured on Android x86. I've only tested against a branch based on the 4.2.2 tree, so I can't say how well it'll work on other versions.

1. Have the AOSP project source code downloaded and the build environment all set up. We'll call the source root $AOSP.
2. Download and unarchive libhoudini-patch-and-binaries.tgz.
3. Copy the contents of <...>/libhoudini-patch-and-binaries/system/lib to $AOSP/out/target/product/<product name>/system/lib
4. cd $AOSP/dalvik
5. git apply -v <...>/libhoudini-patch-and-binaries/jb-houdini.patch
6. Add the following line to $AOSP/device/<your hardware manufacturer>/<device name>/system.prop (e.g. device/ibm/thinkpad/system.prop):
ro.product.cpu.abi2=armeabi
7. Build and deploy your Android x86 OS.
8. Try running an app with ARM-only libraries and if it doesn't crash you're lucky!
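For convenience, here are steps 2-6 condensed into shell commands (a sketch only; the <...> paths are placeholders as above, and details may vary with your tree):

  # Step 2: unarchive the patch and binaries
  tar xzf libhoudini-patch-and-binaries.tgz
  # Step 3: copy the prebuilt ARM translation libraries into the build output
  cp -r <...>/libhoudini-patch-and-binaries/system/lib/* \
      $AOSP/out/target/product/<product name>/system/lib/
  # Steps 4-5: apply the dalvik patch
  cd $AOSP/dalvik
  git apply -v <...>/libhoudini-patch-and-binaries/jb-houdini.patch
  # Step 6: advertise armeabi as the secondary ABI
  echo "ro.product.cpu.abi2=armeabi" >> \
      $AOSP/device/<your hardware manufacturer>/<device name>/system.prop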

I had success running my own apps, Facebook and Angry Birds.
I didn't have much luck with Unity based games.

Sunday 10 March 2013

Chroma Keying (transparent background) with Vuforia AR Video Playback

It is possible to make augmented reality video playback with the Vuforia platform just that little bit more realistic by chroma keying the video. Chroma keying, aka the green screen effect, is an old television-industry trick: film the subject against a solid colour background so that one colour can be replaced in post-production with a completely different scene. Obviously this requires the person in the foreground to avoid wearing anything close to that colour, otherwise they'll appear patchy/transparent in places.

I managed to come up with a basic way to implement this with the video playback functionality offered in the Vuforia SDK. Here's a snippet of the OpenGL ES shader code that I used:

static const char videoPlaybackFragmentShader[] =
  "#extension GL_OES_EGL_image_external : require \n"
  "precision mediump float; \n"
  "uniform samplerExternalOES texSamplerOES; \n"
  "   varying vec2 texCoord;\n"
  "   varying vec2 texdim0;\n"
  "   void main()\n\n"
  "   {\n"
  "       vec3 keying_color = vec3(%f, %f, %f);\n"
  "       float thresh = 0.8; // [0, 1.732]\n"
  "       float slope = 0.2; // [0, 1]\n"
  "       vec3 input_color = texture2D(texSamplerOES, texCoord).rgb;\n"
  "       float d = abs(length(abs(keying_color.rgb - input_color.rgb)));\n"
  "       float edge0 = thresh * (1.0 - slope);\n"
  "       float alpha = smoothstep(edge0, thresh, d);\n"
  "       gl_FragColor = vec4(input_color, alpha);\n"
  "   }";
UPDATE: I forgot to add that I also needed to run these commands (to enable alpha blending) in order for the fragment shader above to work:
glDepthFunc(GL_LEQUAL);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
UPDATE #2: 2013/04/04:
Some people were reporting problems with their video content turning black. Re-reading this article I realised I'd missed out the vertex shader code as well, so here it is. If I recall rightly, the variable "texCoord" used in the fragment shader must have the same name as in the vertex shader. Using the sample Video Playback app with the Vuforia SDK, if there are any problems compiling your shader source code then errors will be printed to LogCat that you can use to help debug.
static const char* videoPlaybackVertexShader =
    "attribute vec4 vertexPosition; \n"
    "attribute vec4 vertexNormal; \n"
    "attribute vec2 vertexTexCoord; \n"
    "varying vec2 texCoord; \n"
    "varying vec4 normal; \n"
    "uniform mat4 modelViewProjectionMatrix; \n"
    "void main() \n"
    "{ \n"
    "   gl_Position = modelViewProjectionMatrix * vertexPosition; \n"
    "   normal = vertexNormal; \n"
    "   texCoord = vertexTexCoord; \n"
    "} \n";
In the VideoPlayback sample code, in VideoPlayback.cpp, inside the renderFrame() function, I inserted those 3 lines above all the calls to SampleUtils::scalePoseMatrix.

For those of you who have studied the source code of the Video Playback project that can be downloaded with the Vuforia SDK, you can see this is a modified version of the source code for the fragment shader for the video frames. What it does is: for any colour processed by the fragment shader (i.e. any pixel to be rendered on the screen from rasterization) that is close to the specified keying colour, that colour will be made transparent by modifying its alpha channel. The same code can be used for the keyframe's fragment shader too.

Notice I have 3 %fs. These, in order, are the RGB values of the colour to chroma key, i.e. to make transparent. I used sprintf to replace them in the string with actual float values. Note the range of these numbers is 0.0-1.0; they're floats, not the 0-255 integer range you often see.
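For reference, the substitution looks roughly like this (a sketch; the buffer size is arbitrary and the keying colour shown, pure green, is just an example - I used sprintf, but the bounds-checked snprintf is shown here):

#include <stdio.h>  // for snprintf

// Fill the three %f placeholders with the keying colour's RGB
// components, each in the range 0.0-1.0 (pure green here).
char fragmentShaderSource[2048];
snprintf(fragmentShaderSource, sizeof(fragmentShaderSource),
         videoPlaybackFragmentShader, 0.0f, 1.0f, 0.0f);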
Unfortunately I don't have a project I can share this time as the source code I originally implemented this in is mixed in with other code I'm not able to release publicly.

To be a little more adventurous you could implement a way to change the keyed colour on-the-fly, using a colour picker dialog or some other input method. That could be a text input field for the hex RGB value, or the kind of colour picker often found in paint programs where the user touches a part of the screen and the colour value of that pixel is read as the input. The latter would be handy as users could just touch the area of the background they want to remove and the colour would be detected automatically.

Friday 11 January 2013

Fixing Slow Camera Animations in Unity

After importing one of a number of complex 3ds Max models as FBX files into my Unity scene and animating a single Unity camera, I found the camera's movement was extremely slow, anywhere from 0.3-3 FPS. The model was a static room and all I wanted animated was the camera itself. Everything ran at normal speed in the little Camera Preview box, but playback both on my laptop (which is medium-high spec) and on my Android was extremely slow.

What I found was that after importing, say, TestObject.fbx, Unity was automatically enabling the Animator for the TestObject in my scene. Simply select TestObject in your Hierarchy panel in Unity and uncheck the Animator section found in the Inspector panel.

Saturday 5 January 2013

Importing 3DS Max Models into Unity 3D for iPhone/Android Games

I have started delving into the realm of 3D modeling for a Vuforia augmented reality based game I am developing. To that end I've been spending the past 2 weeks learning 3ds Max and some more Unity 3D (the game engine I am using to create the Vuforia AR based game). This article covers importing a 3D scene from 3ds Max into Unity with steps to optimize your scene for low spec devices by reducing the number of polygons, textures and draw calls.

I purchased a 3D model of a complex scene off of Turbo Squid. I'm a bit of a novice when it comes to 3D modeling so I decided to purchase an already made scene and modify it to suit my needs. As my game's intended platforms are Android and iOS devices I knew optimization would be key to good game experience. Following the standard Unity instructions on how to import a model from 3ds Max yielded me a few problems:
  • Importing the .max file directly into my Unity project's assets timed out; Unity gave up after 3 minutes of trying to internally convert the .max file to an .fbx file.
  • Manually exporting the scene from 3ds Max to an .fbx file and then importing that into Unity gave me the scene, but despite the textures being imported into Unity, no textures were applied (everything was just gray).
  • Importing the .fbx into Unity took a very long time.
  • Deploying Unity app to my Android with just the gray scene crashed the app (more precisely, as soon as a Vuforia target was found in the camera frame).
  • A bunch of the Vray materials and lights were not exportable to .fbx.
The conclusion? My 90 MB max project with 70 MB of textures was too much for my 4-core laptop to achieve any kind of acceptable workflow speed, let alone the idea of getting the model to load on Android/iPhones ;-) What's more, Vray materials can't be exported to .fbx, so it would've been better if I'd got a 3D model using the standard renderer with standard materials. But as I had paid a nice sum for the Vray rendered version and liked the model, I pushed on.

The answer? Merge a bunch of objects, each with their own materials/textures, into a single mesh object and bake a rendered view of that mesh into a single texture (containing all the other textures/lighting/shadows), which then needs to be the only texture applied to the mesh as a material. In other words, we merge multiple objects with different images attached to their surfaces into a single object with a single image file (texture).

Importing a multi-textured 3ds Max scene into Unity

I wrote the instructions below for complete newbies who want to optimize a complex 3D scene for a low spec handheld device like an Android or iPhone. This is achieved by combining several objects into one mesh and baking a single texture image to apply to the mesh as its only material. The baked texture image comes from a rendered output of the object complete with its standard or Vray materials, any multi/sub-object materials, shadows, lighting etc. The scene I happened to buy used the Vray renderer, but the same instructions should be applicable to models that use the standard renderer as well.

NB. There is a plugin for 3ds Max called Flatiron that can be used to bake entire scenes into one image texture. The full version costs a bit, but if you have luck with the trial version you may wish to purchase it. I didn't have much luck, so I went the manual route, as detailed below.
  1. Open your 3D model scene in 3DS Max. I use 3ds Max 2010 with the trial version of Vray 1.5 SP4.
  2. Select the objects that have their own standard (and/or multi/sub) materials that you want to turn into a single mesh object with a single material.
    * To select objects, make sure the Select Object icon is the highlighted icon on the top horizontal toolbar (just below the row of main menus you should see a drop down list of All, Geometry, Shapes, Cameras, Lights etc. and beside that the Select Object icon which is a cube with a mouse pointer - in Max 2010 at least).
    Ctrl+left-click to select multiple objects within your scene. You can also click the Select by Name icon and choose objects by their names. Ctrl+A (or Edit -> Select All) will select all objects currently displayed in the viewport.
  3. Optional: press Alt+Q or Tools -> Isolate Selection to display only the selected objects in the current viewport.
  4. Select Group -> Ungroup as many times as needed until all the selected objects are ungrouped (the Ungroup menu item will become grayed out). This is to make sure you can Attach all items (covered below).
  5. Select one object, right click it and choose Convert To -> Convert to Editable Mesh.
  6. With the same object still selected, on the right hand panel of Max, select the Modify icon (2nd icon from the left on the top of the right-hand panel in 2010) and directly below that is the object's name, Modifier List (a drop down menu) and then the stack list (think layers of operations done on the object). Currently the only layer in the stack list should be Editable Mesh. After clicking Editable Mesh in the stack list you will see a few Editable Mesh related configuration sections appear below it.
  7. In the Edit Geometry section, click Attach List and the Attach List dialog window will popup. Assuming you're in isolated mode (Alt+Q a few steps back) then you can happily select all objects in that list and click the Attach button in the dialog window. This operation will effectively make all your objects into one single mesh object. Alternatively, click the Attach button next to Attach List and select objects individually if you're not in isolated mode. With the newly created Editable Mesh object selected, select Unwrap UVW from the Modifier List. You should now see in order from bottom to top, Editable Mesh and Unwrap UVW in your object's stack list.
  8. In the Unwrap UVW's Parameters panel, look for the section called Channel and change Map Channel to 2. If you get a warning about changing channels with the options to Move or Abandon, go with Move. I noticed this started appearing after I upgraded to 3ds Max 2013 and after trial and error Move seemed to be the best option.
  9. Click the small + icon next to Unwrap UVW in the stack list, select Face and press Ctrl+A (or Edit -> Select All) to select all faces.
  10. Click the Edit... button in the same Parameters panel and the UVW Editor window will open.
  11. Open Mapping -> Flatten Mapping and with the default values press OK.
  12. Depending on how many faces your object has, a while later you should see a bunch of green/white wireframe like figures neatly laid out in a square region representing the unwrapped version of all faces of your object. This so-called UV map will be the map of coordinates that tell how the 2D texture image is to be wrapped around the 3D mesh object.
  13. Close the UVW Editor.
  14. Select Unwrap UVW in the stack list.
  15. Making sure the object is still selected, press 0 or Rendering -> Render To Texture to open the Render To Texture dialog window. Here is where we generate the texture image by rendering the 3D model in Max and following the unwrapped UV map coordinates, the object's faces' materials will be baked onto that area of the texture image file.
  16. Under the General Settings section, Output set Path to a folder where you want the output image file saved.
  17. Under the Objects to Bake section, you should see the mesh object's name in the list.
  18. Under the Mapping Coordinates section, select Use Existing Channel next to Object.
  19. Select 2 from the drop down list next to Channel. If there is no 2, then it means the UV map was not unwrapped properly to channel 2.
  20. Under the Output section, click the Add button. If you are using the Vray renderer, select VRayCompleteMap. If you are using the standard renderer you can probably use CompleteMap but I haven't tried with that one. Using the complete map will ensure not only materials, but also lighting and shadows etc are baked in.
  21. Under the Selected Element Common Settings section, you should see something like:
    • Name: VRayCompleteMap
    • File Name and Type: <object name>VRayCompleteMap.xxx
    • Target Map Slot: Diffuse Map (select Diffuse map if it's in the list)
    • Element Type: VRayCompleteMap
    • Width/height dimensions: the resolution of the output texture image.

      A few notes about this section:
      • If you select multiple mesh objects to render to texture at the same time (which is completely possible, just select them all before opening the Render To Texture dialog and they will all be listed under Objects to Bake), then the file name and type cannot be changed from the UI.
      • You can change the default type by:
        1. Open C:\Program Files\Autodesk\3ds Max 2010\ui\macroscripts\Macro_BakeTextures.mcr in a text editor.
        2. Change local    defaultFileType = ".tga" to local    defaultFileType = ".png"
          TGA seems to be the Max default, I prefer PNG but you can use JPG or TIF too.
        3. Restart Max.
      • Choosing the right resolution is a bit of an experiment. It all comes down to the specs of the target devices your game will run on, how many objects you have in total, how many faces are in one mesh object, and the actual content of the object (is it a faraway stone in the background, or a closeup thing requiring more detail?). So far I've been playing with 512x512 to 1024x1024 resolutions with 10-30 objects, and the scene has been rendering on my Android okay. This is a static environment though, and those numbers still feel too high. You'll just have to experiment here. My approach is to bake all textures at high resolution and then scale them down, testing the level of detail actually rendered on a device, until I'm happy with the trade-off between image size and content detail.
      • When selecting multiple objects to bake at the same time you may not be able to select Diffuse map in the Target Map Slot setting and may get some warnings about it not being set for some objects. So far I've been fine just ignoring these warnings or just having the Target Map Slot for multiple objects set to "varies".
  22. Down at the very bottom, I set both Views and Render to Original. Then click the Render button.
  23. The render process may take some time. When it finishes there's no obvious "Render finished" message or anything; you should just see that the preview window of the rendered texture image appears to be finished (and that your computer is no longer slowed to a snail's pace and you can actually open the menus in Max again!)
  24. Back on the stack list with Unwrap UVW selected, change the Map Channel that you set to 2 back to 1.
  25. Right-click Unwrap UVW and select Collapse All. This collapses all the layers in your stack into one layer.
  26. Open the Material Editor (4th icon from right on top toolbar in 2010 - the icon with a chequered circle with a small square bottom right).
  27. In one of the free slots (the spheres), drag/drop, from Windows Explorer, the texture image that was created. You will now have a new standard material with the image file attached as a bitmap texture on the diffuse channel.
  28. Now drag/drop that material slot (the sphere) onto the actual mesh object in your scene. And presto! You now have a single mesh object with a single rendered-to-texture image material.
  29. Things may look fuzzier in the 3ds Max viewport than they actually do in Unity; I'm not sure why.
  30. Next I simply export the selected object(s) as an FBX (ensuring Embed Media is selected in the FBX settings window), then import that FBX file into Unity, and I have the same textured object to play with in Unity!
It's a long process, and quite manual. Some may prefer using Flatiron, but I had more control over the individual objects this way. If anyone with more experience than me finds an error in this article, please contact me or leave a comment and I'll amend it.

Saturday 15 December 2012

Modular Approach to Implementing Android Preferences

This article details the solution I engineered to implement a modular approach for handling Android preferences. The objective is to display common preferences in an Android library on the same screen as custom preferences that are specific to the game (app) that uses that library.

I came up with this solution because I am building an augmented reality game engine library containing common code that I plan to use across multiple game projects. On the options screen I want to display both preferences common to all games (and thus implemented in the library itself) and preferences specific to each game, with a minimal amount of coding in each game project.

The end result is illustrated in the following image.


The preference category titles and preference widgets (the checkboxes) that are in the red squares are settings common to all games, and thus are implemented in the game library. The green squares are preferences/titles that are specific to that particular game, and thus are implemented in the game project code.

I will explain below the important files in the game and game library projects. Download links for both projects are available at the bottom.

The Game Project


First we have the Java class implementation that extends the AbstractOptionsScreenActivity abstract class in the game library, which is where most of the Java code lives. By just calling the super.onCreate method, all the hard work is done for us as soon as the activity is created.

Optionally, you may implement some code in the abstract addExtraCustomPreferenceResources method like I have here. This example adds a new CustomPreferenceScreen object to the array list that is referenced when the list of preferences is created. The object contains the file name of an XML resource file that defines a preference screen, and the title of the preference category under which all the preferences inside the resource file will be added.
public class OptionsScreenActivityImpl extends AbstractOptionsScreenActivity {
 @Override
 public void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
 }

 @Override
 protected void addExtraCustomPreferenceResources() {
  mCustomPreferenceScreensList.add(new CustomPreferenceScreen("test_preferences", "My New Category"));
 }
}

The above code tells the library that the game project has a test_preferences.xml file in the project's res/xml folder and to add all its child preferences under a new preference category titled "My New Category". The current implementation of the game library provides 4 default preference categories - Game Settings, Graphics, Audio and Reporting Problems. Any new preference categories defined in addExtraCustomPreferenceResources will be rendered at the bottom of the list, and the order that the CustomPreferenceScreen objects are added to the list is the order they are appended to the screen. Here is the test_preferences.xml for this example:

<?xml version="1.0" encoding="utf-8"?>
<PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android" >

    <CheckBoxPreference
        android:defaultValue="false"
        android:key="test key"
        android:summary="test summary"
        android:title="test title" >
    </CheckBoxPreference>

</PreferenceScreen>

The 2 preference categories Game Settings and Graphics are created by the library, but it is possible to define preference widgets for them in XML resource files in the game project, as the following two examples (game_preferences.xml and graphics_preferences.xml) do. There are files with the same names in the game library project, but the game project's versions take precedence when the game runs. The game library will add the preference widgets defined in the game project's implementation of these files.

<?xml version="1.0" encoding="utf-8"?>
<PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android" >

    <CheckBoxPreference
        android:defaultValue="false"
        android:key="options_screen_prefs_description_game_setting"
        android:summary="@string/options_screen_prefs_description_game_setting"
        android:title="@string/options_screen_prefs_name_game_setting" >
    </CheckBoxPreference>

</PreferenceScreen>

<?xml version="1.0" encoding="utf-8"?>
<PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android" >

    <!-- My Game specific graphic setting 1 -->

    <CheckBoxPreference
        android:defaultValue="false"
        android:key="options_screen_prefs_description_game_graphics1"
        android:summary="@string/options_screen_prefs_description_game_graphics1"
        android:title="@string/options_screen_prefs_name_game_graphics_setting1" >
    </CheckBoxPreference>

    <!-- My Game specific graphic setting 2 -->

    <CheckBoxPreference
        android:defaultValue="false"
        android:key="options_screen_prefs_description_game_graphics2"
        android:summary="@string/options_screen_prefs_description_game_graphics2"
        android:title="@string/options_screen_prefs_name_game_graphics_setting2" >
    </CheckBoxPreference>

</PreferenceScreen>


The Game Library Project


The root_preferences.xml defines the top level preference screen on the options screen. Here we define the 4 default preference categories (Game Settings, Graphics, Audio and Reporting Problems). Note that Game Settings doesn't have any common preference widgets defined in the library (though it would be okay to add some), while Graphics has the single "enable Augmented Reality" checkbox preference widget. Both categories have further widgets added that are defined in the game project. The other categories, Audio and Reporting Problems, are common to all games and are not modified by the game project at all.
<?xml version="1.0" encoding="utf-8"?>
<PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android" >

    <!-- Game settings -->

    <PreferenceCategory
        android:key="pref_key_game_settings"
        android:title="@string/options_screen_prefs_category_title_game_settings" >
    </PreferenceCategory>

    <!-- Graphics settings -->

    <PreferenceCategory
        android:key="pref_key_graphics_settings"
        android:title="@string/options_screen_prefs_category_title_graphics" >

        <!-- Enable augmented reality display -->

        <CheckBoxPreference
            android:defaultValue="false"
            android:key="pref_key_graphics_settings_ar_enable"
            android:summary="@string/options_screen_prefs_description_ar"
            android:title="@string/options_screen_prefs_name_ar" >
        </CheckBoxPreference>
    </PreferenceCategory>

    <!-- Audio settings -->

    <PreferenceCategory
        android:key="pref_key_audio_settings"
        android:title="@string/options_screen_prefs_category_title_audio" >

        <!-- Enable sound effects -->

        <CheckBoxPreference
            android:defaultValue="false"
            android:key="pref_key_audio_settings_sound_effects_enable"
            android:summary="@string/options_screen_prefs_description_sound_effects"
            android:title="@string/options_screen_prefs_name_sound_effects" >
        </CheckBoxPreference>

        <!-- Enable background music -->

        <CheckBoxPreference
            android:defaultValue="false"
            android:key="pref_key_audio_settings_background_music_enable"
            android:summary="@string/options_screen_prefs_description_background_music"
            android:title="@string/options_screen_prefs_name_background_music" >
        </CheckBoxPreference>
    </PreferenceCategory>

    <!-- Reporting problems settings -->

    <PreferenceCategory
        android:key="pref_key_reporting_problems_settings"
        android:title="@string/options_screen_prefs_category_title_reporting_problems" >

        <!-- Enable automatic error reporting -->

        <CheckBoxPreference
            android:defaultValue="false"
            android:key="pref_key_reporting_problems_settings_automatic_error_reporting_enable"
            android:summary="@string/options_screen_prefs_description_automatic_error_reporting"
            android:title="@string/options_screen_prefs_name_automatic_error_reporting" >
        </CheckBoxPreference>

        <!-- Enable trace logging -->

        <CheckBoxPreference
            android:defaultValue="false"
            android:key="pref_key_reporting_problems_settings_trace_logging_enable"
            android:summary="@string/options_screen_prefs_description_trace_logging"
            android:title="@string/options_screen_prefs_name_trace_logging" >
        </CheckBoxPreference>
    </PreferenceCategory>

</PreferenceScreen>
The abstract class below is where the bulk of the Java code is. The in-code comments, in conjunction with running the actual projects in debug mode, are the best way to understand the finer details of how it all works, but in brief:

  • The game runs its activity, which calls its abstract parent's onCreate method.
  • That method sets a custom layout with a ListView which will be the root view in the preferences hierarchy.
  • An array list is initialised containing the custom preference screens that are to be added to the options screen. The 2 XML resource file names for Game Settings and Graphics and their associated preference category keys are defined here.
  • The way preferences should be handled on Android depends on what version of Android is being used; either use a PreferenceActivity for older versions or PreferenceFragment since Honeycomb (Android 3.0). This solution handles both old and new methods automatically.
  • The root preference screen is added.
  • The list of custom preference screens is iterated.
  • The defined XML resource file containing the preference screen is inflated and instantiated as a PreferenceScreen object. Because the Android API hides the method for inflating preference XML files, Java reflection is used.
  • If the custom preference screen has a corresponding category name defined, then the root preference screen is scanned for an already existing preference category with a key that matches that name. If a match is found, then all the children preference widgets in the XML resource file are appended to the existing preference category.
  • If no match is found, then a new preference category is created and appended to the end of the root preference screen list. If a corresponding category name is defined, then the title of the new preference category will be set to that.
  • The result is all preference screens/categories/widgets are merged together and rendered as one logical screen.
 
public abstract class AbstractOptionsScreenActivity extends PreferenceActivity {
 private final static String TAG = "GameLib";
 
 /**
  * Encapsulation class for a single custom preference XML resource and the optional {@link PreferenceCategory}
  * that its inner {@link Preference} widgets are located in.
  */
 protected class CustomPreferenceScreen {
  /** Name of the XML resource file that contains the custom preference screen */
  public String xmlResFilename;
  
  /**
   * If the name matches the key of an existing preference category in the root preference screen,
   * then the custom preference screen's contents are appended here. Otherwise, a new preference
   * category with the title set to the value of this name will be created and the contents added
   * there.
   */
  public String preferenceCategoryName;

  /**
   * Encapsulating class for an XML resource file that contains custom preferences implemented in the
   * game project that is using this library.
   * 
   * @param xmlResFilename The XML resource filename in the res/xml directory.
   * @param categoryName The key name of the existing preference category to add to, or the title value of
   * the new preference category to create. Can be null (and will append a new preference category with no title).
   */
  public CustomPreferenceScreen(String xmlResFilename, String categoryName) {
   this.xmlResFilename = xmlResFilename;
   this.preferenceCategoryName = categoryName;
  }
 }

 /**
  * Array list of {@link CustomPreferenceScreen}s containing the list of custom preference screens to show on
  * the options screen.
  */
 protected ArrayList<CustomPreferenceScreen> mCustomPreferenceScreensList;

 private RootPreferencesFragment mRootPreferencesFragment;

 @Override
 public void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);

  setContentView(R.layout.preferences_screen_layout);

  loadPreferences();
 }

 @Override
 public void onDestroy() {
  super.onDestroy();

  if (mRootPreferencesFragment != null) {
   // Avoid memory leaks with stale context references
   mRootPreferencesFragment.mContext = null;
  }
 }

 /**
  * Implement this method to add extra preference screen resources to the list that is referenced when
  * rendering the options screen.<br>
  * <br>
  * List is {@link #mCustomPreferenceScreensList} and element objects to insert are {@link CustomPreferenceScreen}.
  */
 protected abstract void addExtraCustomPreferenceResources();

 protected final void loadPreferences() {
  // Initialise the list of custom preference screen resources.
  // At a minimum there are game_preferences.xml and graphics_preferences.xml whose contents should
  // be implemented (i.e. the game specific preferences defined) in the game project using the library.
  mCustomPreferenceScreensList = new ArrayList<CustomPreferenceScreen>();

   // Map that the preferences defined in game_preferences.xml belong in the preference category with
   // the key pref_key_game_settings
  mCustomPreferenceScreensList.add(new CustomPreferenceScreen("game_preferences", "pref_key_game_settings"));

  // Map that the preferences defined in graphics_preferences.xml belong in the preference category with
  // the key pref_key_graphics_settings
  mCustomPreferenceScreensList.add(new CustomPreferenceScreen("graphics_preferences", "pref_key_graphics_settings"));

   // Optionally, any subclass may implement the following method to add further entries to the list,
   // so we invoke that method here to ensure they're loaded in.
  addExtraCustomPreferenceResources();

  // The way to handle preferences changed since API 11 (Honeycomb). From API 11 onwards, it is recommended to use
  // PreferenceFragment inside of a normal Activity instead of PreferenceActivity. Here we will handle both for the
  // older Android APIs and the new ones.
  // Source: http://developer.android.com/guide/topics/ui/settings.html
  if (isFragmentSupported()) {
   loadPreferencesWithPreferenceFragment();
  } else {
   loadPreferencesWithPreferenceActivity();
  }
 }

 @SuppressWarnings("deprecation")
 private void loadPreferencesWithPreferenceActivity() {

  Log.d(TAG, "Loading preferences with the old preference activity method");

  // Load the root preferences using PreferenceActivity (which this class subclasses).
  // These preferences are common to all games that use the library.
  addPreferencesFromResource(R.xml.root_preferences);

  // Render the custom preference screens and their widgets
  addCustomPreferenceScreens(this, getPreferenceScreen(), mCustomPreferenceScreensList);
 }

 @TargetApi(11)
 private void loadPreferencesWithPreferenceFragment() {
  Log.d(TAG, "Loading preferences with the new preference fragment method");

  // Instantiate the root preferences fragment
  mRootPreferencesFragment = new RootPreferencesFragment();
  mRootPreferencesFragment.mContext = this;
  mRootPreferencesFragment.mCustomPreferenceScreenResourcesList = this.mCustomPreferenceScreensList;
  getFragmentManager().beginTransaction().replace(android.R.id.content, mRootPreferencesFragment).commit();
 }

 private static void addCustomPreferenceScreens(
   Context context,
   PreferenceScreen rootPreferenceScreen,
   ArrayList<CustomPreferenceScreen> customPreferenceScreensList) {
  if (context == null || rootPreferenceScreen == null || customPreferenceScreensList == null) {
   return;
  }

  // Iterate over the list of custom preference screens
  for (CustomPreferenceScreen customPrefScreen : customPreferenceScreensList) {
   if (customPrefScreen != null && customPrefScreen.xmlResFilename != null) {
    // Reference to the preference category to place the preference widgets in
    PreferenceCategory currentPreferenceCategory = null;

    // Search for a category matching the name of the specified key
     for (int i = 0; i < rootPreferenceScreen.getPreferenceCount(); i++) {
     // Get the current preference object in the root preference screen's hierarchy
     Preference currentRootPreferenceScreenPreference = rootPreferenceScreen.getPreference(i);

     // Check if current root preference screen preference is a preference category and if its key matches the
     // key we were specified
     if (currentRootPreferenceScreenPreference != null && currentRootPreferenceScreenPreference instanceof PreferenceCategory) {
      if (((PreferenceCategory) currentRootPreferenceScreenPreference).getKey() != null
        && ((PreferenceCategory) currentRootPreferenceScreenPreference).getKey().equals(customPrefScreen.preferenceCategoryName)) {
       // A match - this is the preference category where to add our custom preferences
       currentPreferenceCategory = (PreferenceCategory) currentRootPreferenceScreenPreference;
       break;
      }
     }
    }

    // Otherwise create a new preference category object
    if (currentPreferenceCategory == null) {
     // Instantiate a new preference category to hold the custom preference screen
     currentPreferenceCategory = new PreferenceCategory(context);
     currentPreferenceCategory.setTitle(customPrefScreen.preferenceCategoryName);
     
     // Add the new preference category to the end of the root preference screen
     rootPreferenceScreen.addPreference(currentPreferenceCategory);
    }

    // Add the contents of the custom preference screen to the preference category
    int resId = context.getResources().getIdentifier(customPrefScreen.xmlResFilename, "xml", context.getPackageName());
    PreferenceScreen resCustomPreferenceScreen = inflatePreferenceScreenFromResource(context, resId);
    int order = currentPreferenceCategory.getPreferenceCount() - 1; // offset by the number of preferences already in the group so custom preferences are appended
    if (resCustomPreferenceScreen != null) {
      for (int j = 0; j < resCustomPreferenceScreen.getPreferenceCount(); j++) {
      Preference preferenceWidget = resCustomPreferenceScreen.getPreference(j);
       preferenceWidget.setOrder(++order);
      currentPreferenceCategory.addPreference(preferenceWidget);
     }
    }
   }
  }
 }

 /**
  * Subclassed {@link PreferenceFragment} to load preferences for Android devices running 3.0 (Honeycomb API 11) and up.
  *
  */
 @TargetApi(11)
 public static class RootPreferencesFragment extends PreferenceFragment {
  protected Context mContext;
  protected ArrayList<CustomPreferenceScreen> mCustomPreferenceScreenResourcesList;

  public RootPreferencesFragment() {
  }

  @Override
  public void onCreate(final Bundle savedInstanceState) {
   super.onCreate(savedInstanceState);

   // Load the root preferences using PreferenceFragment.
   // These preferences are common to all games that use the library.
   addPreferencesFromResource(R.xml.root_preferences);

   // Render the custom preference screens and their widgets
   addCustomPreferenceScreens(mContext, getPreferenceScreen(), mCustomPreferenceScreenResourcesList);
  }
 }

 /**
  * Inflates a {@link android.preference.PreferenceScreen PreferenceScreen} from the specified
  * resource.<br>
  * <br>
  * The resource should come from {@code R.xml}.
  * 
  * @param context The context.
  * @param resId The ID of the XML file.
  * @return The preference screen or null on failure.
  */
 protected static PreferenceScreen inflatePreferenceScreenFromResource(Context context, int resId) {
  try {
   // The Android API doesn't provide a publicly available method to inflate preference
   // screens from an XML resource into a PreferenceScreen object so we use reflection here
   // to get access to PreferenceManager's private inflateFromResource method.
   Constructor<PreferenceManager> preferenceManagerCtor = PreferenceManager.class.getDeclaredConstructor(Context.class);
   preferenceManagerCtor.setAccessible(true);
   PreferenceManager preferenceManager = preferenceManagerCtor.newInstance(context);
   Method inflateFromResourceMethod = PreferenceManager.class.getDeclaredMethod(
     "inflateFromResource", Context.class, int.class, PreferenceScreen.class);
   return (PreferenceScreen) inflateFromResourceMethod.invoke(preferenceManager, context, resId, null);
  } catch (Exception e) {
   Log.w(TAG, "Could not inflate preference screen from XML resource ID "   resId, e);
  }

  return null;
 }
 
 /**
  * Convenience method to check whether the device's Android version >= API 11 (3.0+)
  * as the {@link android.support.v4.app.Fragment} API is only available since API 11.
  * 
  * @return true if Fragment and associated classes are supported, false otherwise
  */
 public static boolean isFragmentSupported() {
   return android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.HONEYCOMB;
 }
}
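One step not shown above (so treat this as an assumed sketch): as with any activity, the game project's concrete subclass still has to be declared in the game's AndroidManifest.xml. The attribute values here are illustrative only:

<activity
    android:name=".OptionsScreenActivityImpl"
    android:label="Options" />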
Download Eclipse Game Project and Game Library Project.

Saturday 10 November 2012

Sharing a Single Eclipse Workspace on 2 Computers via Dropbox

I use 2 computers for development: my main office desktop running Windows 7 and my MacBook Air running OS X. When in my office I prefer the desktop's big keyboard and display to work on, but I also like to be able to just grab my Air and continue development somewhere else, like the nearby beach cafe or the lounge for coding while watching TV, without having to manually copy files each time I switch between computers.

I came up with a little hack to let me relatively easily share my Eclipse workspace using Dropbox. For those who don't know, Dropbox is a service that automatically syncs files like documents and photos across all your devices: computers, smartphones etc.

The major caveat of sharing a single Eclipse workspace across two computers with different operating systems is that many of the workspace's settings are path dependent relative to the host operating system (think Windows \ versus OS X/Linux /). All these configurations are saved in the .metadata folder at the top of your workspace. I got around these issues by keeping two copies of this folder, one for the Windows based workspace and one for the OS X based workspace. A wrapper script copies the relevant folder into place before starting Eclipse, and backs it up to save any new changes after Eclipse exits. This approach means you cannot run Eclipse simultaneously on both machines, otherwise the workspace's configuration may get corrupted.

Generally any changes made to a project on one machine will then be reflected on the other machine as well. Try to avoid using hardcoded paths when configuring the build paths for a project. Instead use Eclipse's User Libraries as they will be saved unique to each operating system's workspace settings.

Some people may say why bother with such a setup like this and instead just use a version control system. I am intending to also use Git for my projects but for tasks that aren't ready to be committed I think this Dropbox approach is fine to allow semi-seamless transition between two development machines.

Steps to Share an Eclipse Workspace On Windows and OS X Machines

These steps should be applicable to a Windows/Linux combination as well, substituting Linux in place of OS X.
  1. Install the same version of Eclipse separately on each machine.
  2. Sign up to Dropbox and install the client software on both computers.
  3. Make a folder in your Dropbox for your workspace (e.g. C:\User\soliax\Dropbox\eclipse-workspace or /Users/soliax/Dropbox/eclipse-workspace). If Dropbox is set up properly, you'll only have to create it on one computer and it will automatically be sync'd (created) on the other.
  4. Launch Eclipse on one of your computers, let's say Windows, and specify the location of your workspace folder. Optionally, you may set up workspace configuration at this time, like installing any plugins, setting classpaths etc. Exit Eclipse when you're finished.
  5. With Eclipse closed, rename the .metadata folder to .metadata-windows.
  6. Repeat steps 4. and 5. but on your OS X computer and rename the folder to -osx, not -windows.
  7. Create a batch script on your Windows computer and save it as something like eclipse-dropbox.bat in the Eclipse installation folder. This is the wrapper script to start Eclipse. Copy/paste the following contents in. Configure the WORKSPACE variable to point to your actual workspace folder in your dropbox.
    @echo off
    REM Eclipse startup script for workspaces shared amongst two computers on Dropbox.
    REM Set path of your Eclipse workspace
    set WORKSPACE=%USERPROFILE%\Dropbox\Dev\workspaces\aroha
    echo Restoring Windows metadata for Eclipse...
    @call xcopy /HIEQRY "%WORKSPACE%\.metadata-windows" "%WORKSPACE%\.metadata"
    call eclipse.exe
    echo Backing up Windows metadata for Eclipse...
    @call xcopy /HIEQRY "%WORKSPACE%\.metadata" "%WORKSPACE%\.metadata-windows"
    
  8. If you want to pin this script to your Windows taskbar, try the following steps:
    1. Rename the *.bat script to *.exe. Then drop the icon on to your taskbar. For some silly reason Windows doesn't allow bat/cmd files to easily be pinned.
    2. Rename the script back to *.bat.
    3. Right-click the pinned icon, then right click the filename at the top of the menu (e.g. eclipse-dropbox) and choose Properties from the next menu.
    4. Rename the .exe extension to .bat.
    5. Change the icon as well if you wish.
  9. Create a shell script on your OS X computer and save it anywhere (I used /usr/bin) with a name like eclipse-dropbox. This is the wrapper script to start Eclipse. Make sure you run chmod +x on it in a Terminal window (or similar) to make the script executable. Copy/paste the following contents in. This assumes Eclipse is installed in your Applications folder. Configure the WORKSPACE variable to point to your actual workspace folder in your Dropbox.
    #!/bin/sh
    # Eclipse startup script for workspaces shared amongst two computers on Dropbox.
    
    # Set path of your Eclipse workspace
    WORKSPACE=$HOME/Dropbox/Dev/workspaces/aroha
    
    # Copy the backup .metadata folder used for the OS X version of Eclipse to the actual working copy
    echo Restoring OS X metadata for Eclipse...
    rm -rf "${WORKSPACE}/.metadata"
    cp -Rfp "${WORKSPACE}/.metadata-osx" "${WORKSPACE}/.metadata"
    
    # Launch Eclipse
    open -W /Applications/Eclipse
    
    # Backup the working copy of .metadata for the OS X version of Eclipse when Eclipse closes
    echo Backing up OS X metadata for Eclipse...
    rm -rf "${WORKSPACE}/.metadata-osx"
    cp -Rfp "${WORKSPACE}/.metadata" "${WORKSPACE}/.metadata-osx"
    
  10. If you want to add the OS X shell script to your dock, use Automator to create an application that has a single job to execute a shell script and point it to the script you created in step 9. Then you can drop that application onto your dock.
Now use these wrappers to start Eclipse. Make sure Eclipse is closed on the other computer before starting it. As the workspaces are technically different, any changes you make to one you will need to also make to the other; if you open a project on Windows, that project won't automatically be open next time you launch the OS X workspace. Changes you make to source code files and project specific settings should automatically be reflected (assuming you allow enough time for the files to sync to/from Dropbox's servers).

Wednesday 17 October 2012

OS X Development Environment for Vuforia Android Apps + Streaming Video Playback Sample

Qualcomm has instructions on setting up an environment to develop Vuforia based apps for Android (for those of you unaware, Qualcomm's mobile augmented reality SDK was renamed from QCAR to Vuforia). Those instructions focus on Windows, with some tips/hints for Linux and OS X. This blog article will focus on setting up the same kind of development environment on OS X; the plan is to document as I give it a go and provide a bit more in-depth detail. At the end I'll cover running the Video Playback sample app and modifying it slightly to stream video off the internet. I'll also cover debugging native code from within Eclipse.

Components to Install

  • JDK (1.6.0_35)
  • Eclipse IDE for Java (Juno)
  • Android SDK (r20)
  • Android ADT (Eclipse plugin for r20)
  • Android NDK (r8b - native development kit)
  • Android CDT (C/C++ development tools)
  • Vuforia SDK (1.5.9 - Qualcomm's augmented reality SDK)
  • Xcode command line/developer tools

Java, Eclipse and Android SDK

The Qualcomm setup docs suggest using Java 1.7, but I'm going to stick with the stock standard JDK that Apple bundled with my Mountain Lion (Java 1.6.0_u35-b10 on OS X 10.8.2). They also suggest the 32-bit version of Eclipse for Mac, but I've already got the 64-bit Juno so am going to stick with that too.

There's already a plethora of info on setting up a standard Android development environment with the Eclipse IDE and Android SDK, so I suggest either searching Google or going straight to the official Android docs site for help with this first part. At a minimum, you will need the Android SDK, Eclipse for Java and the ADT (Android Development Tools) Eclipse plugin installed.

Note: install the NDK Plugins at the same time as you install the ADT plugin in Eclipse.


Android NDK Installation

The following instructions assume you already have a basic Android development environment setup with Eclipse.

At the native level, Android runs programs originally written in C/C++ (and no doubt some ASM). The majority of Android apps are written purely in Java, running on the Android-optimized virtual machine called Dalvik. Using the NDK (native development kit), developers can also write parts of their apps in C/C++. Such apps are a blend of Java and C/C++ glued together with JNI, the Java Native Interface, which is the bridge between code written in the two languages.

Vuforia itself is written in C++. The SDK library was no doubt written in native code for the increased performance of its critical features, like the computer vision algorithms and OpenGL code.

In order to compile NDK code on OS X you'll need Xcode and its Command Line Tools installed for programs like the GNU compiler gcc and the build tool make.

  1. Download Xcode from here and once you've installed it go to Xcode -> Preferences -> Downloads and choose to install Command Line Tools (hopefully it's in your list). I am using Xcode 4.5.1. Alternatively, although I haven't tested it, you can try downloading and installing Xcode bundled with the Developer Tools package from here. In either case, you'll probably need to create a free developer account with Apple.
  2. Download the Android NDK for OS X from here and unzip the archive in the same folder as you installed the Android SDK. I installed version r8b under /Users/wocky/Documents/android/android-ndk-macosx.
  3. The installation path needs to be added to the environment variable called PATH. Do so by adding it to the end of the PATH configuration in ~/.bash_profile. For example: PATH=$PATH:/Users/wocky/Documents/android/android-sdk-macosx/platform-tools/:/Users/wocky/Documents/android/android-ndk-macosx; export PATH
  4. Change to the samples/san-angeles directory (under the NDK installation directory) and run the command ndk-build from a terminal window to ensure the path setting in step 3 is correct. You should see some output saying the San Angeles project's C files have been compiled and the samples/san-angeles/libs/armeabi/libsanangeles.so library created.
  5. Next we will install the CDT plugin for Eclipse, the C/C++ Development Tools plugin. Go to Help -> Install New Software and choose the URL of the version that matches your Eclipse (http://download.eclipse.org/releases/juno for Juno). Under the Programming Languages category check C/C++ Development Tools and proceed to install it. Eclipse will need to be restarted after the installation (it should prompt you anyway).
  6. Install the NDK Plugins for Eclipse that can be found on the same site as the ADT plugin. Following the same steps as above, the URL will be something like https://dl-ssl.google.com/android/eclipse.
  7. Lastly, tell Eclipse where the NDK is installed by specifying the installation path in Eclipse -> Preferences -> Android -> NDK.
I won't go over the specifics of creating a new Android NDK project, but here are some more articles to reference. They cover technical details of NDK/JNI, setting up the environment, and debugging. I haven't run through these articles myself so can't attest to their validity.


Vuforia Augmented Reality SDK Setup

  1. Download the SDK for OS X from here. You will probably need to register a free developer account with Qualcomm. I am using SDK version 1.5.9.
  2. Once the package has downloaded, unarchive it and run its GUI installer. I chose to house Vuforia in the same folder as the Android SDK and NDK: /Users/wocky/Documents/android/vuforia-sdk-android-1-5-9 in my case.
  3. We need to make Eclipse aware of where the Vuforia SDK is. Open Eclipse -> Preferences -> Java -> Build Path -> Classpath Variables, create a new variable called QCAR_SDK_ROOT and point it to your installation folder.


Running Sample VideoPlayback App and Debugging with Eclipse

The Android NDK section above gives some links at the bottom to other blogs that are an excellent source for learning more about the basics of Android NDK projects' structure and inner workings. I'm going to end this article by documenting two areas of interest: the video playback feature that was made available in Vuforia 1.5, and debugging NDK applications from within Eclipse - something I had a lot of trouble setting up on my Windows box a few months back.

Qualcomm announced a sample video playback app for Android and iOS earlier this year here. I downloaded the sample app for Android and placed the VideoPlayback directory from the zip file under the samples folder in the Vuforia SDK installation path.

Installing the App

  1. Download the Eclipse project from the announcement link above (download).
  2. In Eclipse, open File -> Import -> Android -> Existing Android Code Into Workspace and search for the folder where you copied the sample VideoPlayback app to.
  3. Rename the Eclipse project to something simple like VideoPlayback if you prefer, instead of the default Java package name that Eclipse tends to use for imported projects.
  4. Open a terminal, go to the VideoPlayback directory and run the ndk-build command to build the native library code. The sources should compile with no errors.
  5. Right-click the VideoPlayback project in Eclipse and choose Refresh to make Eclipse aware of the new files created by step 4.
  6. Right-click again and select Run As -> Android application to run it on your device.
  7. You'll need a printout of the stones and/or chips targets (or you can display the image on a screen); they can be found in samples/VideoPlayback/media under the Vuforia installation directory.
  8. Once the VideoPlayback app is running on your Android device and you see the video input from the camera, point the camera at the target and a play button should appear that starts video playback.
  9. As an optional step you can modify the VideoPlayback sample to stream a video from the internet rather than play the mpeg4 videos packaged with the app. 
Streaming Video Media from the Internet

To stream a video from the internet instead of playing the default mpeg4 movies included in the sample app, open VideoPlaybackHelper.java and modify the load method: set the MediaPlayer object's data source to a file on the internet and comment out the lines that load the movie files from the asset resources.

try
{
    mMediaPlayer = new MediaPlayer();

    // This example shows how to load the movie from the assets folder of the app.
    // However, if you would like to load the movie from the sdcard or from a network
    // location, simply comment out the three lines below...
    //AssetFileDescriptor afd = mParentActivity.getAssets().openFd(filename);
    //mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
    //afd.close();

    // ...and uncomment this one
    mMediaPlayer.setDataSource("http://people.sc.fsu.edu/~jburkardt/data/mp4/cavity_flow_movie.mp4");
    ...
}

I used this video as a sample. The file needs to be served over either HTTP or RTSP and capable of progressive download (otherwise streaming won't work). Here's a screenshot of it with the chips marker being displayed on my MacBook Air's screen.


I was even able to play a Flash video, so potentially YouTube videos could be used if you know the direct link to the FLV file (but apparently that breaches their terms of service). This may only be available on later versions of Android: it worked with ICS (4.0.4) on my Galaxy S3 but not on my 2.3.4 Nexus S.

Debugging Native Code in Eclipse

About a year ago when I first started playing with Vuforia and Android's NDK, I remember it was a real nightmare to get debugging working. At that time I was using Eclipse on Windows 7. It seems the ADT plugin has come a wee way since then and the process has been simplified. I don't recall a direct "Run As -> Android native application" option back then. If memory serves me right, you needed to put a breakpoint in the Java code right before the C/C++ native code was called, and then, during a normal Java debug session, it would jump into the native code space. Here's a summary of what I needed to do to get debugging working with ADT r20 on my new setup:
  1. Ensure you have set the path to the NDK installation in Eclipse's Android preferences and the CDT plugin is installed.
  2. Right-click the VideoPlayback project and select Android Tools -> Add native support.
  3. Eclipse will most likely automatically switch from the Java to the C/C++ perspective (think of an Eclipse perspective as an "editing mode" for that particular language/feature).
  4. If you are using version r20 of the Android ADT Eclipse plugin, there's a bug that will give you a bunch of warnings and errors with the cpp files in the jni folder. The actual building of these source files is handled externally by the ndk-build script, but if you use Eclipse as an editor it's clutter on the screen you don't need. Following the advice here, I disabled all the items under project properties -> C/C++ General -> Code Analysis as a temporary workaround.
  5. Add a breakpoint somewhere in one of the cpp files. I chose the beginning of the Java_com_qualcomm_QCARSamples_VideoPlayback_VideoPlayback_initApplicationNative function in VideoPlayback.cpp.
  6. Lastly, select the project in Eclipse's package explorer, right-click and choose Debug As -> Android Native Application.
That's all folks.

Monday 15 October 2012

Catchup

Almost a year has passed since I wrote Android AR 101. Since then I've become a father, worked in the UK and found myself back in Fukuoka doing AR-related development again. I've also picked up iOS programming and released my first iPhone app (FlagIt! - a travel app to flag countries visited or on your wishlist of places to travel). I've moved my daily development over to my recently acquired MacBook Air. Never having owned my own Mac before, I purchased it mainly for my iOS work but have fallen in love with it. Its compactness, lightness, touch pad and long-lasting battery have really turned me into a can-program-anywhere kinda person, whether that be at the beach or the local pub with beer in hand :)

Wednesday 25 July 2012

Get country name from coordinates or coordinates from search term using Objective-C and Google API

One project I have been working on recently saw me messing around with Objective-C as I created an app using MKMapView to interact with Google Maps. I'm only hoping that when I release the app Apple aren't gonna be stupid and reject it, saying I might as well rewrite it using their new APIs since they're doing away with Google Maps in iOS 6. Oh well, we'll see...

Anyway, this time around I am posting a couple of methods that outline, firstly, how I retrieve a country name by doing a JSON query with the latitude and longitude of a place, and secondly how I get the coordinates for a search term (a string).

As usual I'll let the code speak for itself...

To get the name of a country from latitude and longitude, with the ability to specify which language the country name comes back in...
// Do reverse geocode lookup on latitude/longitude to get country name based on specified locale
- (NSString*)getCountryNameFromCoordinates:(CLLocationCoordinate2D)coordinates locale:(NSString*)language
{
    NSLog(@"Querying Google location API for %@ country name for latitude %f and longitude %f...", language, coordinates.latitude, coordinates.longitude);

    // Get JSON contents from Google API
    NSString *url = [[NSString stringWithFormat:@"http://maps.googleapis.com/maps/api/geocode/json?latlng=%f,%f&sensor=false&language=%@", coordinates.latitude, coordinates.longitude, language] stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSLog(@"Query URL is: %@", url);
    NSError *error = nil;
    NSString *jsonString = [NSString stringWithContentsOfURL:[NSURL URLWithString:url] encoding:NSUTF8StringEncoding error:&error];
    if (error != nil)
    {
        NSLog(@"Error doing reverse geocode lookup on %f, %f, lang=%@: %@", coordinates.latitude, coordinates.longitude, language, error.localizedDescription);
        return nil;
    }

    // Extract country by parsing JSON data
    SBJsonParser *jsonParser = [[SBJsonParser alloc] init];
    error = nil;
    NSDictionary *jsonObjects = [jsonParser objectWithString:jsonString error:&error];
    if (error != nil)
    {
        NSLog(@"Error parsing JSON response to reverse geocode lookup: %@", jsonParser.error);
        [jsonParser release];
        return nil;
    }

    // The "results" value is an array of result dictionaries
    NSArray *results = [jsonObjects objectForKey:@"results"];
    NSString *country = nil;
    for (NSDictionary *item in results)
    {
        if ([[item objectForKey:@"types"] containsObject:@"country"])
        {
            country = [[[item objectForKey:@"address_components"] objectAtIndex:0] objectForKey:@"long_name"];
            NSLog(@"Found matching country name: %@", country);
            break;
        }
    }
    if (country == nil)
    {
        NSLog(@"Couldn't find a matching country name");
    }

    // Only release the parser we alloc'd; the parsed objects (and anything
    // fetched out of them) are not owned here, so releasing them would over-release
    [jsonParser release];

    return country;
}

And to center your map (or whatever behaviour you so desire) on the coordinates obtained from a search on a specified location string...

    
- (void)searchCoordinatesForAddress:(NSString *)inAddress
{
    NSLog(@"Querying Google location API for latitude/longitude coordinates for search term %@", inAddress);

    // Get JSON contents from Google API
    NSString *url = [[NSString stringWithFormat:@"http://maps.googleapis.com/maps/api/geocode/json?address=%@&sensor=false", inAddress] stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSLog(@"Query URL is: %@", url);
    NSError *error = nil;
    NSString *jsonString = [NSString stringWithContentsOfURL:[NSURL URLWithString:url] encoding:NSUTF8StringEncoding error:&error];
    if (error != nil)
    {
        NSLog(@"Error doing coordinate lookup on %@: %@", inAddress, error.localizedDescription);
        return;
    }

    // Extract coordinates by parsing JSON data
    SBJsonParser *jsonParser = [[SBJsonParser alloc] init];
    error = nil;
    NSDictionary *jsonObjects = [jsonParser objectWithString:jsonString error:&error];
    if (error != nil)
    {
        NSLog(@"Error parsing JSON response to extract coordinates: %@", jsonParser.error);
        [jsonParser release];
        return;
    }

    NSArray *results;
    NSDictionary *location;
    @try {
        results = [jsonObjects objectForKey:@"results"];
        location = [[[results objectAtIndex:0] objectForKey:@"geometry"] objectForKey:@"location"];
    }
    @catch (NSException *exception) {
        NSLog(@"Could not find searched location's coordinates");
        [jsonParser release];
        return;
    }

    // Get latitude and longitude as double values
    double latitude = [[location objectForKey:@"lat"] doubleValue];
    double longitude = [[location objectForKey:@"lng"] doubleValue];

    // Only release the parser we alloc'd; the parsed objects are not owned here
    [jsonParser release];

    // Treat (0, 0) as a failed lookup
    if (longitude == 0 && latitude == 0)
    {
        NSLog(@"Could not find searched location's coordinates");
        return;
    }
    NSLog(@"Coordinates found for searched location: %f %f", latitude, longitude);

    // I zoom my map to the area in question.
    [self zoomMapAndCenterAtLatitude:latitude andLongitude:longitude];
}

News

This blog hasn't had any activity for the past few months. After my wife and I embarked on our round the world trip back in February, stopping off at home in New Zealand for our second wedding, we found out she was pregnant :)

We are delighted to say we are expecting our first baby boy in October. We didn't end up completing most of the trip we had planned and instead I've been working for the past few months out of the UK. I will be moving back to Japan later this year and hope to refocus my efforts on augmented reality work then. Not that I'm expecting much free time with fatherdom looming, but I will try and post the odd helpful article when I can!

Monday 23 January 2012

Creating a custom Android Intent Chooser

I recently added a feature to my Zedusa image resizer application that required me to write a custom intent chooser. The reason for this was so I could add a checkbox to the list, letting the user make the selected application the default one going forward.


I thought I'd share a snippet of the code on my blog as it could be useful for some. Feel free to ask any questions about the inner workings of it, otherwise I'll leave it up to you to read and understand. Please be aware I stripped some code out, so I haven't actually run the exact code below on my phone.

public void startDefaultAppOrPromptUserForSelection() {
    String action = Intent.ACTION_SEND;

    // Get list of handler apps that can send
    Intent intent = new Intent(action);
    intent.setType("image/jpeg");
    PackageManager pm = getPackageManager();
    List<ResolveInfo> resInfos = pm.queryIntentActivities(intent, 0);

    boolean useDefaultSendApplication = sPrefs.getBoolean("useDefaultSendApplication", false);
    if (!useDefaultSendApplication) {
        // Referenced http://stackoverflow.com/questions/3920640/how-to-add-icon-in-alert-dialog-before-each-item

        // Class for a singular activity item on the list of apps to send to
        class ListItem {
            public final String name;
            public final Drawable icon;
            public final String context;
            public final String packageClassName;
            public ListItem(String text, Drawable icon, String context, String packageClassName) {
                this.name = text;
                this.icon = icon;
                this.context = context;
                this.packageClassName = packageClassName;
            }
            @Override
            public String toString() {
                return name;
            }
        }

        // Form those activities into an array for the list adapter
        final ListItem[] items = new ListItem[resInfos.size()];
        int i = 0;
        for (ResolveInfo resInfo : resInfos) {
            String context = resInfo.activityInfo.packageName;
            String packageClassName = resInfo.activityInfo.name;
            CharSequence label = resInfo.loadLabel(pm);
            Drawable icon = resInfo.loadIcon(pm);
            items[i] = new ListItem(label.toString(), icon, context, packageClassName);
            i++;
        }

        ListAdapter adapter = new ArrayAdapter<ListItem>(
                this,
                android.R.layout.select_dialog_item,
                android.R.id.text1,
                items) {

            @Override
            public View getView(int position, View convertView, ViewGroup parent) {
                // Use super class to create the View
                View v = super.getView(position, convertView, parent);
                TextView tv = (TextView) v.findViewById(android.R.id.text1);

                // Put the icon drawable on the TextView (support various screen densities)
                int dpS = (int) (32 * getResources().getDisplayMetrics().density + 0.5f);
                items[position].icon.setBounds(0, 0, dpS, dpS);
                tv.setCompoundDrawables(items[position].icon, null, null, null);

                // Add margin between image and name (support various screen densities)
                int dp5 = (int) (5 * getResources().getDisplayMetrics().density + 0.5f);
                tv.setCompoundDrawablePadding(dp5);

                return v;
            }
        };

        // Build the list of send applications
        AlertDialog.Builder builder = new AlertDialog.Builder(this);
        builder.setTitle("Choose your app:");
        builder.setIcon(R.drawable.dialog_icon);
        CheckBox checkbox = new CheckBox(getApplicationContext());
        checkbox.setText(getString(R.string.enable_default_send_application));
        checkbox.setOnCheckedChangeListener(new OnCheckedChangeListener() {

            // Save user preference of whether to use default send application
            @Override
            public void onCheckedChanged(CompoundButton paramCompoundButton,
                    boolean paramBoolean) {
                SharedPreferences.Editor editor = sPrefs.edit();
                editor.putBoolean("useDefaultSendApplication", paramBoolean);
                editor.commit();
            }
        });
        builder.setView(checkbox);
        builder.setOnCancelListener(new OnCancelListener() {

            @Override
            public void onCancel(DialogInterface paramDialogInterface) {
                // do something
            }
        });

        // Set the adapter of items in the list
        builder.setAdapter(adapter, new DialogInterface.OnClickListener() {
            @Override
            public void onClick(DialogInterface dialog, int which) {
                SharedPreferences.Editor editor = sPrefs.edit();
                editor.putString("defaultSendApplicationName", items[which].name);
                editor.putString("defaultSendApplicationPackageContext", items[which].context);
                editor.putString("defaultSendApplicationPackageClassName", items[which].packageClassName);
                editor.commit();

                dialog.dismiss();

                // Start the selected activity sending it the URLs of the resized images
                Intent intent = new Intent(Intent.ACTION_SEND);
                intent.setType("image/jpeg");
                intent.setClassName(items[which].context, items[which].packageClassName);
                startActivity(intent);
                finish();
            }
        });

        AlertDialog dialog = builder.create();
        dialog.show();

    } else { // Start the default send application

        // Get default app details saved in preferences
        String defaultSendApplicationName = sPrefs.getString("defaultSendApplicationName", "<null>");
        String defaultSendApplicationPackageContext = sPrefs.getString("defaultSendApplicationPackageContext", "<null>");
        String defaultSendApplicationPackageClassName = sPrefs.getString("defaultSendApplicationPackageClassName", "<null>");
        if (defaultSendApplicationPackageContext.equals("<null>") || defaultSendApplicationPackageClassName.equals("<null>")) {
            Toast.makeText(getApplicationContext(), "Can't find app: " + defaultSendApplicationName
                    + " (" + defaultSendApplicationPackageClassName + ")", Toast.LENGTH_LONG).show();

            // Don't have the default application details in the prefs file, so unset
            // the use-default-app preference and rerun this method
            SharedPreferences.Editor editor = sPrefs.edit();
            editor.putBoolean("useDefaultSendApplication", false);
            editor.commit();
            startDefaultAppOrPromptUserForSelection();
            return;
        }

        // Check the app is still installed (getApplicationInfo throws if it isn't)
        try {
            getPackageManager().getApplicationInfo(defaultSendApplicationPackageContext, 0);
        } catch (PackageManager.NameNotFoundException e) {
            Toast.makeText(getApplicationContext(), "Can't find app: " + defaultSendApplicationName
                    + " (" + defaultSendApplicationPackageClassName + ")", Toast.LENGTH_LONG).show();

            // Default application no longer installed, so unset the
            // use-default-app preference and rerun this method
            SharedPreferences.Editor editor = sPrefs.edit();
            editor.putBoolean("useDefaultSendApplication", false);
            editor.commit();
            startDefaultAppOrPromptUserForSelection();
            return;
        }

        // Start the selected activity
        intent = new Intent(Intent.ACTION_SEND);
        intent.setType("image/jpeg");
        intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
        intent.setClassName(defaultSendApplicationPackageContext, defaultSendApplicationPackageClassName);
        startActivity(intent);
        finish();
        return;
    }
}
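In case it helps, here's a rough sketch of how the method might be wired into an Activity. Note this is my hypothetical scaffolding, not code from Zedusa: sPrefs is the SharedPreferences field the method above relies on, and its initialization wasn't part of the original snippet.

import android.app.Activity;
import android.content.SharedPreferences;
import android.os.Bundle;

public class ShareActivity extends Activity {

    // The preferences field read and written by startDefaultAppOrPromptUserForSelection()
    private SharedPreferences sPrefs;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Back the chooser's "remember my choice" behaviour with this Activity's preferences
        sPrefs = getPreferences(MODE_PRIVATE);

        // Prompts with the custom chooser, or launches the remembered default app
        startDefaultAppOrPromptUserForSelection();
    }

    // startDefaultAppOrPromptUserForSelection() from the snippet above would live here
}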

Friday 20 January 2012

ARWedding - My first AR app


I had a bit of a hiatus on AR stuff while I got caught up doing development for other Android apps. I'm getting married in a week, so I thought I'd kick off by making an adaptation of Qualcomm's QCAR (since renamed Vuforia) sample app ImageTargets into a novelty wedding gift. It simply shows a photo of the two of us for the splash screen and a rose as the 3D model. I'll be releasing both Android and iPhone versions which you can find from the corresponding links on the top-right of my blog. The links to the projects' sources are below.

I won't go into detail about how I did what, as that's just a little too time-consuming with lots of wedding preparations to do ;-) If you do have any specific questions feel free to drop a comment and I'll try and reply when I can.

The customizations I made to the Android/iPhone versions are:

• Custom app icon
• Custom 3D model of a rose to replace the teapot
• Inserted rose-coloured textures (ideally I'd want multiple textures: one for the green of the stalk and another for the red of the flower, but that turned out to be a more advanced topic that I decided to put aside for a rainy day)
• Changed the trackables (markers) to a QR code (see below) and the faces on 1,000, 5,000 and 10,000 yen Japanese notes
• Played with kObjectScale to get a better-sized rose and rotated the projection matrix to make the rose appear as if it were standing upright

Here are the links to the projects' source code. I was using the QCAR SDK 1.5.4 beta1, the latest at the time of coding.

• Android project for Eclipse. I was using Eclipse Helios, testing on my Nexus S 2.3.4 phone and my Asus Transformer EEE TF101 tablet running Android 3.2. You may need to edit the NDK settings so it can find the QCAR SDK properly.
• iPhone project for Xcode. I was using Xcode 3.2.3 and tested on my jailbroken iPhone 3GS running iOS 4.1 and on my iPhone 4 running iOS 5. The project lived in the same folder as the QCAR SDK sample projects; I'm not sure whether a different folder location would require some settings changes.

Besides the Japanese notes you can also use the following QR code as a marker. The colour of the rose associated with the QR code is red, whereas the money trackables correspond to a red-green coloured texture.

Friday 6 January 2012

Error 9015 on the Oneworld.com Round the World site

I thought it might be time for my first non-programming article. My wife and I are going to be doing a round the world trip for our honeymoon next month. I've been spending literally hours at One World's online RTW interactive planner trying to work out the cheapest and best route. It's quite a fun tool just to play with as there are so many destinations and heaps of permutations. The route we eventually settled on is shown in the map below, starting in Seoul:

You can see the little message at the bottom which basically means everything's validated - i.e. all the flights/stopovers and paths don't break any of One World's RTW ticket rules and regulations. And trust me, there's heaps. From a programmer's point of view I appreciate how complex such a system is; designing and coding the conditions for all that business logic must be a nightmare. With a system this big it's inevitable that there will be some bugs.

Anyway, onto the title of this post. After I validated my journey, confirmed the price etc., I went ahead with entering our personal info and then my credit card info. On the very last step of purchasing it sat there for about 3-5 minutes processing and then came back with error "9015", telling me to contact my Travel Assistance Desk for help.

The problem is this site doesn't seem to have a single point of contact for help. Eventually I rang American Airlines and got through to their RTW hotline (phone number +1800 247 3247), and the lady I spoke with was at least somewhat familiar with the system and had a list of common error codes on hand. Unfortunately 9015 wasn't in that list. She tried manually booking my ticket for me, but as the ticket wasn't starting in the US - it was starting in South Korea - they needed to get a quote from their office there. The base tariffs and taxes etc. are based on how many continents you stop in and differ greatly depending on the country you start in (Seoul was a very cheap place to start for people starting in Asia - I just catch the boat to Korea and hop on the RTW from there :)

To cut a long story short, I wasn't able to purchase my RTW ticket starting in Korea using the online planner with my credit card, because I needed either a) a Korean credit card (the billing address must be there; mine was Japanese) or b) to turn up in person with my credit card at the Seoul office of American Airlines, which was a non-option because I won't be in Seoul until 2 days before my world trip begins.

I'm now guessing that the 9015 error was due to my credit card's billing address not being in the same country as the starting point of my journey. In the end I booked over the phone through a friend's workplace in London for a sweet deal. The guys @ www.roundtheworldexperts.co.uk are great!