Sunday, June 29, 2014

Cecilia's Drawing Toy

Image from the game: a smiling sun on the right, and a set of color markers on the left.
My sister- and brother-in-law recently celebrated the birthday of their second daughter, Cecilia. She's not quite old enough to care about touchscreen devices yet, but I figured I'd get a head start on this one. So here you go: Cecilia's Drawing Toy.

Much like its "sister" software, Frances's Tracing Game, this one is available for download on the Play Store, or you can poke through the source code on GitHub. You can select a color and draw with your finger, then to erase, hold the whole device upside-down and shake it (like another drawing toy you might have heard of ;) ).


I was able to re-use quite a bit of the source code from Frances's Tracing Game for this one, so the build cycle wasn't as long. I feel like I'm pretty much getting the hang of the Android drawing toolkit; this app should play more nicely with different screen sizes and form factors than the first versions of Frances's Tracing Game did. It's missing a way to save images right now (though you can use the "screenshot" feature on most phones).

If you thumb through the source code, you'll find some goodies. It's all licensed under Apache, so feel free to grab any pieces and use them in your own projects (with proper sub-licensing and attribution). In particular, these modules are broken out and intended for re-use:


  • RandomSound: Give it a collection of sound resources and it will select, load, and play random ones continuously. Useful for footsteps, etc. There's some lag between the end of one and the beginning of the next that I should clean up.
  • OscillationSensor: Detects back-and-forth jerks in a linear direction. Useful for detecting device shaking.
  • FaceDownSensor: Detects when the device's screen is angled towards the floor, and sets a simple flag that the render thread can check (a rough sketch of the idea follows this list).
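
For a flavor of what the face-down detection involves, here is a minimal sketch of the idea, not the actual module from the repo; the class name and threshold are my own. The accelerometer's z-axis reads around -9.8 m/s² when the screen points at the floor, so a listener can watch for a sufficiently negative value and flip a volatile flag that a render thread can poll.

    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    /**
     * Minimal sketch of a face-down detector: when the accelerometer's z-axis
     * reads strongly negative, gravity is pulling "out of" the screen, i.e.
     * the screen is pointed at the floor. A render thread can poll isFaceDown().
     */
    public class SimpleFaceDownDetector implements SensorEventListener {
        // Roughly -9.81 m/s^2 when flat and face down; leave some margin.
        private static final float FACE_DOWN_THRESHOLD = -7.0f;

        private volatile boolean faceDown = false;

        public void register(Context context) {
            SensorManager manager =
                    (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
            Sensor accelerometer = manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
            manager.registerListener(this, accelerometer,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }

        public boolean isFaceDown() {
            return faceDown;
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // values[2] is the z-axis: positive when the screen faces up,
            // negative when it faces the floor.
            faceDown = event.values[2] < FACE_DOWN_THRESHOLD;
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Not needed for this sketch.
        }
    }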
As always, I dug up some likes and dislikes as I put this together.

Things I Liked

  1. Code re-use worked as intended. :) Both the initial layout of the project and the push to the Play Store were a lot simpler with templates ready to go.
  2. Instead of wrestling with Windows to build this one, I put together a development environment in a virtual machine running Debian Linux, with my favorite tool suite (emacs, git, and the command-line SDK) installed from packages and ready to go. It turned out smoother, to my mind; I spent less time thrashing about with the command line and more time writing code.
  3. The build-test cycle on a connected Android device is super-simple. I just kept "adb logcat" running in one console to spot-check installation failures, used "ant debug && ant installd" as my build command, and off I went.

Issues

  1. The timestamps on accelerometer events are extremely wonky. As I discussed previously, I eventually ignored them and tagged every incoming sensor event with a datetime as the sensor thread got hold of it; not accurate, but sufficient for OscillationSensor.
  2. The Play Store makes you (hypothetically) put together a lot of images and material for a proper launch. Screenshots for phone, for tablet, for 10" tablet, promo image, icon, big super-awesome icon if you end up in the "Featured" section... It'd be cool if at least some of that could be dynamically generated. I spent more time preparing and uploading PNGs than I did writing RandomSound.

Future Ideas

This'll be a fun one to keep updating, but really, the list of new things to add is small:

  • save the current image when the app quits or is killed
  • allow for mixing of your own marker colors
  • eraser (besides holding upside down and shaking)
  • undo

Enjoy!

If you have a little one in your life, feel free to grab the toy and let me know if they like it! Feedback is welcome on the game's Play Store page, here, or on Plus. Go make something cute. ;)



Sunday, June 1, 2014

Quirkiness of Android sensor library timestamps

Update 2014-06-23: The approach in the original post doesn't quite work as intended. Details at the end of the post.

I'm working on a sensor library that should give me the last timestamp at which some interesting sensor event happened.

So, the Android sensor library is a little quirky. When you get a sensor event (delivered on a sensor thread, so you shouldn't do any high-impact processing there), the event comes with an associated timestamp. The timestamp is documented thusly: "The time in nanosecond at which the event happened". The grammatically-bizarre English should be your clue that these docs may be slightly off. ;)

It turns out after a bit of Googling (tip o' the hat to StackOverflow, as usual) that the timestamp you receive isn't based on any particular zero point defined in the Android OS or the API; it's an arbitrary per-sensor value intended to let different measurements from the same sensor be compared, not to let those measurements be compared to other timestamped events. It's a known issue (bug concerning the documentation; bug concerning the behavior), but as of right now it's the way of the world for Android developers.

If you need to obtain a reference to some external timeframe (such as the oh-so-ubiquitous "milliseconds since 1970"), you'll have to put together your own workaround. The solution that's working for me right now is:

  1. Initialize your sensor listener with empty offset values dateBase and timestampBase.
  2. For each received sensor value (i.e. each call to onSensorChanged), check whether your offsets are still empty.
  3. If they are, make a one-time call to populate them:
    1. Grab a time pretty close to when the sensor event was fired via dateBase = (new Date()).getTime()
    2. Record timestampBase = event.timestamp from that same sensor event.
  4. To get the milliseconds-since-1970 timestamp of any new event, compute (event.timestamp - timestampBase) / 1000000L + dateBase

This calculation is prone to fail if for some reason onSensorChanged gets called too long after the sensor change occurs (it's fine if the difference is on the order of nanoseconds, and gets significantly less fine once you're in the 100ms-to-seconds range or higher). But it works for my needs.
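
For reference, here's a minimal sketch of those steps as a SensorEventListener. Only dateBase, timestampBase, and the conversion formula come from the list above; the class name and the handleEvent helper are placeholders of my own.

    import java.util.Date;

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    /**
     * Sketch of the offset workaround: on the first event, pair a wall-clock
     * reading with the sensor's own timestamp, then use that pair to translate
     * later sensor timestamps into milliseconds-since-1970.
     */
    public class WallClockSensorListener implements SensorEventListener {
        private long dateBase = -1;       // wall-clock ms at the first event
        private long timestampBase = -1;  // sensor-clock ns at the first event

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (timestampBase < 0) {
                // One-time initialization: anchor the sensor clock to wall time.
                dateBase = (new Date()).getTime();
                timestampBase = event.timestamp;
            }
            // Convert the event's nanosecond sensor timestamp into epoch millis.
            long eventMillis = (event.timestamp - timestampBase) / 1000000L + dateBase;
            handleEvent(event, eventMillis);
        }

        private void handleEvent(SensorEvent event, long eventMillis) {
            // Application-specific processing goes here.
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Unused in this sketch.
        }
    }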

Update 2014-06-23: So it turns out that this approach also doesn't work as desired; if the phone goes to sleep, the sensor clock skews relative to wall time (I assume the sensor clock isn't "ticking" while the phone is not active). I could detect sleep/wake events, but it becomes a guessing game regarding which scenarios could cause the phone to stop updating the sensor clock. My new plan is to call (new Date()).getTime() on every sensor event and take the small performance hit; for my purposes, I don't need highly accurate sensing.
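
A sketch of that simpler fallback, with the same placeholder names as above: ignore event.timestamp entirely and stamp each event with wall-clock time inside the callback.

    import java.util.Date;

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    /** Sketch of the fallback: skip the offset bookkeeping and use wall time. */
    public class CallbackTimeSensorListener implements SensorEventListener {
        @Override
        public void onSensorChanged(SensorEvent event) {
            // Coarser than the sensor timestamp, but immune to the sensor
            // clock pausing while the device sleeps.
            long eventMillis = (new Date()).getTime();
            handleEvent(event, eventMillis);
        }

        private void handleEvent(SensorEvent event, long eventMillis) {
            // Application-specific processing goes here.
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Unused in this sketch.
        }
    }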