
mLearning: ‘Using Your Senses’


Much of mLearning has to do with repurposing existing content or building modules similar to other eLearning modules. I see the value in that, particularly for compliance training and other types of information dissemination. Sometimes learners’ needs are met simply by having content available in multiple places (i.e. desktop and mobile device). So I don’t discount the value of mobile courseware; I just think the design community often forgets to consider the differences between mobile devices and desktop computers. Besides the always-on, always-connected, always-with-you nature of mobile devices, they also have a number of sensors that we can utilize in our learning design.

Through the browser, you have access to geolocation via the device’s location sensors; GPS and WiFi can both be used to determine the device’s position (http://mobile.tutsplus.com/tutorials/mobile-web-apps/html5-geolocation/). And with native applications, you can access the camera, accelerometer, GPS location and any other sensors on the device.
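
If you want to experiment with that in the browser, here is a minimal sketch (written in TypeScript, though plain JavaScript reads the same) of the HTML5 Geolocation API the tutorial above covers. The callback and option names are part of the standard API; the logging and the learning-app comment are just illustrative:

```typescript
// Minimal sketch: ask the browser for the device's current position.
// Works in mobile browsers that support the HTML5 Geolocation API;
// the user must grant permission before coordinates are returned.
function showLearnerLocation(): void {
  if (!("geolocation" in navigator)) {
    console.log("Geolocation is not supported on this device/browser.");
    return;
  }

  navigator.geolocation.getCurrentPosition(
    (position) => {
      const { latitude, longitude, accuracy } = position.coords;
      // A learning app might use this to unlock location-specific content.
      console.log(`You are near ${latitude}, ${longitude} (±${accuracy} m).`);
    },
    (error) => console.log(`Could not get location: ${error.message}`),
    { enableHighAccuracy: true, timeout: 10000 }
  );
}

showLearnerLocation();
```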

So how can you start to use these in your designs? That’s my question to the reader. Obviously, there are several considerations when building a mobile learning application. You may have some great ideas for sensor usage but not the staff to build the application, or maybe you are building the application yourself (like me) and know you’ll have to build a web app so multiple devices can access it outside of an app store. These are just a few of the factors that shape your design. But let’s not throw away the idea of using sensors in our designs. Instead, consider how the user would benefit from sensors for contextual relevance, documentation, interactivity and engagement. From there, you can walk your design back to the realities imposed by your development and implementation resources. My hunch is that if you consider these sensors at the beginning of your design process, you will end up with a better mobile design in the end.

So the question again: How would you leverage sensors for your mobile learning design?

Here are a few developer resources to get you started if you have to use the browser for your mobile design. Native applications can access the sensors through the native development environments of platforms like iOS and Android:

Accessing the accelerometer in the browser:
http://www.mobilexweb.com/blog/safari-ios-accelerometer-websockets-html5
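
For a quick feel of what that looks like, here is a rough sketch using the devicemotion event that the article above discusses. The event name and acceleration properties are standard; what you do with the readings is up to your design:

```typescript
// Rough sketch: read the accelerometer from the browser via the
// devicemotion event (supported in Mobile Safari and other mobile browsers).
window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
  const accel = event.accelerationIncludingGravity;
  if (!accel) {
    return; // Device or browser does not expose accelerometer data.
  }
  // x/y/z are in m/s^2; a learning app could use these to detect a
  // shake gesture or how the device is being held.
  console.log(`x: ${accel.x}, y: ${accel.y}, z: ${accel.z}`);
});
```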

Accessing the camera roll with ActionScript 3 (Adobe Flex and AIR):
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/media/CameraRoll.html

Accessing the camera with other scripts:
http://code.google.com/p/iphone-photo-picker/
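
If a linked library doesn’t fit your project, one generic browser-only alternative (my own sketch, not taken from the library above) is a plain file input, which on many mobile browsers lets the learner pick or capture a photo, for example for documentation of their work:

```typescript
// Sketch of a browser-only approach: an <input type="file"> that lets the
// learner pick (or, on many mobile browsers, capture) a photo, then reads it
// as a data URL so it can be displayed or uploaded.
const photoInput = document.createElement("input");
photoInput.type = "file";
photoInput.accept = "image/*"; // hints the browser to offer camera/photo library

photoInput.addEventListener("change", () => {
  const file = photoInput.files?.[0];
  if (!file) {
    return;
  }
  const reader = new FileReader();
  reader.onload = () => {
    // reader.result is a data URL; assign it to an <img> src, for example.
    console.log("Photo loaded, size:", file.size, "bytes");
  };
  reader.readAsDataURL(file);
});

document.body.appendChild(photoInput);
```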

And technologies like PhoneGap and Titanium also provide access to some sensors, letting you build native applications using web technologies.
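
As a rough sketch of what that looks like, here is how PhoneGap-style accelerometer access reads in a web-technology app packaged as a native application. The shape of the accelerometer object below is my assumption from the PhoneGap documentation, so treat it as illustrative rather than definitive:

```typescript
// Rough sketch: PhoneGap adds an accelerometer object to navigator, so it is
// not part of standard browser typings; the shape here is assumed from the
// PhoneGap docs.
interface PhoneGapAcceleration { x: number; y: number; z: number; }
const nav = navigator as Navigator & {
  accelerometer?: {
    getCurrentAcceleration(
      onSuccess: (accel: PhoneGapAcceleration) => void,
      onError: () => void
    ): void;
  };
};

// PhoneGap fires "deviceready" once its native bridge is available.
document.addEventListener("deviceready", () => {
  nav.accelerometer?.getCurrentAcceleration(
    (accel) => console.log(`x: ${accel.x}, y: ${accel.y}, z: ${accel.z}`),
    () => console.log("Could not read the accelerometer.")
  );
});
```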
