I have a new post up at ALA Techsource – an interview with the National Library of Australia’s Paul Hagon:
Paul discusses his take on making library collections available in some unique ways:
It’s been less than two years since the iPhone (via the App Store) became a viable interface. We now have the iPad. Internet-enabled TVs are just starting to appear. We are interacting with these using gestures rather than through a textual interface. Imagine if your TV had gesture recognition and you interacted with it by waving your arms about, smiling for yes and frowning for no. How could we be accessing our collections using these methods?
I think that recently released devices like the iPad have the potential to become the modern-day coffee table book. How easy would it be to build a ‘dynamic coffee table book’ for this device that showcased our collections (particularly images), with what’s displayed driven by external influences like the news or weather? You would swipe to move between photos and rotate the device to reveal the traditional text-based metadata behind the image. How much more engaging is that than clicking on a few underlined links?
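The idea of surfacing collection items based on an outside influence like the weather can be sketched in a few lines. This is purely illustrative: the image records, tags, and weather-to-tag mapping below are all invented for the example, and a real app would pull from the library’s collection API and a live weather feed instead.

```python
# Hypothetical sketch of the "dynamic coffee table book" idea:
# choose which collection images to surface based on today's weather.
# All data here is made up for illustration.

WEATHER_TAGS = {
    "rain": ["storms", "flooding", "umbrellas"],
    "sunny": ["beaches", "picnics", "gardens"],
    "snow": ["winter", "mountains"],
}

def images_for_weather(images, condition):
    """Return the collection images whose tags match the weather condition."""
    wanted = set(WEATHER_TAGS.get(condition, []))
    return [img for img in images if wanted & set(img["tags"])]

collection = [
    {"title": "Bondi Beach, 1935", "tags": ["beaches", "sydney"]},
    {"title": "Flooded street, Brisbane", "tags": ["flooding"]},
    {"title": "Snow gums, Kosciuszko", "tags": ["winter", "mountains"]},
]

print([img["title"] for img in images_for_weather(collection, "rain")])
```

On a rainy day this surfaces the flood photograph; the swipe and rotate interactions Paul describes would then sit on top of a selection step like this one.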