In a very short time, Pokémon GO has pushed augmented reality (AR) into the mainstream.
Its ability to overlay digital animations onto the real world through your phone’s camera and screen sets it apart from any popular game before it. But Pokémon GO isn’t the only app that blends virtual objects seamlessly into the real world. Here are the best AR apps not linked with Nintendo.
Read full story >
A tidy wee site discovered by one of our 2nd year VCD students. Some really engaging infographics done in code.
View the site
An interesting open source project, The Cacophony Project will turn birdsong into data. By capturing the sound of our ecosystems, they’re going to build up a dataset that tracks exactly what the birds are telling us, over time and over the whole of New Zealand.
The Cacophonometer is a simple piece of hardware that lives in the bush, listening out for birds and automatically sending what it hears to the cloud. Every recording is tagged by GPS location and by time.
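The tagging step described above can be sketched in a few lines. This is a hypothetical illustration only, not the Cacophony Project's actual code or API: the `Recording` structure, `tag_recording`, and `to_upload_payload` names are invented for the example, which simply shows a captured clip being bundled with its GPS location and a UTC timestamp before being serialised for upload.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class Recording:
    audio_file: str   # path to the captured audio clip
    latitude: float   # GPS fix at capture time
    longitude: float
    captured_at: str  # ISO 8601 UTC timestamp

def tag_recording(audio_file: str, latitude: float, longitude: float) -> Recording:
    """Attach location and time metadata to a captured clip."""
    now = datetime.now(timezone.utc).isoformat()
    return Recording(audio_file, latitude, longitude, now)

def to_upload_payload(rec: Recording) -> str:
    """Serialise the tagged recording as JSON for sending to the cloud."""
    return json.dumps(asdict(rec))
```

Keeping the metadata alongside every clip is what makes the long-term, nationwide dataset possible: each recording can later be placed on a map and a timeline.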
The plan is to spread Cacophonometers far and wide, then listen, learn, and improve the Cacophony Project as it goes.
Long term, these devices will also be modified to use sound to lure, identify, and eradicate pests in an intelligent, targeted manner.
There was a good interview on RNZ today: http://www.radionz.co.nz/national/programmes/afternoons/audio/201793257/the-cacophony-project-grant-ryan
For more: https://cacophony.org.nz
In the near future, multiple devices equipped with facial, vocal and biometric sensors using affective computing will be competing to analyse and influence our feelings. These capabilities may simply appear via firmware upgrades in products we already use. Apple, for instance, recently purchased Emotient, one of the leading companies focused on facial recognition using artificial intelligence (AI). Soon you won’t need to prompt Siri, but simply respond when “she” says, “Your expression seems sad — should I download Trainwreck from iTunes?”
Read the full article on Mashable
“IDE” is what the students called the 3rd Year CoCA elective 222.392 Interactive Digital Experience.
In IDE students investigate how interactive technology can be used to enhance a location-specific visitor experience. Last year we worked with Te Papa as a ‘live client’ and hope to do that again this time.
The paper has two briefs. The first stage is to design and visualise a personalised mobile guide for the institution. In this case the digital interface to the space travels with the user.
Stage two extends this experience into detailed interpretive media for one display item – so now it is the space that provides the interface.
Students are encouraged to explore emerging technologies with an emphasis on UX and innovation, and can use mobile devices, haptics, motion sensing, and audio input or output to create a future-focused solution.
This paper follows on very well from 197.238 Interaction and Interface and 197.379 Experience Design. It also builds on the core web and app options of 222.357.
Here are some snapshots from the 2015 student work:
The team at uxdesign.cc has seen a lot this year: 48 issues published, and 384 links curated and sent by email to 61,295 designers around the world every week. That’s enough content for the team at uxdesign.cc to start identifying patterns and trends across what’s being written and published in the amazing world of User Experience Design.