At SIGGRAPH Asia today, Dr Benjamin Long presented a paper on creating 3D haptic shapes that can be felt in mid-air. The paper will be published in ACM Transactions on Graphics.
The method uses ultrasound focussed onto the hands above the device, where it can be felt. By focussing complex patterns of ultrasound, the disturbances in the air can be felt as floating 3D shapes. The patterns have also been demonstrated visually by directing the device at a thin layer of oil, so that the depressions in its surface show up as spots when lit by a lamp.
Dr Ben Long, Research Assistant from the Bristol Interaction and Graphics (BIG) group in the Department of Computer Science, said: “Touchable holograms, immersive virtual reality that you can feel, and complex touchable controls in free space are all possible ways of using this system.
“In the future, people could feel holograms of objects that would not otherwise be touchable, such as feeling the differences between materials in a CT scan or understanding the shapes of artefacts in a museum.”
YouTube: Through the combining glass
Diego Martinez Plasencia, Florent Berhaut and Sriram Subramanian will present their paper on how semi-transparent mirrors blend together the spaces in front of and behind them. The paper investigates this blending effect and highlights a whole new range of interactive experiences it enables.
In a museum, people in front of a cabinet would see the reflection of their fingers inside the cabinet, overlapping the exact same point behind the glass. By pointing directly at an exhibit with their reflection, instead of at it through the glass, people could easily discuss its features with other visitors. Pop-up windows could also show additional information about the pieces being touched.
Combining this approach with different display technologies offers interesting possibilities for interaction systems. By placing a projector on top of the cabinet, fingertips could work as little lamps to illuminate and explore dark and sensitive objects. When a hand's reflection cuts through an object, projections on visitors' hands could be used to reveal the inside of the object, visible to any user.
We also demonstrated artistic installations that combine this approach with volumetric displays. Musicians record loops in their digital mixers, and these loops appear to float above the mixer. Musicians can then grab these representations to play them or tweak them with different musical effects.
Daniel Schien and Christopher Weeks were part of a group awarded joint first place at the recent Green Hackathon, held as part of the ICT4S (ICT for Sustainability) conference in Stockholm. Spending a day underground in a dismantled nuclear reactor at KTH, the teams competed to develop a project around the theme of “food”: Britons throw away the equivalent of six meals a week, leading to over 7.2 million tonnes of household food waste a year.
The winning project was “Eat Exchange”, an app that lets people share their not-quite-past-date food with others. Just about to go on holiday, but have a nearly full container of milk in the fridge? Or maybe you stocked up on a 2-for-1 offer last week, and it’s now about to go out of date? The app allows you to offer the item to a network of trusted friends, family and neighbours, and to receive text notifications when something is being offered.
Although currently in the design phase, watch this space – perhaps a fully functioning prototype will make its way to ICT4S 2015!
Congratulations to Hannah Limerick, whose first paper as a PhD student has been accepted for publication in the journal Frontiers in Human Neuroscience:
- Limerick, H., Coyle, D. & Moore, J.W. (2014). The Experience of Agency in Human-Computer Interactions: A Review. Front. Hum. Neurosci. 8:643. doi: 10.3389/fnhum.2014.00643.
ISMB 2014 has just announced that this year’s Best Artwork Award goes to ‘supraHex’ by Dr. Hai Fang and Prof. Julian Gough from the Department of Computer Science, University of Bristol. Full details of the winner are available here.
Intelligent Systems for Molecular Biology (ISMB) is the world’s largest bioinformatics/computational biology conference. ISMB 2014 was held in Boston, attracting top computational biology researchers from around the world. As part of the annual conference, the Art & Science Exhibition displays images and videos (called ‘artworks’) that result from creative efforts involving scientific concepts or tools. The exhibition aims to open our eyes and minds, both scientifically and aesthetically.
Based on real-world genome-wide expression data, the artwork ‘supraHex’ is produced automatically by an open-source R/Bioconductor package of the same name. The artwork is inspired by the prevalence of hexagonal patterns in natural objects such as honeycombs and the Giant’s Causeway, and it also captures the mechanistic nature of these objects: their formation, probably in a self-organising manner.
Apart from the artwork itself, the package can do more: i) training of the supra-hexagonal map via a self-organising learning algorithm; ii) visualisations at and across nodes of the map; iii) partitioning of the map into gene meta-clusters; iv) sample correlation on a 2D sample landscape; and v) overlaying additional data onto the trained map to explore relationships between the input and the additional data. The package is freely available at http://supfam.org/supraHex.
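To give a feel for the self-organising learning algorithm mentioned in step i), here is a minimal, hypothetical sketch of a classic self-organising map (SOM) in pure Python. It is not the supraHex implementation (which is in R and uses a supra-hexagonal, not rectangular, map); it only illustrates the general idea: each input vector pulls its best-matching node, and that node's grid neighbours, towards itself, with the learning rate and neighbourhood radius shrinking over time.

```python
# Generic self-organising map (SOM) sketch, for illustration only.
# All names here (train_som, etc.) are hypothetical, not supraHex API.
import math
import random

def train_som(data, rows=4, cols=4, epochs=50, lr0=0.5, seed=0):
    """Train a small rectangular SOM on `data` (list of equal-length vectors)."""
    rng = random.Random(seed)
    dim = len(data[0])
    # Initialise node weight vectors randomly in [0, 1).
    nodes = [[rng.random() for _ in range(dim)] for _ in range(rows * cols)]
    radius0 = max(rows, cols) / 2.0
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)                     # learning rate decays
        radius = max(radius0 * (1.0 - frac), 1.0)   # neighbourhood shrinks
        for x in data:
            # Best-matching unit (BMU): the node closest to the input.
            bmu = min(range(len(nodes)),
                      key=lambda i: sum((nodes[i][d] - x[d]) ** 2
                                        for d in range(dim)))
            br, bc = divmod(bmu, cols)
            for i, w in enumerate(nodes):
                r, c = divmod(i, cols)
                dist2 = (r - br) ** 2 + (c - bc) ** 2
                if dist2 <= radius ** 2:
                    # Gaussian neighbourhood: nearby nodes move more.
                    h = lr * math.exp(-dist2 / (2 * radius ** 2))
                    for d in range(dim):
                        w[d] += h * (x[d] - w[d])
    return nodes

# Two well-separated 2D clusters; after training, different map regions
# should come to represent each cluster.
data = [[0.1, 0.1], [0.15, 0.05], [0.9, 0.9], [0.85, 0.95]]
nodes = train_som(data)
```

In supraHex the same principle is applied on a supra-hexagonal grid to genome-wide expression data, so that genes with similar expression profiles land on nearby nodes, which is what makes the later partitioning into gene meta-clusters possible.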