Merging the possibilities of spatial computing, machine learning and art direction, we explored the potential of an intuitive design tool in virtual reality.
What we did
- Machine learning data training
- Spatial interactions and environment design
- Machine-human collaboration
Machine learning, virtual reality, and intuitive design
Techno Carpenter is a technical prototype in virtual reality (VR) that proposes an intuitive collaboration between humans and machines: a virtual environment that invites you to shape your very own machine learning-generated chair by performing movements with your hands.
Machine learning is a highly specialised field; it takes experts years to learn how to use the technology in a way that creates meaningful output for humans. The mathematical language that defines its inner workings has created a high barrier for most people to play around with its potential use cases. This gap ultimately keeps us from thinking creatively about how to apply this emerging technology outside of contained fields, like detecting cars or recommending products.
Techno Carpenter aims to demystify algorithmic decision-making processes by providing a virtual reality interface that leverages one of the most intuitive forms of human communication: the movement of hands.
Interacting with algorithms without using a keyboard
While building Techno Carpenter, we aimed to lower the barrier for interacting with highly complex technologies by designing a simple experience that anybody can use without too much explanation. In the VR environment, you can intuitively use your hands to shape your own dream chair, or simply follow your curiosity and let the computer generate a model as you go.
Every slight movement or rotation of your palms, taking all ten fingers and 28 finger joints into account, affects the shape of the chair you’re designing. This constant back-and-forth generates an enormous number of possible permutations of parameters and corresponding responses.
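To make the idea concrete, here is a minimal sketch of how a tracked hand pose could be flattened into a single parameter vector that drives the generative model. The function name, joint count grouping, and normalisation are illustrative assumptions, not the actual Techno Carpenter implementation.

```python
import math

def hand_pose_to_features(palm_position, palm_rotation, joint_angles):
    """Concatenate palm pose and per-joint flexion angles into one flat vector.

    palm_position: (x, y, z) in metres
    palm_rotation: (pitch, yaw, roll) in radians
    joint_angles:  one flexion angle per tracked finger joint (radians)

    Hypothetical sketch; names and layout are invented for illustration.
    """
    features = list(palm_position) + list(palm_rotation)
    # Normalise joint angles to roughly [0, 1] so every joint contributes
    # on the same scale as the palm parameters.
    features += [angle / math.pi for angle in joint_angles]
    return features

# One hand with the 28 tracked joints mentioned above:
left_hand = hand_pose_to_features(
    (0.10, 1.20, 0.30),   # palm position
    (0.00, 0.50, 0.00),   # palm rotation
    [0.2] * 28,           # 28 joint angles
)
print(len(left_hand))  # 3 + 3 + 28 = 34 parameters per hand
```

Even this toy encoding shows why the interaction space is so large: 34 continuous parameters per hand, sampled every frame, yield effectively unlimited permutations.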
We created a speculative design tool that leverages machine learning algorithms to achieve a granularity of interaction unmatched by conventional UI applications.
An artificial co-creator that pushes your ideas further
Feeding an algorithm more than 6,000 3D models of existing chairs, we trained it to recognise, read and analyse a chair based on its general shape and features. In order to make sense of these features, the computer positions all chairs in an enormous virtual space, where the chairs are mapped out according to their physical characteristics. It then maps your gesture to coordinates in the latent chair-space, and suggests a new model.
The output of the algorithm is generated as a 3D model in the VR space, where it responds to your input. Techno Carpenter is radically different from mainstream design tools. There are no buttons and no sliders, no knobs and no levers. Techno Carpenter assists you or suggests alternative directions to your prompts, much like a human co-creator would push an idea further.
Techno Carpenter maps your gestures to coordinates in the latent chair-space, and then generates a new model based on its continuous learnings.
Working towards AI-driven spatial design tools
When SPACE10 and IKEA approached us to think about the future of life at home in relation to technology, we saw the opportunity to reimagine how the use of spatial computing and machine learning could unlock new creative processes. Can we train an algorithm to learn from human designs, while proposing its own designs as a response to intuitive and playful human input?
Techno Carpenter challenges us to think about a new design approach for co-creation — both for spatial computing as a discipline, and for digital design as an industry. By inviting both professional and non-professional designers to express themselves freely, while the trained algorithm explores novel design methodologies, it points towards a democratic and sustainable design process for everyday objects and spaces that more people can tap into.