Axel Fougues · Douglas Benhamou

BitSense: Biohacking a sixth digital sense.

All the public research here: Subdermal-Magnetic-Implants-RnD

Enlarge your senses and touch the data with BitSense.
"The BitSense project is a work in progress, a passion project, an infinite rabbit hole."
This is a personal project I worked on for about a year. It led to me working at the LIRMM IT lab (CNRS, Montpellier) with researcher A. Kheddar on a six-month study of haptic rendering using magnetic implants. For the latest project, see SMIS: Biohacking and Haptics.
This is how it all started:
Premise:
Subdermal implants are becoming a common body augmentation in the biohacking community. I personally have three N52 neodymium magnets in my right hand, among other implants. These allow me to feel magnetic fields from a short distance: the sensation comes from the magnets vibrating slightly against the nerve endings.
Realizing that being able to wirelessly stimulate this “6th sense” through induction is a step towards human and machine interfacing made me wonder what the limits of this interface are, and how electronic sensors could be used as more indirect senses.
General idea:
A compact hand device (glove or bracelet) that could take input from a sensor or device and vibrate subdermal magnets using induction coils to stimulate the user’s nervous system.
The flux of data is then processed by the brain and, with time and thanks to brain plasticity, it ends up becoming an additional sense (like touch or sight) without depending on any of those.

But why? Well, suppose you attach an ultrasonic sensor to your BitSense: bam! You now have a sixth sense, you can hear ultrasound! The same goes for any kind of sensor...
Now imagine connecting it to your phone and sending artificial signals... no, not just your Facebook notifications, but an entire virtual world. This is my ultimate goal:
Early prototype in ring format, USB powered, audio jack interface, mono.
BitSense Toolkit:
I developed a couple of Android apps to accompany the device. One of them is a testing toolkit that lets me experiment further without the need for lab equipment. This way I can determine the limitations of both the Android system and the early BitSense prototypes.
Hopefully this app will become a user-friendly platform for experimenting and linking a BitSense to Android-generated data (it became the base idea for the SMIS app).
The BitSense Toolkit, work in progress, currently has a functional signal generator:
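Since the early prototype is driven through the phone's audio jack, the signal generator can be pictured as a function from (waveform, frequency, amplitude) to a buffer of audio samples. A minimal Python sketch of that idea; the function name, parameters and defaults are illustrative, not the Toolkit's actual code:

```python
import math

def generate(waveform, freq_hz, amplitude, sample_rate=44100, duration_s=1.0):
    """Produce one buffer of samples for the given waveform.

    waveform: "sine", "square" or "saw"; amplitude in 0..1.
    Illustrative sketch of a signal generator, not the app's real code.
    """
    n = int(sample_rate * duration_s)
    out = []
    for i in range(n):
        phase = (freq_hz * i / sample_rate) % 1.0  # position within one cycle, 0..1
        if waveform == "sine":
            s = math.sin(2 * math.pi * phase)
        elif waveform == "square":
            s = 1.0 if phase < 0.5 else -1.0
        elif waveform == "saw":
            s = 2.0 * phase - 1.0
        else:
            raise ValueError(waveform)
        out.append(amplitude * s)
    return out
```

On Android such a buffer would typically be streamed out through the audio API; here the point is only that each generator parameter maps directly to something the implant can feel (frequency, strength, texture of the vibration).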
And a spherical input-output tester. This tool uses the Android gyroscope to determine the direction in which the user is pointing (it involves strapping the phone to the arm; BitSense might include a gyroscope in the future...). This direction is then projected onto a spherical map, and the resulting value is processed and sent as feedback through BitSense. This effectively lets the user explore a virtual map around them. Here I use the blue channel of the texture as a map; in the future I will be able to use the other channels to superpose multiple maps in a single image. These will be mapped to multiple frequencies (assuming the brain is capable of understanding multiple frequency variations at once...):
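The core of this tool is going from a pointing direction to a pixel on the flat map. A small sketch of that lookup, assuming an equirectangular unwrap of the sphere (the projection choice and helper names are my assumptions for illustration):

```python
import math

def direction_to_uv(x, y, z):
    """Project a unit pointing direction onto an equirectangular flat map.

    Returns (u, v) in 0..1: u from the azimuth, v from the elevation.
    Assumes an equirectangular unwrap of the sphere.
    """
    azimuth = math.atan2(y, x)                      # -pi..pi around the user
    elevation = math.asin(max(-1.0, min(1.0, z)))   # -pi/2..pi/2
    u = azimuth / (2 * math.pi) + 0.5
    v = elevation / math.pi + 0.5
    return u, v

def sample_blue(texture, u, v):
    """Read the blue channel of a row-major RGB texture at (u, v)."""
    h = len(texture)
    w = len(texture[0])
    px = min(w - 1, int(u * w))
    py = min(h - 1, int(v * h))
    return texture[py][px][2]  # blue channel drives the feedback intensity
```

Superposing more maps would then just mean reading the red and green channels of the same pixel and driving them on different frequencies.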
For now, a stereo and a mono mode are implemented. On this real-time visualisation you can see the projection on both the spherical map and the unwrapped flat map (red dot for right, blue for left, green for mono). The stereo mode lets me stimulate two different implants on two different fingers with different data. This way I can give directional information rather than just intensity (exactly the way stereo audio works). I made the spacing between left and right adjustable:
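The stereo mode amounts to sampling the map at two points on either side of the aiming direction, separated by the adjustable spacing angle. A sketch under the same equirectangular assumption as above (again illustrative, not the app's code):

```python
import math

def stereo_uv(azimuth, elevation, spacing_rad):
    """Left/right map sample points for stereo feedback.

    The two points sit on either side of the aiming azimuth, separated
    by the adjustable angle `spacing_rad`.
    """
    def to_uv(az, el):
        az = (az + math.pi) % (2 * math.pi) - math.pi  # wrap to -pi..pi
        return (az / (2 * math.pi) + 0.5, el / math.pi + 0.5)

    left = to_uv(azimuth - spacing_rad / 2, elevation)
    right = to_uv(azimuth + spacing_rad / 2, elevation)
    return left, right
```

Each point's map value is then sent to its own audio channel, so each finger feels a slightly different view of the surroundings.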
On this colorful tiled map the projection from sphere to flat map is clearer. This means I can store my maps as simple Unity textures; I just have to keep the distortion in mind when creating them. Here are some testing maps:

Time Paradox: a BitSense based game to test the extent of neural plasticity:
Time Paradox is an Android + BitSense based game developed in my university's Serious Games course. It is a two-person project with Douglas Benhamou, who you can also find on Behance.
We initially intended to track the player's hand position in 3D space so that the BitSense feed-back could be used to explore a completely virtual world made of "fields" and "signals" rather than conventional geometry.
Sadly, the sensors in an Android device (gyroscope/accelerometer/magnetometer) are not sufficient for accurate 3D spatial tracking, and GPS positioning is far from the resolution we wanted. Any other technology on the market relies on visual beacons or has terrible accuracy. (If you have a solution to this, PLEASE contact me.) This forced us to go with the gyroscope-based spherical projection explained below.
How the game is played
The player extends their arm and points around themselves. This can be seen as touching the inside face of a fictional sphere that encloses the player.
We can imagine the game level as a simple maze projected on the inside of this sphere. The player navigates it by moving their arm around the sphere and getting feedback through BitSense.
Note that this sphere is purely a representation; in reality, it is the player's hand orientation that is decoded into a position on a flat projection of the map.
At the end of each maze is a collectible reward that the player can later see in their collection on their phone. (I intend to reuse and build on this concept in the SMIS app.)
More than geometry
A level can be seen as the superposition of vector fields, the obstacles being repulsive and the objectives being attractive. The player essentially does the job of a path following agent on this "map".
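The superposition is just a sum of the individual fields at the player's position. A minimal sketch of that idea; the falloff laws and weights below are illustrative choices, not tuned game values:

```python
import math

def field_at(px, py, objectives, obstacles):
    """Sum attractive and repulsive contributions at player position (px, py).

    Objectives pull with a constant-magnitude unit field; obstacles push
    with a field that falls off with squared distance.
    """
    fx = fy = 0.0
    for ox, oy in objectives:
        dx, dy = ox - px, oy - py
        d = math.hypot(dx, dy) or 1e-9
        fx += dx / d                      # unit vector towards the objective
        fy += dy / d
    for ox, oy in obstacles:
        dx, dy = px - ox, py - oy
        d = math.hypot(dx, dy) or 1e-9
        fx += dx / (d ** 3)               # away from obstacle, 1/d**2 falloff
        fy += dy / (d ** 3)
    return fx, fy
```

The resulting vector is what the feedback encodes, so the player acts like a path-following agent descending towards the objectives while being pushed off the walls.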
Illustration of a vector field in 3D
The feedback
As we are using a completely new technology, the feedback system has to be completely rethought. We are able to send variable signals through a multi-channel induction interface to the player's hand, and therefore keep them aware of their virtual surroundings through a set of artificial stimuli.
Spatial-awareness signals will be constantly produced. They will give directional information in the background, allowing smooth navigation. These signals are comparable to the local information of a vector field, combined with the player's orientation and depending on their aiming position.
Information about surrounding objects will be transmitted on different channels or, if necessary, overlaid on the continuous background information. These signals will be object-specific and their intensity will depend on proximity.
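The proximity mapping and the overlay can be sketched in a few lines. The linear falloff and the 0..1 amplitude clip are my assumptions for illustration, not the project's final design:

```python
def object_intensity(distance, radius):
    """Proximity-driven amplitude for an object signal: 1.0 at contact,
    fading linearly to 0.0 at `radius` (linear falloff is an assumption)."""
    return max(0.0, 1.0 - distance / radius)

def mix(background, overlays):
    """Overlay object signals on the continuous background signal,
    clipped to an assumed 0..1 amplitude range."""
    return [min(1.0, b + sum(o[i] for o in overlays))
            for i, b in enumerate(background)]
```

Keeping object signals on distinct channels (or frequencies) is what should let the brain separate "something is near" from the background directional cue.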
We have a huge range of possibilities when it comes to playing with amplitude, frequency, modulation and signal type.
As this is a highly experimental interface and a proof of concept, we do not know how long the learning phase will be before the brain begins to interpret the feedback efficiently and intuitively.