Inside HugBag

Overview

First sketch

The idea behind HugBag was to make use of the child's strong, gross motor actions, while exploring a continuous and co-located coupling between those actions and their effects.

The construct is made of a semi-inflated gym ball resting on a semi-circular plate base. An accelerometer mounted on the base detects the tilt direction and angle. A Microsoft® Kinect sensor, also mounted on the base, detects the location and degree of deformation while the ball is being hugged, pushed or punched. These sensors control evolving light patterns and sounds, played through integrated speakers, in response to the interaction.

There have been different versions of HugBag over the project duration. The first version (1.0) used four resistive stretch sensors along the surface of the ball to detect which side was being dented. Versions 2.0 and 3.0 used the Kinect sensor to detect the area of interaction.

HugBag 1.0

HugBag Workshop

HugBag 1.0 consisted of an inflated gym ball surrounded by stretch sensors and an RGB LED strip. It used four resistive stretch sensors along the surface of the ball to detect which side was being dented. The stretch sensors were custom-made, with conductive fabric sewn at the ends and in the middle region of an elastic fabric band. Each sensor was treated as part of a Wheatstone bridge connected to an Arduino board, which interpreted the stretch of each sensor to determine the shape of HugBag. This approach was very sensitive to how much the ball was inflated, and the side of the ball being dented was not always interpreted correctly when the interaction was closer to the ”top” of HugBag. HugBag 1.0 was never tested in the Multi-sensory environments participating in the SID project.
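
The firmware for version 1.0 is linked in the Software section below. Purely as an illustration of the approach, the hypothetical Arduino sketch here reads the four stretch sensors on analog pins and reports the side whose reading deviates most from its resting baseline; the pin assignments, threshold and power-on calibration are assumptions, not the actual HugBag code.

```cpp
// Hypothetical sketch: detect which side of HugBag 1.0 is dented by comparing
// four stretch-sensor readings (via their Wheatstone bridges) against a
// baseline captured at power-on.
const int SENSOR_PINS[4] = {A0, A1, A2, A3};  // assumed wiring
const int THRESHOLD = 40;                     // assumed minimum change, in ADC counts
int baseline[4];

void setup() {
  Serial.begin(9600);
  // Calibration step: assume nobody is touching HugBag at power-on.
  for (int i = 0; i < 4; i++) {
    baseline[i] = analogRead(SENSOR_PINS[i]);
  }
}

void loop() {
  int dentedSide = -1;           // -1 means "no interaction detected"
  int maxDelta = THRESHOLD;
  for (int i = 0; i < 4; i++) {
    // Denting one side stretches that band and shifts its bridge voltage.
    int delta = abs(analogRead(SENSOR_PINS[i]) - baseline[i]);
    if (delta > maxDelta) {
      maxDelta = delta;
      dentedSide = i;
    }
  }
  Serial.println(dentedSide);
  delay(50);
}
```

The same readings could then drive the RGB LED strip; as noted above, this approach is sensitive to how much the ball is inflated, so such a baseline and threshold would need re-tuning whenever the ball is re-inflated.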

Materials

  • 70 cm diameter gym ball
  • Arduino board
  • Four 30 cm long elastic bands
  • Four 20 cm long conductive fabric strips
  • Three 1 kΩ resistors
  • One 1 kΩ potentiometer
  • IKEA RGB flexible LED strip

Software

The programs and code for HugBag 1.0 can be found here. HugBag 1.0 runs on an Arduino board and does not require any additional software, except for programming and visualizing the sensor signals, which require the Arduino and Processing IDEs.

HugBag 2.0 (Standard) and 3.0 (Fluffy)

There are two versions of HugBag that were tried in the Multi-sensory environments in the SID project: “Standard” and “Fluffy”. Both are similar in construction and behavior, with some differences.

Material:

  • 70 cm diameter gym ball
  • Plastic pot base
  • White, semi-transparent fabric cover
  • USB speakers
  • USB hub
  • MS Kinect sensor
  • Arduino board
  • 3-axis accelerometer (Standard version only)
  • IKEA RGB flexible LED strip (Standard version only)
  • Two addressable RGB LED strips with 150 LEDs each (Fluffy version only)
  • Corrugated cellophane cover (Fluffy version only)

Software:

The programs and code for HugBag can be found here. The requirements to run the HugBag programs are:

  • Computer running MS Windows 7
  • MS Kinect SDK
  • Ableton Live 9.0 Suite (with Max4Live)
  • Virtual MIDI cable (LoopMIDI)
  • A MIDI-serial bridge (Hairless MIDI)
  • Arduino IDE

Construction:

HugBag 2.0 consists of a semi-circular section of a semi-inflated gym ball resting on a semi-circular plate base. The semi-circular gym ball is glued to the plate base with silicone glue. An accelerometer mounted on the base detects the tilt direction and angle. A Microsoft® Kinect sensor, also mounted on the base, detects the location and degree of deformation while HugBag is being hugged, pushed or punched. Instead of using the depth sensor information, the infrared camera of the Kinect sensor is used to detect the interaction with HugBag’s surface. Blob detection on the Kinect’s IR image controls evolving light patterns (via the LED strips) and sounds (via the integrated USB speakers) in response to the interaction. Fluffy HugBag is covered in a corrugated structure of cellophane paper, which provides richer tactile and audible responses to touch. It has addressable lights, arranged in a spiral pattern around the whole semi-inflated ball construction, and in contrast to Standard HugBag, only the touched area lights up.

Two Max4Live devices have been created: MultiCamMidiController and MoodController.

MultiCamMidiController obtains the image from the Kinect or a webcam (Windows only), processes the image and outputs control parameters to be mapped to other Ableton Live devices. It also outputs the control parameters as MIDI control changes, so they can be used to communicate with the Arduino via a virtual MIDI cable and a MIDI-serial bridge.

MoodController can control HugBag’s mood and change the sound and light response while using the same control parameters from the detection. It mixes different soundtracks in response to the same interaction. MoodController also outputs MIDI messages to the Arduino.

The LED strips and the accelerometer are connected to the Arduino board, which controls the light patterns in response to the accelerometer’s data and the control parameters sent via MIDI messages from Ableton Live.
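
The actual firmware is in the repository linked above. As a rough illustration of what the Standard HugBag's Arduino does, the hypothetical sketch below parses MIDI control changes arriving over serial (via the Hairless MIDI bridge), treats CC1/CC2/CC3 as r/θ/A per Table 2, maps them to saturation/hue/value per Table 1, and drives a common RGB strip with PWM. The pins, baud rate and the way the accelerometer modulates brightness are assumptions.

```cpp
// Hypothetical sketch for the Standard HugBag Arduino: receive MIDI control
// changes over serial and drive a common RGB LED strip via PWM.
const int RED_PIN = 9, GREEN_PIN = 10, BLUE_PIN = 11;  // assumed PWM pins
const int ACC_X = A0, ACC_Y = A1;                      // assumed analog accelerometer pins

byte cc[128];  // last received value (0-127) for each MIDI controller number

void setup() {
  Serial.begin(115200);  // Hairless MIDI's default baud rate
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

// Minimal MIDI parser: a Control Change status byte (0xB0-0xBF) is followed
// by the controller number and the value.
void readMidi() {
  while (Serial.available() >= 3) {
    int status = Serial.read();
    if ((status & 0xF0) == 0xB0) {
      int number = Serial.read() & 0x7F;
      cc[number] = Serial.read() & 0x7F;
    }
  }
}

// Integer HSV-to-RGB conversion (all channels 0-255).
void hsvToRgb(int h, int s, int v, int& r, int& g, int& b) {
  int region = h / 43;
  int rem = (h - region * 43) * 6;
  int p = (v * (255 - s)) >> 8;
  int q = (v * (255 - ((s * rem) >> 8))) >> 8;
  int t = (v * (255 - ((s * (255 - rem)) >> 8))) >> 8;
  switch (region) {
    case 0: r = v; g = t; b = p; break;
    case 1: r = q; g = v; b = p; break;
    case 2: r = p; g = v; b = t; break;
    case 3: r = p; g = q; b = v; break;
    case 4: r = t; g = p; b = v; break;
    default: r = v; g = p; b = q; break;
  }
}

void loop() {
  readMidi();
  int sat = cc[1] * 2;   // CC1 carries r     -> color saturation (Table 1)
  int hue = cc[2] * 2;   // CC2 carries theta -> color hue
  int val = cc[3] * 2;   // CC3 carries A     -> color value
  // Assumed accelerometer behavior: tilting HugBag brightens the strip.
  int tilt = abs(analogRead(ACC_X) - 512) + abs(analogRead(ACC_Y) - 512);
  val = constrain(val + map(tilt, 0, 400, 0, 100), 0, 255);
  int r, g, b;
  hsvToRgb(hue, sat, val, r, g, b);
  analogWrite(RED_PIN, r);
  analogWrite(GREEN_PIN, g);
  analogWrite(BLUE_PIN, b);
  delay(20);
}
```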

HugBag components sketch

Left: Standard HugBag with Uniform RGB LEDstrip and accelerometer. Right: Fluffy HugBag with corrugated material and addressable Ledstrip that only lights in the area being dented.

Hug Detection

First, the Kinect sensor captures HugBag’s inner surface when there is no interaction and stores the image as a “baseline”. Any of the Kinect’s cameras can be used (depth, color, infrared), but for best results the infrared camera is recommended, as it is insensitive to the lighting of the environment and has a wider range than the depth sensor.

When HugBag is being dented, the Kinect sensor detects the area being deformed (Figure 2.a), smooths the image to remove noise, and subtracts the baseline so that only the change of the surface is processed (Figure 2.b). If the change in the surface is “big enough”, a blob is created in that area (Figure 2.c). Blobs are two-dimensional elliptical shapes with a center (Cx, Cy) and a size (CA). Each blob’s coordinates are transformed to polar coordinates: the distance from the center of the coordinate system (Cr), the angle in the coordinate system (Cθ) and the area of the blob (CA). All detected blobs are averaged into a single blob with center and size information (r, θ, A).
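
The detection itself runs inside a Max4Live device; to make the pipeline concrete, here is a rough, hypothetical OpenCV/C++ sketch of the same steps, with the blur size, threshold and minimum blob size as assumptions rather than the project's actual values.

```cpp
// Rough OpenCV sketch of the hug-detection pipeline described above:
// smooth the IR frame, subtract the baseline, threshold, find blobs,
// and reduce them to a single (r, theta, A) triple.
#include <opencv2/opencv.hpp>
#include <cmath>
#include <vector>

struct Hug { double r, theta, area; };

Hug detectHug(const cv::Mat& irFrame, const cv::Mat& baseline) {
  cv::Mat smoothed, diff, mask;
  cv::GaussianBlur(irFrame, smoothed, cv::Size(9, 9), 0);   // remove noise
  cv::absdiff(smoothed, baseline, diff);                    // change vs. baseline
  cv::threshold(diff, mask, 25, 255, cv::THRESH_BINARY);    // keep "big enough" changes

  std::vector<std::vector<cv::Point>> contours;
  cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

  // Average all blobs into one: area-weighted centroid plus total area.
  double sumX = 0, sumY = 0, sumA = 0;
  for (const auto& c : contours) {
    cv::Moments m = cv::moments(c);
    if (m.m00 < 50) continue;                 // ignore tiny blobs (assumed size)
    sumX += m.m10; sumY += m.m01; sumA += m.m00;
  }
  if (sumA == 0) return {0, 0, 0};            // no interaction detected

  // Polar coordinates relative to the image center (HugBag's top).
  double cx = sumX / sumA - irFrame.cols / 2.0;
  double cy = sumY / sumA - irFrame.rows / 2.0;
  return { std::hypot(cx, cy), std::atan2(cy, cx), sumA };
}
```

Averaging all blobs into a single (r, θ, A) triple keeps the downstream mapping simple, since every interaction reduces to one location and one size.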

Figure 2. a) Image as captured from the infra-red camera, b) processed image after smoothing and baseline removal, c) detected blobs

Sound and Light

The blob information is used to produce sound and light from HugBag. Each control parameter (r, θ, A) controls a specific sound or light effect, as shown in Table 1.

Soundtracks are described in terms of dynamism (“active”, “passive”) and timbre (“raw”, “fine”), and grouped by the location in HugBag where they are produced (“top” or “sides”) (Figure 3). Light effects are described in terms of color (hue, saturation, value). Standard HugBag is lit up uniformly, while Fluffy HugBag is lit up only where the interaction happens.

Figure 3. Soundtracks grouped into two groups (Top and Sides). Interaction with the top of HugBag will allow more volume to active and passive soundtracks, while interaction with sides of HugBag will allow more volume to raw and fine soundtracks, depending on HugBag’s mood.

The distance from the center (r) controls the track volume mixer: blobs detected further from the center of HugBag give the tracks in the “Sides” group a higher volume and the tracks in the “Top” group a lower volume, while blobs detected closer to the center mix the soundtracks the opposite way. For Standard HugBag, r controls the color saturation, so blobs closer to the top will be whiter than blobs closer to the sides. For Fluffy HugBag, r is part of the location where the light effect will happen.

The angle of the blob with respect to HugBag’s center (θ) modifies the sound properties of each specific track; this could be the pitch, tone, oscillation frequency, etc., depending on the soundtrack. For Standard HugBag, θ controls the color hue. For Fluffy HugBag, θ is part of the location where the light effect will happen.

The size of the blob (A) controls the master volume for all soundtracks. For Standard HugBag, A controls the brightness or value of the color. For Fluffy HugBag, A controls the spread of LEDs that light up around the area where the interaction is happening: the bigger the blob, the more LEDs light up. It also controls the value of the color.
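
The Fluffy firmware is in the same repository; purely to illustrate the “spread” behaviour, the hypothetical snippet below (assuming WS2812-style strips driven with the FastLED library) lights a window of LEDs around the touched position, with the window size and brightness growing with the blob area A. The pin number, LED-index mapping and scaling are assumptions.

```cpp
// Hypothetical Fluffy HugBag snippet: light a window of LEDs around the touched
// position, with the window size driven by the blob area A.
#include <FastLED.h>

const int NUM_LEDS = 300;       // two 150-LED strips chained (per the parts list)
const int DATA_PIN = 6;         // assumed data pin
CRGB leds[NUM_LEDS];

void setup() {
  FastLED.addLeds<WS2812B, DATA_PIN, GRB>(leds, NUM_LEDS);
}

// center: LED index nearest the touched area (derived from the blob location);
// area:   blob size A scaled to 0-127, as received over MIDI.
void showTouch(int center, int area) {
  int spread = map(area, 0, 127, 1, 40);        // bigger blob -> more LEDs
  int value  = map(area, 0, 127, 40, 255);      // bigger blob -> brighter
  fill_solid(leds, NUM_LEDS, CRGB::Black);
  for (int i = center - spread; i <= center + spread; i++) {
    if (i >= 0 && i < NUM_LEDS) {
      leds[i] = CHSV(160, 200, value);          // hue/saturation come from the mood wheel in practice
    }
  }
  FastLED.show();
}

void loop() {
  showTouch(150, 80);   // example: touch near the middle with a medium-size blob
  delay(50);
}
```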

Control | Sound | Light (Standard) | Light (Fluffy)
r | Group mixing | Color saturation | Location center
θ | Sound quality | Color hue | Location center
A | Volume | Color value | Location spread, color brightness
Mood wheel | Mood (track mixing) | — | Color (red, green, blue)

Table 1. Control parameters mapping to sound and light effects

Controlling HugBag’s mood

Using exactly the same control parameters (r, θ, A), the resulting sound and light effects can be changed. This can be used to adapt to each specific child or situation. The gray dot can be moved around the mood wheel to adjust how the volume mixer of the tracks behaves (Figure 4). If positioned closer to the left area, the passive sounds will be louder, while positioning it toward the top area makes the finer sounds louder, and so on. For Standard HugBag, the mood wheel does not affect the lighting. For Fluffy HugBag, the mood wheel changes the hue and saturation of the color in the area being dented.
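
To make the mixing idea concrete, here is a hypothetical illustration in C++: each soundtrack group sits at a fixed angle on the wheel, and moving the gray dot raises the volume of the groups it approaches. The text places “passive” at the left and “fine” at the top; the remaining placements and the cosine weighting are assumptions, since the real mixing happens inside the MoodController Max4Live device.

```cpp
// Hypothetical mood-wheel mixing: gains fade with angular distance between
// the gray dot and each soundtrack group.
#include <cmath>
#include <algorithm>

const double PI_ = 3.14159265358979;
// Assumed placement around the wheel: fine = top, passive = left,
// raw = bottom, active = right.
const double GROUP_ANGLE[4] = { PI_ / 2, PI_, 3 * PI_ / 2, 0 };

// dotAngle: angle of the gray dot on the wheel, in radians.
// gains:    output volume (0..1) for the fine/passive/raw/active groups.
void moodGains(double dotAngle, double gains[4]) {
  for (int i = 0; i < 4; i++) {
    // Full volume when the dot points at a group, fading out opposite to it.
    gains[i] = std::max(0.0, std::cos(dotAngle - GROUP_ANGLE[i]));
  }
}
```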

Figure 4. MoodControl max4live device. Moving the gray dot over the color-wheel will change the volume mixing of each soundtrack gradually.

Mapping of parameters to Live devices

Both Max4Live devices allow their control parameters to be mapped to up to four different Ableton Live knobs of different devices. You can select which of the four mappings to display with the “Display Mapping” selector in the top right corner of the device (Figure 5). The range of the signal can be adjusted by setting its minimum and maximum values, and it can also be inverted (0% becomes 100% and vice versa). The output signal of each parameter can be shaped by changing the curve, compressor, jitter and smoothness of the signal (a rough sketch of these transforms follows Figure 5):

  • Curve: Negative values produce an exponential curve, positive values produce a logarithmic curve.
  • Compressor: Positive values force the parameter toward the outer extremes, negative values toward the middle of the range.
  • Jitter: Adds random variation to the current value.
  • Smooth: Smooths out value changes.
Figure 5. Curve and Compressor behavior for negative, zero and positive values.
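
The exact formulas behind these controls are not documented on this page; the hypothetical C++ sketch below shows one plausible way the four controls could shape a normalized 0..1 signal, purely to make the behaviour in Figure 5 concrete.

```cpp
// Plausible stand-ins for the curve, compressor, jitter and smooth controls,
// operating on a normalized signal x in 0..1. "previous" is the last output.
#include <cmath>
#include <cstdlib>

double shape(double x, double curve, double compress,
             double jitter, double smooth, double previous) {
  // Curve (-1..1): negative -> exponential-like, positive -> logarithmic-like.
  double y = std::pow(x, std::pow(2.0, -curve));

  // Compressor (-1..1): positive pushes values toward the extremes,
  // negative pulls them toward the middle of the range.
  double c = 2.0 * y - 1.0;                                  // -1..1 around the midpoint
  double mag = std::pow(std::fabs(c), 1.0 - 0.9 * compress);
  y = 0.5 + 0.5 * std::copysign(mag, c);

  // Jitter (0..1): add random variation to the current value.
  y += jitter * ((std::rand() / (double)RAND_MAX) - 0.5);

  // Smooth (0..1): one-pole filter toward the previous output value.
  y = previous + (1.0 - smooth) * (y - previous);

  return std::fmin(1.0, std::fmax(0.0, y));
}
```

With curve = 0, compress = 0, jitter = 0 and smooth = 0, the signal passes through unchanged.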

MIDI messages to Arduino

Both Max4Live devices allow their control parameters to be mapped to one MIDI control change each. Ableton Live talks to the loopMIDI port (Figure 6.a). loopMIDI then communicates with the Arduino over the serial port using the Hairless MIDI program (Figure 6.b). The Arduino board is configured to receive the MIDI control changes shown in Table 2.

MIDI Control Change | Control Parameter (Standard) | Control Parameter (Fluffy)
Ctrl. 1 | r | x
Ctrl. 2 | θ | y
Ctrl. 3 | A | A
Ctrl. 4 | — | Green (mood)
Ctrl. 5 | — | Blue (mood)
Ctrl. 6 | — | Red (mood)

Table 2. Control parameter to MIDI control changes mapping for each type of HugBag
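
As a small illustration of how the Arduino side could consume Table 2, the hypothetical handler below dispatches on the controller number for the Fluffy version; the variable names are assumptions, not the actual firmware.

```cpp
// Hypothetical Fluffy HugBag handler for the control changes in Table 2.
int touchX, touchY, touchArea;
int moodRed, moodGreen, moodBlue;

void handleControlChange(int number, int value) {
  switch (number) {
    case 1: touchX    = value; break;   // blob location, x
    case 2: touchY    = value; break;   // blob location, y
    case 3: touchArea = value; break;   // blob size A
    case 4: moodGreen = value; break;   // mood color, green
    case 5: moodBlue  = value; break;   // mood color, blue
    case 6: moodRed   = value; break;   // mood color, red
  }
}
```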

A quick start guide and troubleshooting information for HugBag is available here: