
The Butterfly Effect Part 1 – An Interactive Projection of Lepidoptera with Accession Data Input… easy right?

In July 2022 at M Shed, we launched our exhibition ‘Think Global: Act Bristol’. It’s an exhibition that informs the public about climate change as a global issue, whilst showing how Bristol can act, and already is acting, to fight it. It’s an important topic that reaches through many aspects of society, including nature.

This interactive was conceived for the ‘Nature’ section of the exhibition. Its purpose? To allow the public to accession our collection of Lepidoptera. Each specimen is photographed with its original handwritten accession data in shot, and visitors transcribe that data through a web form on a computer set up in gallery, accompanied by an interactive projection wall.

The interactive wall element gives people a fun experience in gallery: the Lepidoptera respond to their movement in front of the wall. The wall also plays an animation after each accession entry is submitted, chosen according to the data entered by the member of the public. There are 3 animations that can be displayed, one for each classification of our Lepidoptera: butterflies, moths and extinct species.

How it Works

The interactive has a keyboard, mouse, screen, projector and camera. These carry out its two functions: accession data entry and the interactive wall. The form enables people to transcribe accession data from photos of our Lepidoptera, each taken with its paper accession data in shot. An example of one of these images is shown below.

An image of a ‘Celastrina argiolus’ with its accession data.

The form has the necessary fields, with validation measures where needed to ensure that the data entered is of use. The fields are as follows:

  1. ID Letters
  2. ID Number
  3. Species Name
  4. Collector’s Name
  5. Sighting Day
  6. Sighting Month
  7. Sighting Year
  8. Location
  9. Other Number

Data entry page with data entry points listed and a photo for transcription

All of these fields have validation that restricts what data can be entered, and some of them (Species Name, Collector’s Name, Location) have an autocomplete feature. This kicks in after 4 consecutive characters that correspond exactly to the start of one of the possible entries for that field. It helps the public get the spelling correct and speeds up data entry. Having the autocomplete appear only after 4 correct characters also deters spam entries, as a member of the public can only submit an entry once it passes all the required validation checks.
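
As a rough illustration of that rule – a Python sketch with made-up example values, not the form’s actual code – the matching logic might look like this:

```python
# Sketch of the autocomplete rule only, not the gallery code. A suggestion
# appears once at least 4 typed characters exactly match the start of one
# of the known values for the field.

KNOWN_SPECIES = ["Celastrina argiolus", "Vanessa atalanta", "Pieris brassicae"]  # example values

def suggest(known_values, typed):
    """Return a completion once 4 or more characters match a known value."""
    if len(typed) < 4:
        return None  # too few characters: no suggestion yet
    for value in known_values:
        if value.lower().startswith(typed.lower()):
            return value
    return None

print(suggest(KNOWN_SPECIES, "Cel"))   # None - only 3 characters typed
print(suggest(KNOWN_SPECIES, "Cela"))  # 'Celastrina argiolus'
```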

Screenshot of a data entry point showing an autofill suggestion for a species that could be entered.

Once the data is entered correctly and submit is pressed, a loading screen appears; it stays up until the animation corresponding to that type of Lepidoptera is shown on the interactive wall.

This interactive wall uses an ultra-short-throw projector to front-project Lepidoptera onto a wall in gallery. Because the projector is mounted very close to the wall, it is hard for people to cast shadows on the image. As we were not able to rear-project, this is the next best setup, and it achieves an image over 3 and a half metres wide, which gives a good area for interaction.

There is a Kinect Azure mounted away from the wall which captures a depth image of everything in shot. This depth image is used to detect motion in front of the wall, which in turn affects the butterflies around the area where the motion is made. More Lepidoptera build up on the projection with each entry made over the course of the day.

How it Works: The Nerd Version

The interactive runs on two systems, with one referencing the other. The data entry system is a Python Flask server, which runs on Apache and can run on either a Windows PC or a Linux server, though I have yet to run the server version in gallery, due to some outstanding compatibility improvements and, as of typing, not having been able to sort the terms and conditions for this exhibition.

The server serves the client the data entry form with a randomly chosen image for transcription alongside it, and the data inputted for each entry is saved to a timestamped JSON file. This file contains all the data fields as well as the filename of the image, meaning that all the data can be linked and sorted through afterwards in order to upload it to our database. The server also updates a file that records the latest species entered; this is used by the interactive wall’s system to trigger animations.
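
A minimal sketch of what that server-side logic might look like – the field names, paths and the ‘latest_species.txt’ filename are assumptions for illustration, not the production code:

```python
# Minimal Flask sketch of the entry-saving logic described above.
# Field names, paths and 'latest_species.txt' are illustrative assumptions.
import json
import time
from flask import Flask, request

app = Flask(__name__)

@app.route("/submit", methods=["POST"])
def submit():
    entry = dict(request.form)  # all validated fields, incl. the image filename
    # One timestamped JSON file per entry, so everything can be linked back
    # to the photographs and sorted before uploading to the database.
    with open(f"entries/{time.strftime('%Y%m%d-%H%M%S')}.json", "w") as f:
        json.dump(entry, f)
    # Update the file the interactive wall watches, to trigger an animation.
    with open("latest_species.txt", "w") as f:
        f.write(entry["species_name"])
    return "OK"
```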

The interactive wall runs on a Touchdesigner project I created, which uses a Kinect Azure to see people and know where to apply movement to the Lepidoptera in the projection. Touchdesigner is a real-time visual development platform for creating interactive installations; its node-based programming environment allows interactives like this to be built in good time. The project uses a particle system (particleGPU) fed by 3 videos, one each for butterflies, moths and extinct species. These videos are mapped onto 2D planes that move and rotate in 3D space; these are the ‘particles’. The particles are affected by optical flow, which Touchdesigner generates by analysing motion in the depth image; areas where it believes there is motion are then applied to the particleGPU video to move the particles in those areas.
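
Touchdesigner does the optical flow analysis internally, but as a rough standalone illustration of the idea – a conceptual sketch, not part of the project – this is how motion between two successive depth frames could be estimated with OpenCV:

```python
# Conceptual sketch: dense optical flow on two successive 8-bit depth frames.
import cv2
import numpy as np

def motion_field(prev_depth, next_depth):
    """Estimate per-pixel motion between two 8-bit depth frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_depth, next_depth, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    return magnitude  # strong values mark areas where particles get pushed

# Synthetic example: a bright block that shifts 5 pixels to the right.
a = np.zeros((120, 160), np.uint8); a[40:80, 40:80] = 255
b = np.zeros((120, 160), np.uint8); b[40:80, 45:85] = 255
print(motion_field(a, b).max() > 0)  # True: motion detected in that region
```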

For the entry animations that play when an entry is made by the public, there are again 3 videos, one each for butterflies, moths and extinct species. When the Flask server signals that it has had a new entry, Touchdesigner checks which animation should be played, making sure it corresponds to the relevant Lepidoptera, and overlays that video onto the particleGPU output. This process works, however it is not instantaneous, and it’s one of the elements of this interactive I wish to improve for future use.
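
As a hedged sketch of that signalling, here is what the watching side could look like as plain Python – the real trigger logic lives inside the Touchdesigner project, and the file name and category lookup are assumptions:

```python
# Sketch of the file-watch that triggers entry animations. The real logic
# lives inside the Touchdesigner project; the names here are assumptions.
import time

ANIMATIONS = {"butterfly": "butterfly.mov", "moth": "moth.mov", "extinct": "extinct.mov"}

def classify(species):
    """Placeholder: the real project maps each species to one of three categories."""
    return "butterfly"

last_seen = ""
while True:
    with open("latest_species.txt") as f:
        species = f.read().strip()
    if species and species != last_seen:  # a new entry has been submitted
        print("play", ANIMATIONS[classify(species)])
        last_seen = species
    time.sleep(0.5)  # the polling interval is one source of the delay noted above
```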

What’s next?

As of typing, the exhibition has yet to finish, and I am hoping to add some improvements to the interactive before it’s derigged, as having it in gallery is a good test bench for making solid changes. These changes include:

  • Rework the CSS to improve compatibility on smartphones
  • Get the Linux version up and running on our server so the public can enter data on their own devices
  • Decrease the latency between the two systems by taking a different approach to their communication (see the sketch after this list)
  • Add analytics to the Touchdesigner project so we can gather data
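
For that third point, one possible approach – a sketch, not a settled design – would be to push each new entry to the wall over the network instead of writing and polling a shared file; Touchdesigner can listen for this with a UDP In DAT. The host, port and species value below are placeholders:

```python
# Possible lower-latency approach: push each entry over UDP instead of
# polling a shared file. The host and port are placeholder values.
import socket

WALL_HOST, WALL_PORT = "192.168.0.20", 7000  # the interactive wall's machine

def notify_wall(species):
    """Fire-and-forget datagram so the wall reacts as soon as an entry lands."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(species.encode("utf-8"), (WALL_HOST, WALL_PORT))

notify_wall("Celastrina argiolus")
```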

As of typing, we have over 1500 entries from the public, which should enable us to have hundreds of these Lepidoptera catalogued – fantastic news for us! I think this interactive has big potential for other museums, and I’m hoping that I can provide versions of it to other sites in future.

It’s currently planned that this interactive will return as a permanent installation, so I intend to make these additional changes for that. I will post a second blog on labs once I’ve done some upgrades and analysed the data we have gathered from this exhibition.

Special thanks to Bristol Museums Development Trust and the ‘Think Global: Act Bristol’ exhibition for making this all possible.

How to get rid of VGA after 30 years!

Here at M Shed in Bristol, we have amazing views of the harbour from our lovely events suite. Here we hold all sorts of events, from large annual AGMs for corporations to weddings and some really great community events.

We have a fully automated, integrated audio-visual system. With AMX and Crestron control systems, you can walk around the function rooms holding a smart touch-screen control panel and control just about everything! You can power up the projectors, lower the screens, open and shut the blinds, control volumes, select what to display from Sky TV, Blu-ray players and laptops; you can even change the lighting to any colour scheme you want.

It’s all pretty smart. Pretty smart apart from the dreaded Video Graphics Array as the main interface, more commonly referred to as the VGA connector! For all this advanced technology, presenters still have to connect their devices with a cable.
The VGA standard was invented in 1987 by IBM, and its dreaded 15-pin D-Sub connector still, to this day, refuses to go away.
Until now…

There’s something amiss when a presenter asks to use their nice, brand new iPad to run their presentation and you then have to use a Lightning-to-VGA adapter connected to a 10-metre VGA cable. VGA connectors were designed for permanent installations, so when they are swapped between laptops and other devices several times a day, the 15 tiny pins take a battering, and it only takes one bent pin for the screen to go pink, blue or stop altogether.

Here comes the ingenious solution: taking advantage of the wireless/Wi-Fi capabilities that are now standard on all devices.

The solution comes in the form of readily available, off-the-shelf technology, combined in such a way that it allows a device’s screen to appear on our projection system without any wires. We needed this to be added to our current system without affecting its existing capabilities. It is already a great integrated AV system; it just needs to be brought into the future without losing the ability to use the old VGA inputs. VGA may be old, but it works so well as a last resort and backup.

Apple products long ago ditched VGA in favour of Mini DisplayPort and Lightning ports. A quick trip to any Apple store and an assistant will enthusiastically show how, with a flick of the device, a display can be ‘thrown’ to another screen. It’s called AirPlay and is Apple’s secure version of Wi-Fi streaming.

Google, with their ever innovative developments, have developed a technology called Chromecast to the same effect, which is also based on Wi-Fi streaming.

With delegates at our events bringing Apple products, PCs and Android devices, we needed an all-in-one system, so we purchased these products to enable this streaming. I ordered an Apple TV and a Chromecast, both of which work by connecting to a Wi-Fi network and looking for compatible devices. Between them they provide a solution for all devices. The Chromecast is much cheaper than the Apple TV and can support Apple products too, but the ease of use and reliability of Apple on Apple seemed worth the extra investment. I calculated the cost of replacement VGA cables, and at the current rate we replace them, these new items would pay for themselves in just three years!

The main issue I faced in integrating these was how to patch them into a fully automated, closed AV system without affecting its capabilities. In essence, how to ‘retrofit’ an Apple TV and Chromecast and get the systems to talk over M Shed’s Wi-Fi – a public network, effectively part of the council’s IT network and heavily locked down.
To solve the first issue, I had to literally climb into the AV racking system to find a suitable part that interfaced with an HDMI connector (both the Chromecast and the Apple TV use HDMI). I chose our Sky TV box and unplugged its HDMI cable. Onto this cable I placed an HDMI switcher, which allows 4 inputs to connect as one. The switcher is the sort of device you would buy if your TV at home only had one HDMI port and you had multiple devices to connect: a DVD player, a games console and a Freeview box. I then connected the Sky box to the switcher, along with the Apple TV and the Chromecast. Then, after finding power outlets, whilst still inside the AV rack, I carefully slid the switcher unit so its control switch faced out of the front of the rack. A few cable ties and some Velcro later, the hardware was installed; all that was left to do was to climb out and check it all worked.

Going back to the Crestron AV touch panel, I selected Sky TV and sure enough it appeared on the projection screen as it should. Then, using the controls on the switcher unit, I was able to toggle between Sky, Apple TV and Chromecast.
It then occurred to me that both the Apple and Google devices use HDMI to output their audio too. However, the HDMI feeds the projector, which only projects the image, so the audio would be lost. Climbing back into the AV rack, I noticed that the Sky box was using analogue RCA connectors to output its audio to the integrated ceiling speaker system. Fortunately the switcher also had a 3.5mm TRS output (a headphone socket), so by setting the Sky box to output audio through its HDMI, all three devices were now feeding both audio and video to the switcher. Then, by plugging the Sky box’s old RCA connection into the switcher’s TRS output via an adapter, all three devices were feeding the ceiling speaker system. I climbed back out of the rack and started to create a new, independent Wi-Fi network for the devices to communicate.

The new Wi-Fi network was actually the simpler part.
I purchased an ASUS RT-AC3200 Tri-Band Gigabit Wi-Fi router. This router is enormous, with six aerials, and looks like the Batmobile. I figured that it would have to be reliable and able to cope with a large amount of data traffic, so I got the most powerful yet cost-effective router I could find.

The idea behind the router was to have all the devices (the Apple TV, the Chromecast and whichever device is streaming) on the same network, a network I could manage. Once on the same network, it was just a matter of connecting. The Apple system was really straightforward: you join the same Wi-Fi network as the Apple TV (I named the network ‘presentations’), then choose the AirPlay option on your device, and as easy as that the screen is mirrored on the projector. The Chromecast setup was a little more involved. With an Android device, you have to install the Chromecast app; once installed, it’s quite straightforward to pair with the Chromecast receiver, and then the screen can be mirrored on the projector. With a Windows PC laptop, I had to install the latest version of Chrome, which comes with the option to cast either just the browser tab you’re using or the whole desktop. This works well, though compared to the Apple TV there is a slight lag. In some instances you would have to install the Chromecast extension for Chrome.

I also connected the Wi-Fi router to our open Wi-Fi system with an RJ45 cable. This allows people on the presentations Wi-Fi to still access the internet.
We are still trialling the system before we start to officially offer it as part of a package, but so far so good. It has been received very positively by users. We’ve had people walking around with iPads, controlling their presentations and not being tied to the lectern with an old PC. We’ve even had the best man at a wedding wirelessly control the music playlist from his iPhone at the top table! PCs are still being used at the lectern as normal, but without the need to trail VGA cable everywhere. The only thing left to work out is wireless power… I suppose batteries will have to do for now.

How to make two 120FT cranes talk to each other

Here at M Shed Bristol, we have some great working exhibits from the bygone era of Bristol Harbour’s industrial past: steam engines, steam boats, steam cranes and more. But the most recognisable and iconic are the four great towering electric cranes standing over 120 feet above the old docks.

As the Industrial Museum was being transformed into the present-day M Shed, two of the cranes would strike up conversations with each other, entertaining and informing passers-by about what they could look forward to seeing inside the new museum. However, due to renovations and the movement of the cranes, they fell silent again…

A few years later, due to popular demand, I was tasked with bringing the cranes back to life!

Getting these cranes talking again was going to require rebuilding the whole audio and lighting system and recording new scripts. We were fortunate enough to have Alex Rankin, from our M Shed team, lend his penning abilities to the new scripts, and Jacqui and Heather to voice the new crane characters.

To record the dialogue, we arranged to meet in a nice quiet corner of the L Shed store room. It’s a vast store, full of so many objects that there isn’t enough space to have them all on permanent display. With Jacqui and Heather sat at opposite ends of a table, I set up a pair of good-quality condenser microphones, each plugged into its own channel on my external sound card, an Akai EIE 4-channel USB sound card with great preamps and phantom power for the mics. This in turn was hooked up to my MacBook and a copy of Logic Pro. I recorded through each script a few times and was able to compile a seamless recording from the various takes. Once finished, I hard-panned the two channels left and right, so that on playback each voice would have its own speaker, left or right – crane 1 or crane 2.
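
The mix itself was done in Logic Pro, but for anyone wanting to script that hard-panning step instead, a sketch with pydub (the file names are assumptions) would look roughly like this:

```python
# Sketch only - the real mix was compiled in Logic Pro. This shows the same
# hard-panning idea with pydub; the file names are assumptions.
from pydub import AudioSegment  # pip install pydub (requires ffmpeg)

crane1 = AudioSegment.from_file("crane1_voice.wav").pan(-1.0)  # fully left
crane2 = AudioSegment.from_file("crane2_voice.wav").pan(+1.0)  # fully right
mix = crane1.overlay(crane2)  # each voice now owns one speaker, i.e. one crane
mix.export("cranes_dialogue.wav", format="wav")
```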

To start building the new AV system, I searched around the vast L Shed stores and work rooms to find what was left of the old system, then decided what could be reused and what new equipment would be needed. I had been informed by our volunteer team for the working exhibits that everything had been removed from the cranes themselves; this meant starting from scratch.

The cranes themselves would need a loudspeaker system for the voices, and the crane cabs would need different coloured lights to flash in time with the talking, as this helps to animate the cranes. That part was relatively easy: it meant scaling the cranes, bolting speakers to their undersides and mounting lamps inside the cabs. I’ll be honest, I was helped by the volunteer team and a huge mobile diesel-powered cherry picker!

The hard part was how to feed the power and audio cables to the cranes. After some investigation, it turned out that below the surface of the dockside is a network of underground pipes leading to the base of each crane to feed their power. The great volunteer team once again worked miracles and fed over 600 combined metres of audio and lighting cable for me. This all led back to the clean room in their ground-floor workshop. With all the cabling done, I just needed to build a lighting control and audio playback system.

My design solution, using what kit I could find plus a few new bits, was to use a solid-state compact flash media player, a graphic equaliser, an audio mixing desk and a power amplifier for the audio. To have the lights flash in time with the dialogue, I used a two-light controller with a light-to-sound module, similar to what a DJ might use to have their disco lights flash to the music!

By having the audio go through the mixing desk, I was able to take an audio feed for each channel and direct them to the lighting controllers. Recording the two voices in stereo, with each voice on its own left or right channel, meant I only needed one media player and could easily control each channel on the sound desk. The graphic equaliser allowed me to tweak the speakers to acoustically fit their environment.

I looked at randomising the audio or having it triggered by people walking past, but with the number of people who pass outside M Shed, the cranes would be chatting away non-stop all day! Instead, I decided to create a long audio file of about 3 hours, with the different recorded scripts separated by random intervals of silence. These range from 5 minutes to 20 minutes, so it always comes as a surprise when the cranes start talking to each other.
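
The 3-hour file was put together by hand, but the same assembly could be scripted. A hedged sketch with pydub, assuming one audio file per recorded script:

```python
# Sketch of assembling the long playback file in code - the original was
# compiled by hand, and the script file names here are assumptions.
import random
from pydub import AudioSegment

scripts = [AudioSegment.from_file(f"script_{i}.wav") for i in range(1, 6)]

playback = AudioSegment.silent(duration=5 * 60 * 1000)  # open with 5 min of quiet
while playback.duration_seconds < 3 * 60 * 60:          # build up to ~3 hours
    playback += random.choice(scripts)                  # a conversation...
    gap = random.randint(5, 20) * 60 * 1000             # ...then 5-20 min of silence
    playback += AudioSegment.silent(duration=gap)

playback.export("cranes_playback.wav", format="wav")
```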

The results are really effective. It is always fun to see people caught by surprise as the cranes light up and strike up a conversation, and to see them stop and listen in on what the cranes have to say.