Monthly Archives: January 2015

Websites coming out of the woodwork

Screenshot of the Portcities website

I’ve worked at Bristol Museums for just over two years now, and still now and then I’ll be chatting to someone or receive an email saying “oh, did you know that such and such website is ours?”, which I then add to my growing list and maybe have a little grumble to myself about.

Now, on the one hand, it’s great that people are telling us about these (anyone else want to let us know of any more, please?) but on the other it creates a bit of a headache for us in keeping track of exactly what content of ours is online and how people are using it.

It’s easy to just assume that, because they’re pretty old and incredibly out of date in some (most) cases, they’ve been forgotten about and people don’t use them. This isn’t necessarily the case, though.

One example of this is the Portcities website – http://discoveringbristol.org.uk/ – which was made around 2003. It gets a huge amount of traffic: just over 470k unique pageviews in 2014, approaching half of what our main website www.bristolmuseums.org.uk gets (around 1m a year and growing).

I looked at the analytics for this with Jane from our Learning team recently, and there are some other interesting things that we can see:

  • There’s a dip in traffic over the summer and during school holidays, suggesting it could be being used as a learning resource in term time
  • Most of the content looked at is about Bristol and Transatlantic Slavery
  • The main bulk of visitors (around 45%) are from the US – nearly twice as many as we get from the UK
  • 86% of visitors find the website from search

There’s clearly a purpose for this content, so we need to think carefully about what we do with it. We’re working really closely with our Learning team to try to map this out, find the opportunities and see what we can do to best serve these users.

Creating characters for our Hidden Museum app

We’ve been beavering away creating characters for our app. Creating a style for this exercised our grey matter somewhat!

We needed to create a look which complemented the very clean visual style that our lead designer for the project, Sarah Matthews, created for the app. However, from our years of experience creating characters for apps and games, we know that younger children react much better to characters with an element of realism to them.

In user testing we trialled three potential styles:

A photographic style, using photographs of artifacts from around the museum:

Photographic character

A silhouetted style, which fits very well with the UI design, but is much less realistic in style:

Silhouette character

A geometric style, which is stylised like our UI design, but has much more realism:

Geometric character style

The overwhelming vote was for the geometric style, which thankfully our production team all liked too – so we decided to go down this route.

Next we had to decide what our characters should be – we had some ideas and Gail was great at helping us hone them down. We wanted them to really represent the diversity of the exhibits at the museum, and also to appeal to all age groups and to both males and females. So we settled on: the ubiquitous dinosaur; George Peppard, the boxkite pilot who flies the plane in the main hall; a Chinese dragon head, as represented in the Chinese dragons over the stairs in the main hall; a Roman goddess, to represent the museum’s wealth of Roman artifacts; a female Egyptian mummy with gorgeous coloured paint; and… Alfred the Gorilla (obviously!)

Here are the results!

All characters

Still a bit of work to do on the Mummy to make her a little more female, but more or less there.

bristolmuseums.org.uk – Phase Two Planning

We’re now starting work on phase two of our website, www.bristolmuseums.org.uk, as Zak has already mentioned. So here’s a bit more detail about what we’re planning, once again following the GDS phases of service design.

(Note: if you’d like to read about what we did for phase one, you’re in luck – we’ve lots of posts about it on this here blog.)

We’ll be working with the guys over at fffunction in three stages over the next three months. Building on the evaluation of user needs from phase one, we’re going to be focusing on things that generate revenue and make it easier for people to book with us – whether that’s improvements to the what’s on sections (which get the majority of visits), learning, or venue hire.

Milestone 1 – January 2015

Updates and work carrying on from phase one: opening times, events filtering, navigation and the what’s on sections.

Milestone 2 – February 2015

Workshops with the programming, learning and venue hire teams to really get to grips with what our users need from us online in these areas.

Milestone 3 – March 2015

Workshopping and implementing a ticketing solution for the above, making our online shop look a bit nicer and researching and implementing online donation functionality.

We’ll keep you posted with how it’s going and what we discover.

Testing the stories/tour – Kids in Museums User Testing Day

At its core, our Hidden Museum app takes users on a tour around Bristol Museum, guiding them to places in the museum they might not ordinarily go, and revealing hidden information as they progress through the game.

To simulate this experience our assistant creative director, Rich Thorne, acted as the app using one of the story-tours we had created: he led the group around the museum to a range of artifacts on a theme (in this case ‘horses’), engaging them in conversation about each artifact as they went – explaining how they were linked and telling them an interesting story about each artifact once it was found.

Kids on tour questions

The aim of this was to:

  • See how long it took them to get round the journey as a group.
  • Get feedback from the children on any standout points of interest on the tour.

Key observation points for the supervisors of the tour were:

  • Is the tour engaging, interesting?
  • Is the tour too long/short?
  • Which object was the favourite on the journey?
  • Have they visited the museum before?
  • Have they ever been to the top of the museum?
  • Is it more fun having a checklist, as opposed to walking around the museum looking at everything?
  • Do they have ideas for trail themes they would like to go on?

Kids in museums kids look at cabinet

What we found:

We discovered that the tours we had devised gave away too much in advance about what the group was going to see. For example, the horse trail said upfront that the group would be finding objects relating to horses, which took some of the excitement out of the tour. We needed to find a way to broaden the themes out so that they became more subjective and felt less curated.

We also learnt that this kind of curation meant that we were not making the most of the ‘hidden’ metaphor of our app. Whilst we were leading the group to areas they might not otherwise have gone to, it did not allow for enough free exploration of rooms and free thought around the objects themselves. Getting lost in the museum and ‘accidentally’ discovering something hidden should be a desirable side effect of the app, so we needed to find a way to allow for this.

The groups, particularly the kids, were very interested in collecting and counting. They were also particularly keen on emotive subjects – picking out items which were ‘weird’ or ‘strange’ or ‘scary’ or ‘cute’.

Kids in Museums – User Testing Day – Overview

On Friday 21st November, the Hidden Museum team were lucky enough to have the opportunity to be part of a Kids in Museums Takeover Day. Kids in Museums are an independent charity dedicated to making museums open and welcoming to all families, in particular those who haven’t visited before.

Kids in Museums Logo

This was a great opportunity for us to test our app, mid-production, with real users: over 30 kids and 6 adults who had never seen the app before! A real coup for us as developers – a chance to get some real insight into how our app might be received on completion.

However, we were conscious that we did not want to take advantage of the day and its main aim of making museums more open to kids and families. So we took great care to plan the day with the education coordinators at the museum, Naif and Karen, to ensure that we were providing a fun experience for the kids as well as testing elements of our app. As there were adults supervising the kids, we also tested all elements with the supervising adults to get an impression of a family group’s opinion of each element (rather than kids’ opinions only) – very important, as our app is aimed at a mixed-age group.

Kids on tour

Naif and Karen suggested that we warm the kids up with a ‘fingertips explorers’ activity, where the kids felt an artifact while blindfolded and had to describe it to their friends, with the friends guessing what it could be. (A fun game which the kids really enjoyed, and which we have used as an influence for one of our games as a result!)

We decided that after the fingertips explorers warm-up the kids would be ready for app testing!

Kids in museums kids looking at zebra

As the app was not yet complete, we decided to test elements of the app individually rather than the experience as a whole, breaking our testing down into four elements:

  • Testing the stories/tour
  • Testing the iBeacons/compass interface
  • Testing the UI
  • Testing the games

This decision was reached mainly through necessity, since the app was not complete. However, we found that it really worked for us and allowed us to get some really in-depth insight into our app. This was for two reasons: firstly, it allowed us to break the group of children down into smaller, more manageable groups, so we were able to have real conversations with each of the children in turn; and secondly, it allowed us to assess which elements of the app they struggled with the most, and so exactly where we should be making our improvements.

There were lots of great testing outcomes to the day around each of the app elements outlined above – I’ll update the blog with how we tested each element (and the associated learnings) over the coming days.

Starting website phase two

In 2014 we launched www.bristolmuseums.org.uk in what we imaginatively called ‘phase one’, which not only gave us a service-wide presence but allowed us to:

  • follow the GDS service delivery approach for the first time – discovery, alpha, beta to live
  • identify real user needs through a discovery phase
  • plan to make the website a platform to help us deliver services for years to come
  • share publicly – through a conference, this blog, workshops and other events – everything about the project
  • keep my job – I half joke that I’d have quit if I wasn’t able to get the project done

We learned lots from doing this project and have done amazingly well with our key performance indicators.

With more than six months’ data under our belts, as well as a ratified new service structure and direction for 2015-18, we are turning to phase two. Fay Curtis will be leading this project between January and April 2015. Check back regularly for updates.

Moved by Conflict exhibition Character Points

Zahid Jaffer, Content Designer

Image of an ultrasonic sensor
Ultrasonic sensor

Overview

The Moved by Conflict exhibition at M Shed uses many different types of technology to interpret content, from projectors to speakers. To deliver this content we also used some technology that was new to us, notably an RFID tag system.

We had several briefs, but the one that stands out is that visitors needed a personalised experience through the exhibition – the ability to have content of their choice delivered to them through digital means. The idea was to have stories told through video, and we worked with Bristol Old Vic to bring a more theatrical performance to these stories. We had actors playing six fictional characters telling their stories, capturing their lives before, during and at the end of the First World War.

Concept 

We needed a way for visitors to trigger the content when they wanted to experience it. Initially we wanted hidden video screens (projections) around the exhibition, so that when a visitor walked up to one the video would magically appear for them. To do this we looked into iBeacons, a Bluetooth technology which can be used to trigger an activity at a specified distance from the user, for example playing a sound when someone gets within two metres of a loudspeaker. Our concept was that when someone got within a metre of a screen the content would appear, and when they left that area the content would turn off. The trigger device would be a visitor’s smartphone or a small Bluetooth transmitter/tag.
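
To make the idea concrete, here’s a minimal Python sketch of that proximity logic. Everything in it is a stand-in: iBeacons don’t report distance directly, so a real version would estimate it from signal strength (RSSI) and smooth out the noise.

    import time

    TRIGGER_RANGE_M = 1.0  # show the content within roughly a metre

    def estimated_distance_m():
        """Stand-in reading. A real version would estimate distance
        from the beacon's RSSI and smooth out the noise."""
        return 0.8  # pretend the visitor is 0.8 m from the screen

    def start_video():
        print("fade the hidden video in")

    def stop_video():
        print("fade back out to the hidden state")

    playing = False
    while True:
        near = estimated_distance_m() <= TRIGGER_RANGE_M
        if near and not playing:
            start_video()
            playing = True
        elif not near and playing:
            stop_video()
            playing = False
        time.sleep(0.2)  # poll a few times a second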

Image of a media player
Media player

After a lot of research we found that this would cost a lot of money and take a lot of time to develop – the technology is still very new, which is why it costs quite a bit. We then looked at long-range RFID technology, but this was also outside our budget. We decided to go for short-range RFID, meaning a visitor would need to pick up an RFID wrist band and scan it in a specific location. As we were still keen on the idea of content being triggered at a certain distance, we’d also need a sensor – one which wouldn’t trigger the main content, but would trigger an intermediate screen, such as an image with instructions telling you what to do with the RFID wrist band.

Once we had finalised the concept we started looking into the equipment that would enable us to do what we wanted. We looked at a number of options, and ultimately what we went for worked very well. The content is displayed on a 24 inch screen in portrait orientation, showing an actor speaking to camera with their head and shoulders in shot, giving the actor lifelike dimensions. We needed something that would play the content and accept triggers, so we looked into the Raspberry Pi. For what we wanted to do there would be a lot of programming and coding, and we were also not sure if the Raspberry Pi would be instant enough on the triggering, as we were informed it could have a slight delay when triggering HD content. We wanted instant triggering and relatively easy setup/programming as we were limited on time, so we went down the route of a media player.

We selected a BrightSign HD1020 media player, which has GPIO (allowing you to connect buttons or sensors to trigger content) and a USB input so you can connect a keyboard to it. Programming the media player is relatively easy, as it has graphical programming software you load onto your PC. These three elements were what we needed to make our concept work.

photos of the Character point (left), directional speaker (middle) and inside the character point (right)
Character point (left), directional speaker (middle) and inside the character point (right)

Concept to Reality 

The GPIO is connected to an ultrasonic sensor, which sends out a high-pitched audio pulse (well above human hearing) and listens for the echo to return. The sensor allows you to increase or decrease the sensitivity, meaning you can set the distance at which it triggers. It also has ‘stay open’ and ‘stay closed’ states: while a person is watching the content the sensor stays in the open state (as it is still detecting an object in front of it), and once the person steps out of the sensor’s range it switches to the closed state and the content finishes.
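
Our sensor is wired straight into the media player’s GPIO rather than driven from code, but to show the underlying idea, here’s roughly how the same distance measurement works on the Raspberry Pi route we’d considered – a sketch assuming a common HC-SR04-style sensor and hypothetical pin numbers:

    import time
    import RPi.GPIO as GPIO  # assumes a Raspberry Pi

    TRIG, ECHO = 23, 24  # hypothetical GPIO pins (BCM numbering)

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)

    def distance_cm():
        # A 10 microsecond pulse on TRIG asks the sensor to ping
        GPIO.output(TRIG, True)
        time.sleep(0.00001)
        GPIO.output(TRIG, False)

        # Time how long ECHO stays high: that's the sound's round trip
        start = end = time.time()
        while GPIO.input(ECHO) == 0:
            start = time.time()
        while GPIO.input(ECHO) == 1:
            end = time.time()

        # Speed of sound is ~34,300 cm/s; halve for the one-way distance
        return (end - start) * 34300 / 2

    # The 'open state' corresponds to someone within trigger range
    while True:
        print("open" if distance_cm() < 100 else "closed")
        time.sleep(0.5)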

The USB port on the media player is used to connect a close-range USB RFID reader, which detects the RFID wrist bands that visitors pick up. We’ve also used a directional speaker to limit sound spill in the gallery and give the visitor a more personal experience. With all these elements combined, the way it works is:

  1. On the screen the visitor sees a static attractor image
  2. As the visitor gets closer to the screen, the motion sensor detects them
  3. This triggers the content on the screen to change to an image with instructions asking them to scan their RFID wrist band on the pink square (the RFID reader is directly behind the pink square)
  4. Scanning the wrist band then triggers the character’s video content

Photo of a media player and audio amplifier
Media player and audio amplifier
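
That flow is essentially a small state machine. Here’s a minimal Python sketch of it, with the sensor, reader and video checks stubbed out (as an aside, many USB RFID readers behave as a keyboard and simply ‘type’ the tag ID, which is part of what makes them easy to wire up):

    import time

    def visitor_present():
        """Stub for the ultrasonic sensor: True while someone is in range."""
        return False

    def wrist_band_scanned():
        """Stub for the RFID reader: True when a wrist band is scanned."""
        return False

    def video_finished():
        """Stub: True once the character's video has played through."""
        return True

    state = "attractor"
    while True:
        if state == "attractor" and visitor_present():
            state = "instructions"   # show the 'scan your wrist band' screen
        elif state == "instructions":
            if wrist_band_scanned():
                state = "video"      # wrist band scanned: play the character
            elif not visitor_present():
                state = "attractor"  # they walked away before scanning
        elif state == "video":
            if video_finished() or not visitor_present():
                state = "attractor"  # video ended, or they walked off
        time.sleep(0.1)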

If visitors read the instructions and decide they don’t want to view the content, they can step away: the sensor will detect there is no one in front of it and switch back to the attractor image. Likewise, if a visitor triggers the video content with the RFID wrist band and then decides they’d rather not watch any more, they can step away, the sensor will detect there is no one there, and the video will end and go back to the attractor image. In the exhibition we have six of these RFID interactives; we’ve named them Character Points.

Concept to Reality Issues

We quickly realised that there was an issue with the triggering. The sensors were not staying in the open state: they would flick between the closed and open states repeatedly, which meant the content wasn’t staying on the screen for long. To overcome this we bought a timed relay and wired it into the sensor. The relay activates when the sensor detects a person and holds the sensor in the open state – we set the hold time to 10 seconds. The relay can be retriggered even while it’s holding, meaning the timer is continuously reset to 10 seconds for as long as the sensor is detecting something. Now when a person steps out of the sensor’s range the content stays on screen for 10 seconds and then switches back to the attractor screen.
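
In software terms the relay behaves like a retriggerable timer: every detection pushes the ‘switch back’ deadline another 10 seconds into the future. A minimal sketch of the same idea, with the sensor reading stubbed out:

    import time

    HOLD_SECONDS = 10  # how long to hold after the last detection

    def visitor_present():
        """Stub for the raw, flickery sensor reading."""
        return False

    deadline = 0.0
    content_on = False
    while True:
        if visitor_present():
            # Retrigger: every detection resets the countdown
            deadline = time.time() + HOLD_SECONDS
            if not content_on:
                content_on = True
                print("keep the content on screen")
        elif content_on and time.time() > deadline:
            content_on = False
            print("switch back to the attractor screen")
        time.sleep(0.1)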

Photo of the internal components of the Character Points
Internal components

Another issue we had was that some visitors decided to poke their fingers through the holes that the sensor’s microphones stick out of (these need to be exposed, otherwise the sensor will not work – you can see the microphones in the photo of the sensor above). The sensor would get dislodged and fall inside the Character Point. We tried using glue and silicone to stick the sensors to the door, but visitors still managed to push them through. In the end we found that good old gaffer tape holds the sensor in place and withstands a lot of force if someone tries to push it through.

Now that we have the equipment to do this kind of interactivity, we’ll be using it in other interactives. Hopefully in the future we can expand it into a long-range RFID system.