Category Archives: The Hidden Museum

This project will build and test a museum multitool app that makes family and group visits to Bristol museums more fun and playful. The app will promote group interaction directly with the museum, its displays and hidden treasures. The focus is on improving visitor experience in museums and galleries, many of which cite engagement with families as a key goal. New location-aware digital technologies such as iBeacons can be used to explore, improve and promote more effective visitor engagement and to encourage higher levels of intergenerational or group activity and learning.

4.2 Alex and Fay test the app

We (Alex – volunteer co-ordinator, and Fay – user researcher) spent a lovely half an hour wandering round the museum with the app, discovering some things that we hadn’t discovered before – we thought it was very fun and simple to use.

We got a little confused at first as it asked how many players there were and took our ages – we thought this meant we might be playing against each other (and got ready for a bit of competition!). The only game that did ask us to play together was in the geology gallery, where it asked us to find something broken. We were a bit confused by this, what with it mostly being natural objects like rocks and fossils. We later realised that there was a broken bone that we could have chosen, but this did require some creative thinking! We also weren’t too sure what the numbers next to the game themes were – were these our scores? We worked out they were the number of games that we had completed. Should they be reset each time someone new starts playing?

At one point we were sent along to the Curiosity gallery, where we were asked to find an object – a broken pot – which we couldn’t find anywhere! Turns out it was in the under 7s area (where you have to take your shoes off). We wondered why it might take us in there, being two adults… By this point we’d run out of time and the app assumed we had found it. At the end of the countdown there was nowhere to say that we hadn’t found the object, and I ended up having to take a picture of something random to move it along.

We really loved the things we unlocked for completing challenges, but they didn’t seem to be relevant to the areas that we were in. The first one we did gave us some lovely info about the RWA after we’d taken a picture of the ichthyosaur, and one in the geology gallery told us about objects we assumed must be in the Egypt gallery – we thought it might be nice to have been given snippets of info about objects in the galleries we were in, so we could go and have a look straight away before moving on to the next game.

We really enjoyed moving round the museum with the map and loved how it alerted you about where you were – that worked really well.

Supporting evidence for milestone 4.2 – informal user testing

4.2 Testing the Hidden Museum app by Mark Pajak

I’m Mark Pajak, a documentation officer for the Bristol Culture service. I have just tested the Hidden Museum app before starting work today. This is my first experience with the app, so it’s all new and I have no preconceptions to cloud my first impressions of it.


A simple and colourful ‘oversized’ design was very easy to navigate with big buttons.


I didn’t read any instructions except for those written inside each button, so following the steps the app wanted me to take was straightforward. In some cases real life got in the way of my game play, such as an impromptu meeting, but I can’t fault the app for not knowing the museum was closed before 10 so the upper gallery was locked – or can I??


Yes, as a VERY regular museum visitor I am fairly locked into a routine, so anything out of the ordinary is novel, and there are still many galleries I rarely visit – so a random object hunt was fun, and cut through the usual formalities of gallery interpretation & object arrangement to surprise me – not just with an object, but with new information about something I would normally not stop to look at.


There was a lag on the scrolling when picking an avatar; other than that, I didn’t notice the app doing anything it wasn’t supposed to.

Other stuff

It took a while to realise the app could tell which direction I was pointing in – though with hindsight my iPhone can do that, so that’s just what these devices do. This led me to consider how and why it might use that information, and it gave a certain ‘big brother’ feeling, but doesn’t everything these days? Also I have a slight aversion to taking photos with an iPad, but that’s just me :).


I could imagine someone wanting to choose a different object just because they aren’t fussed about climbing many stairs, but I guess that’s where kids come in – the challenge of winning the game is probably enough to get feet moving.


Simple, quick, attractive and fun – which is impressive and means there are some clever things going on ‘behind the scenes’, or at least that’s my preconception.

Supporting evidence for milestone 4.2 – informal user testing



4.2 Testing by a Budding Volunteer

Today I tested the iBeacons Hidden Museums app for the first time. I enjoyed the sense of exploration and involvement it brought. I had to use the map of the museum to guide me to my destination. The app let out a satisfying “ping” when I reached my destination, and I found there to be no problems with the iBeacons. I was then tasked to find an object after I had been given a short amount of time to memorise it, encouraging me to look through all the works on display as I searched for the elusive object. Upon finding the object I had the opportunity to take a picture of it, a feature I enjoyed as it would serve as a personal memory of the object.

I thoroughly enjoyed my time with the Hidden Museums app, and did not come across any glitches. An idea for improvement could be to have separate boxes on the navigation page setting out which floor and which section, so the user can clearly see where they must go.

Joel Grimmer, secondary school student

Supporting evidence for milestone 4.2 – informal user testing

Hidden museum ‘sprints 4-6’ user testing

Our project is working rapidly in a series of two-week ‘sprints’, which I’ve written about previously in ‘Working in an agile manner’. I thought I’d bundle together a few of these sprints so folks can see what we have been up to, as they all relate to the technology phase of the project from my point of view.

User testing (informal)


After our initial testing with the museum take over day it was an exciting morning on the 5th Feb for me and Gail as we were unleashed on the first properly usable ‘beta’ prototype. Laura and Jake took notes. We headed out onto the first floor balcony gallery, as shown in the above photo and fired up the app.

  1. Would the app know where we were? YES it did! After choosing our characters we were instructed by the game to head for the Egypt Gallery on the ground floor
  2. Immediately upon seeing the map I tried to press the ‘ok I got it’ button, which didn’t work. I needed to press closer to the map – valuable feedback noted by Jake, who said “mmmm interesting, we didn’t ever do that in our testing”
  3. When we got to the gallery the app kicked back into life and told us that we’d arrived in the right location – all thanks to the iBeacon technology. We got to play our first game of trying to spot the broken object. We had one minute to dash around the gallery and locate it. I found it – or so I thought. It turns out we have several broken-nosed head objects, but in my book we won that task. I really like that the app is almost a guide but disappears during the actual task, so we could enjoy the gallery.
  4. Upon completion of the game we were ready for our next challenge. Off to the Birds and Mammals gallery on the first floor, using the wayfinding feature of the app, which seemed to work but then drop out (noted by the ever watchful eyes of Jake and Laura). When we arrived at the gallery it was mostly under wraps due to a gallery refurbishment. Luckily our iBeacon remained safely tucked away on a high-level pillar – PHEW. I took the liberty of jumping over the barriers to ensure the app at least knew we’d made it to the gallery. At this point it crashed, for reasons I’ll leave to Aardman to figure out.
  5. After the app was restarted we got sent to the second floor to play two more of the challenges.

My first thoughts are that I’m very confident the use of sensors, particularly the location-aware type, is going to be critical to the service in the years to come. The iBeacon technology clearly works. Laura and Jake have just written about the details of the iBeacons themselves and the hurdles that needed to be overcome.

Using the app for the first time was genuinely exciting, and despite some small issues Aardman have pulled magic out of the bag for the games, the visual look and the user experience.


Although it is very tempting to test the app with the public, I still feel we have 1-2 major bugs that we need to stomp before handing over to the general visitor. I think if great storytelling folks like Aardman can master the opportunities of this type of sensor, we’re in for some transformational ways of engagement. Onwards.

This blog post acts as ‘milestone 3’ evidence Doc 3.1, 3.2 & 3.3

iBeacons – our experience in the Hidden Museum app

This post is a longer-than-normal summary of our experience using iBeacons in the Hidden Museum project – intended to document a few pointers for anyone considering iBeacons for their own indoor navigation system. Caution…. this does veer more toward the technical underbelly of the project, rather than the user-facing experience… which is covered in a separate post.

To give this all a bit more context, our basic setup is this: 1) a whole load of iBeacons placed around all three floors of the museum, 2) a device which uses their signals to calculate where it is in the museum, which 3) also uses its own compass to know which way it’s pointing. With these three tools our users can navigate a generated tour of the museum, getting led from room to room, floor to floor, with an app that reacts when they reach each destination on the tour.

Spoiler alert… the system works!

The beginning

From the outset our fundamental technical goal was to accurately guide a single device on a physical journey around the museum, and have it react when it reaches multiple, flexible locations.


iBeacons emit a signal that can be picked up by a mobile device, and the strength of that signal tells the device a rough distance between it and the iBeacon. With a few assumptions, this suggests that the technology allows a mobile device to pinpoint its position within a known indoor space – the two most obvious methods being triangulation-style positioning and hotspot check-in systems.
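As a rough illustration of the signal-strength-to-distance step (not the project’s actual code), the widely used log-distance path-loss model converts an RSSI reading into an approximate range. The calibrated 1-metre power and the path-loss exponent below are typical assumed figures, not measured values from the museum:

```python
import math

def estimate_distance(rssi, tx_power=-59, path_loss_exponent=2.0):
    """Rough distance (metres) from an iBeacon RSSI reading.

    tx_power is the calibrated RSSI at 1 m (broadcast by the beacon);
    the path-loss exponent is ~2.0 in free space and higher indoors,
    which is one reason indoor readings drift so much.
    """
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exponent))
```

With these assumptions, a reading equal to the calibrated power gives 1 m, and every 20 dB drop multiplies the estimated distance by ten – which shows how a few dB of indoor noise translates into metres of positional error.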

We opted for the triangulation method – as in theory, if it was successful, we would be able to apply the system to any space and cater for all sorts of navigation methods…. particularly when used in conjunction with the device compass.


If you’ve started looking into procuring iBeacons you’ll know there are loads of suppliers to pick from, and it’s not easy to see the difference (in many cases there isn’t much). After assessing a range of brands including BlueSense, Glimworm and the beautifully presented Estimote, we opted for Kontakt… primarily because they have easily replaceable batteries, are easily configurable, are the right price, supply in volume (we needed a lot), and are visually discreet. Here’s a comparison:

Supplier             Volume pricing   Price (ex VAT + shipping)
Kontakt              Yes              ~$2200 per 100 (contact for discount)
BlueSense Networks   Yes              £1499 per 100
Glimworm beacons     Yes              €1980 per 100
Sensorberg           No               €89 per 3
Sticknfind           No               $389 per 20
Estimote             No               $99 per 3
Gelo                 No               $175 per 5

Placement and security

The triangulation method requires a large number of iBeacons throughout the museum building in precise locations – effectively creating a 3D grid of signals. These need to be out of reach and ideally invisible to both the public and staff, as otherwise they might be accidentally moved, tampered-with or even taken… any of which will cause serious navigation bugs in our software. This meant that colourful and attractive iBeacons such as Estimote were out of the picture for this project.

Software choices

We decided to implement the navigation system in Unity 3D. Although it’s primarily a game engine, it is where our core mobile experience lies, it satisfies the cross-platform requirements of real-world implementations, it is popular with a super-low barrier to entry for developers, and it has very little reliance on proprietary tech.

Triangulation method in Unity

Triangulation maths

So… how best to implement triangulation in Unity? We take the perceived distance from all ‘visible’ iBeacons, and from that we work out the precise position of the device. After a few sprints of getting neck deep in advanced mathematics, we opted to use Unity’s built-in physics engine to do the heavy lifting for us – using Spring Joints from each iBeacon to automagically position the device on a virtual map, based on perceived distances from each iBeacon in range.
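Outside Unity, the same spring idea can be sketched as a simple iterative relaxation – each measured distance acts as a spring pulling the estimate toward a sphere around its beacon. This Python version is purely illustrative (the project itself used Unity’s physics engine), and the beacon positions and distances are made up:

```python
import math

def relax_position(beacons, pos=(0.0, 0.0, 0.0), iterations=500, stiffness=0.1):
    """Estimate a device position from (beacon_position, measured_distance)
    pairs by iteratively relaxing virtual springs: each spring pulls the
    estimate toward the sphere of its measured distance from that beacon."""
    x, y, z = pos
    for _ in range(iterations):
        fx = fy = fz = 0.0
        for (bx, by, bz), d in beacons:
            dx, dy, dz = x - bx, y - by, z - bz
            current = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
            # spring force proportional to the distance error,
            # directed along the beacon-to-device line
            pull = stiffness * (d - current) / current
            fx += dx * pull
            fy += dy * pull
            fz += dz * pull
        x, y, z = x + fx, y + fy, z + fz
    return x, y, z
```

Given enough beacons in range, the estimate settles where the distance errors balance out – which is also why noisy distances wobble the result and need smoothing downstream.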

Maths and Unity

Early internal testing

Below is a video of an early test in the Aardman atrium – displaying the device’s perceived position and direction within a virtual 3D model of the building as the user walks around. The bright-coloured areas on the mobile display are the two doorways. We’re not embarrassed to say that when we got this working it blew our minds a little bit.



For a triangulation system to work effortlessly the distance data it’s based on needs two things: to be accurate, and to be updated frequently.

iBeacon distance readings tend to be fairly inaccurate – with meaningful variance even in the best conditions (up to 3 metres out), and much worse in bad conditions (physical interference such as pillars or people, and electrical interference such as laptops or mobile devices). Accuracy does tend to increase the closer the iBeacon is to the device.

Frequency is also an issue. Users move around a museum space surprisingly fast… and with our system only able to read signals once a second or so it requires a lot of smoothing on the positioning data to avoid flip-outs every time an update occurs.
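One common way to smooth jumpy, roughly once-per-second position fixes is exponential smoothing, where each new fix only nudges the displayed position part of the way. This is a minimal illustrative sketch, not the project’s actual smoothing code:

```python
class PositionSmoother:
    """Exponentially smooth noisy, ~1 Hz position fixes."""

    def __init__(self, alpha=0.25):
        self.alpha = alpha   # 0..1: higher reacts faster but smooths less
        self.value = None

    def update(self, raw):
        if self.value is None:
            self.value = tuple(raw)   # first fix: accept as-is
        else:
            # blend the new fix with the running estimate, per coordinate
            self.value = tuple(self.alpha * r + (1 - self.alpha) * v
                               for r, v in zip(raw, self.value))
        return self.value
```

The trade-off is latency: a small alpha hides the once-a-second “flip-outs” but makes the displayed position lag behind a fast-moving visitor.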

The compass

The compass is a tricky little system to wrangle. It is 100% reliant on the accuracy of the device’s hardware and software… which isn’t great when it comes to smartphones and tablets. Even in the best conditions digital compasses can be up to 20% inaccurate, and in bad conditions (such as an indoor space with lots of electrical interference and organic, metal or stone structures everywhere) we’ve witnessed the reading be out by up to 90 degrees… really not ideal for leading users around a space accurately.
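A standard trick for taming noisy headings while the user is stationary is to average several readings as unit vectors rather than as raw degrees, so that wrap-around values like 359° and 1° average to roughly 0° instead of 180°. A small illustrative sketch (not the app’s code):

```python
import math

def mean_heading(headings_deg):
    """Average compass headings (degrees) by summing unit vectors,
    handling the 0/360 wrap-around correctly."""
    sx = sum(math.cos(math.radians(h)) for h in headings_deg)
    sy = sum(math.sin(math.radians(h)) for h in headings_deg)
    return math.degrees(math.atan2(sy, sx)) % 360.0
```

Averaging a second or two of readings like this knocks down jitter, though it obviously can’t correct a systematic offset caused by nearby metal.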

Three-dimensional placement

Map of Bristol Museum and Art Gallery

We knew that iBeacons work on distance, and so the height at which we placed them would make a difference. But – perhaps naively – we didn’t expect this to cause much of an issue, so long as it was consistent. We didn’t take into consideration how powerfully the signals could penetrate through floors and ceilings… and certainly didn’t foresee issues caused by open atriums and balconies.

Bristol Museum and Art Gallery is a complicated building, with vast rooms (some without ceilings), small rooms, corridors, stairwells, about 6 different levels over three defined floors, and even galleries overlooking other rooms as balconies.

In such a space, not only is it difficult to find a consistent position in which to place the iBeacons – there are many opportunities for the device to get a bit confused about which floor it’s on…. particularly when it’s picking up signals from the floors above and below that happen to be stronger than the closest signals from the room it’s physically in.

With a standard GPS system this would be like expecting it to tell you not just which side of the multi-storey car park you’re in, but which level you’re on. And while iBeacon triangulation is vastly easier in simple environments that can be mapped in two dimensions, it is still possible in three – and we actually did it in the end.


Handling shortcomings

So… there are a number of technical issues covered in this post, and each of them has led us to simplify and adapt the experience – even though the underlying tech is largely the same. We quickly learned to accept the huge variance in the quality, accuracy and timeliness of the data our navigation is based on, and to soften the blow as much as possible so that the user’s experience isn’t affected:

  1. The inaccuracies and signal latency of iBeacons led us to free our user experience from relying on pin-point positioning – and instead round the user’s position up to just the room they’re definitely in.
  2. The compass inaccuracies led us not to rely on the compass to lead users around footstep by footstep – but rather to just occasionally help them find their bearings when stationary.
  3. The issues caused by three-dimensional inaccuracies led us to create navigation logic that only recognises movement between adjacent rooms…. so that if the triangulation data suddenly suggests the device has changed floor, it is only accepted if the user has just left an appropriate stairwell or lift area.
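The adjacent-room rule in point 3 can be sketched as a simple adjacency-graph check. The room names here are invented for illustration and are not the app’s actual data:

```python
# Hypothetical room graph: which rooms directly connect to which.
ADJACENCY = {
    "egypt": {"front_hall", "curiosity"},
    "curiosity": {"egypt"},
    "front_hall": {"egypt", "ground_stairs"},
    "ground_stairs": {"front_hall", "first_floor_stairs"},
    "first_floor_stairs": {"ground_stairs", "birds_and_mammals"},
    "birds_and_mammals": {"first_floor_stairs"},
}

def accept_room_change(current, candidate):
    """Accept a new room fix only if it is the same room or an adjacent one,
    so a stray strong signal from another floor can't teleport the user."""
    return candidate == current or candidate in ADJACENCY.get(current, set())
```

Because floors only connect through stairwell and lift nodes in a graph like this, a signal bleeding through the ceiling fails the adjacency test and is simply ignored.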

What’s brilliant about these solutions is how they each have significant emergent benefits for the overall user experience. Our users are not staring at the phone constantly (and tripping over bags and other visitors), and they’re using their brains, senses and communication to guide themselves.

All of these user experience developments and considerations will be covered in a separate post on this blog.


While iBeacons may not provide the perfect navigation system, they really aren’t a bad approach for both indoor and outdoor navigation – particularly if you go in with your eyes open to the potential issues. We achieved a slick and functional product, learning loads as we went… and hopefully this post has highlighted the issues to watch out for in your own iBeacon implementations. Thanks for reading!

This blog post acts as ‘milestone 3’ evidence Doc 3.1, 3.2 3.3, 3.4 & 3.5 (video test gif)

3.9.2 Accessibility review for Hidden Museum

Considering the needs of our users is at the heart of all our services, and R&D projects are no different.

Littered throughout our digital service work you’ll see references to the Government Digital Service’s ‘Service Manual’, which has helpful guidance on considering accessibility in its resources on ‘assisted digital’, which it defines as:

“Assisted digital is support for people who can’t use online government services on their own.”

We need to consider assisted digital support in two steps: understanding who the ‘assisted digital users’ are, and having the ability to provide ‘assisted digital support’. The purpose of offering assisted digital support is to ensure we provide a great experience for all and that ‘take-up’ of the project is as fair as possible.

At this point it’s worth noting that we also provide an alternative service for the public: visitor assistants who are trained to support our visitors, and audio descriptions using the Penfriend technology. We purposely chose galleries and engagement activity that have excellent alternative support in case our project outputs weren’t directly accessible using our approach. We will ensure that the project is delivered within legal and policy constraints, such as Bristol City Council’s Equality Plan and the Equality Act 2010.

Assisted digital action plan

  1. Baseline the % of general visitors who have stated they have a disability, using our annual general visitor exit survey, to better understand the potential ratio of required support
  2. Test our app and assisted digital in-person support with our inclusion officer
  3. Provide assisted digital instructions to support a visitor ahead of their visit via the website
  4. Ensure visitor assistants are aware of the assisted digital support that may be required and provide appropriate training via the digital team to support visitors in person (staffing permitting)
  5. Monitor the volume of assisted digital support activity including wait times
  6. Record and monitor feedback by users and experts with the aim of getting ‘fairly or highly satisfied feedback’ in accordance with our standard survey
  7. Test, measure and iterate our app procedures for supporting assisted digital users during our Beta phase
  8. Ensure our support offer is sustainable and consider using volunteers for additional support
  9. Provide guidance that will support any user to complete the tasks of the project on their own
  10. Document steps 1-9 throughout the period of the grant as the information will be valuable to others seeking to provide similar support

Creating characters for our Hidden Museum app

We’ve been beavering away creating characters for our app. Creating a style for this exercised our grey matter somewhat!

We needed to create a look which complemented the very clean visual style that our lead designer for the project, Sarah Matthews, created for the app. However, from our years of experience creating characters for apps and games, we know that younger children react much better to characters with an element of realism to them.

In user testing we trialed 3 potential styles:

A photographic style, using photographs of artifacts from around the museum:

Photographic character

A silhouetted style, which fits very well with the UI design, but is much less realistic in style:

Silhouette character

A geometric style, which is stylised like our UI design, but has much more realism:

Geometric character style

The overwhelming vote was for the geometric style, which thankfully our production team all liked too – so we decided to go down this route.

Next we had to decide what our characters should be – we had some ideas, and Gail was great at helping us hone these down. We wanted them to really represent the diversity of the exhibits at the museum, and also be appealing to all age groups and both males and females. So we settled on the ubiquitous dinosaur; George Peppard, the boxkite pilot who flies the plane in the main hall; a Chinese dragon head, as represented in the Chinese dragons over the stairs in the main hall; a Roman goddess to represent the museum’s wealth of Roman artifacts; a female Egyptian mummy with gorgeous coloured paint; and… Alfred the Gorilla (obviously!)

Here are the results!

All characters

Still a bit of work to do on the mummy to make her look a little more female, but we’re more or less there.

Testing the stories/tour – Kids in Museums User Testing Day

At its core, our Hidden Museum app takes users on a tour around Bristol Museum, guiding them to places in the museum they might not ordinarily go, and revealing hidden information as they progress through the game.

To simulate this experience our assistant creative director, Rich Thorne, behaved as the app, using one of the story-tours we had created. He did this by leading the group around the museum to a range of artifacts on a theme (in this case ‘horses’) and engaging them in conversation about each artifact as they went – explaining how they were linked and telling them an interesting story about each artifact once it was found.

Kids on tour questions

The aim of this was to:

  • See how long it took them to get round the journey as a group.
  • Get feedback from the children on any standout points of interest on the tour.

Key observation points for the supervisors of the tour were:

  • Is the tour engaging, interesting?
  • Is the tour too long/short?
  • Which object was the favourite on the journey?
  • Have they visited the museum before?
  • Have they ever been to the top of the museum?
  • Is it more fun having a checklist, as opposed to walking around the museum looking at everything?
  • Do they have ideas for trail themes which they would like to go on?

Kids in museums kids look at cabinet

What we found:

We discovered that the tours we had devised gave away too much in advance about what the group was going to see. For example, the horse trail said in advance that the group was going to find objects featuring horses. This took some of the excitement out of the tour. We needed to find a way to broaden the themes out so they became more subjective and felt less curated.

We also learnt that this kind of curation meant we were not making the most of the ‘hidden’ metaphor of our app. Whilst we were leading the group to areas they might not otherwise have gone, it did not allow enough free exploration of rooms and free thought around the objects themselves – getting lost in the museum and ‘accidentally’ discovering something hidden should be a desirable side effect of the app, so we needed to find a way to allow for this.

The groups, particularly the kids, were very interested in collecting and counting. They were also particularly keen on emotive subjects – picking out items which were ‘weird’ or ‘strange’ or ‘scary’ or ‘cute’.

Kids in Museums – User Testing Day – Overview

On Friday 21st November, the Hidden Museum team were lucky enough to have the opportunity to be part of a Kids in Museums takeover day. Kids in Museums is an independent charity dedicated to making museums open and welcoming to all families, in particular those who haven’t visited before.

Kids in Museums Logo

This was a great opportunity for us to test our app in production with real users – over 30 kids and 6 adults who had never seen the app before! A real coup for us as developers: a chance to get some real insight into how our app might be received on completion.

However, we were conscious that we did not want to take advantage of the day and its main aim of making museums more open to kids and families. So we took great care to plan the day with the education coordinators at the museum, Naif and Karen, to ensure that we were providing a fun experience for the kids as well as testing elements of our app. As there were adults supervising the kids, we also tested all elements with the supervising adults to get an impression of a family group’s opinion of each element (rather than kids’ opinions only) – very important, as our app is aimed at a mixed-age group.

Kids on tour

Naif and Karen suggested that we warm them up with a ‘fingertips explorers’ activity – where the kids felt an artifact while blindfolded and had to describe it to their friends, with their friends guessing what it could be. (A fun game which the kids really enjoyed, and which we have used as an influence for one of our games as a result!)

We decided that after the fingertips explorers warm up the kids would be ready for app testing!

Kids in museums kids looking at zebra

As the app was not yet complete, we decided to test elements of the app broken down, rather than the experience as a whole. We decided to break our testing down into 4 elements:

Testing the stories/tour

Testing the iBeacons/compass interface

Testing the UI

Testing the games

This decision was reached mainly through necessity since the app was not complete. However, we found that it really worked for us, and allowed us to get some really in depth insight into our app. This was for two reasons – firstly it allowed us to break the group of children down into smaller more manageable groups so we were able to have real conversations with each of the children in turn – and secondly it allowed us to assess which elements of the app they struggled with the most and so exactly where we should be making our improvements.

There were lots of great testing outcomes to the day around each of the app elements outlined above – I’ll update the blog with how we tested each of the app elements (and the associated learnings) over the coming days.