After much planning, preparation and excitement, the week of 25-29th June 2018 was the build of our shop refit at Bristol Museum & Art Gallery. It is the first time in our history that we’ve commissioned a specialist cultural heritage shopfitting firm, ARJ CRE8. It is the end of the week and many people have worked very long hours to smash out the old shop fittings and build us a shop that we can be proud of…and most importantly increase profit.
The shop is complete and ready for customers on Saturday. We have a small snagging list and still need to visually merchandise properly, but this is scheduled for early next week. For now we just need to ensure 100% of products are available and nothing is missing or left in storage.
Today is a proud moment
Thank you to everybody who encouraged us throughout the week and/or lent a hand. A special thanks also to Bristol Museums Development Trust who agreed to significantly contribute to the cost of the project. I can’t thank Andy, Jon and the team from ARJ CRE8 enough for their professionalism, problem solving ability and relentless cheerfulness!
Now let’s go out and prove you don’t need a stockroom…..hehe
29th June 2018
07:20-10:00 GO! GO! GO! Moved as many products as possible from storage to the shop and our holding space. Big thank you to the staff who volunteered some time to help move stuff around
07:45 – 17:00 Finished up adding doors to bays, shelving, lighting adjustments and painting
11:00 accessories arrive from courier to enable visual merchandising of the shop
12:00-16:30 a few of our international volunteers came to the rescue and helped us prepare shelving and get products out on the shelves.
15:00-17:00 move the pop-up shop fittings back into the shop and setup the tills and digital signage
15:30 sold to our first customer despite being technically closed! A visitor really wanted our Millerd’s Map so I showed him our new bay and we made the sale!
17:00-18:30 vacuum, clean and move out any non-critical products and accessories
18:31 Shop is ready to open Saturday morning
28th June 2018
07:30-10:00 move stock from deep storage
10:00-13:00 move bay units into position
13:00-18:30 wire and light each bay, reconnect air-handling which appears to have been out of action for years, finish cutting ceiling tiles
17:00-18:30 move products to outside shop ready for restocking Friday morning
27th June 2018
Build bay bases and measure out precise bay locations
Ordered accessories for displaying products
Wire networking to shop
Empty final waste to skip
26th June 2018
07:30 Ceiling fitter arrives onsite to fit ceiling tiles on existing tracks. Quickly discovers that all the track is obsolete and the entire track needs replacing
08:00 Zak tears shirt moving pallet full of ceiling tiles
08:30-10:00 set up pop up shop in front hall. Shop takes £496.35 gross during day
08:30-11:00 Replace obsolete circuit board
08:00-21:00 Continue work to perimeter walls. Edge of ceiling complete and 50% of ceiling track fitted
25th June 2018
06:30 Skip arrives…in the wrong location…2hr wait for it to be moved
08:00 Contractors arrive and unload tools
08:30 Contractor begins to gut existing shop walls and ceiling
10:00 Retail team begin to review products for pop-up shop which will run 26-29th June
09:00 Sparks begins to review wiring and remove the old…discovers the circuit board is ancient so we get in Carters to assess and agree to replace on the 26th
10:00 Waste for skip removed to front of building and loaded into waiting skip
14:15 Building Practice team called to assess wall
15:00 Large lorry of 38 shop bays arrives and is unloaded
16:00 Stonemasons make the wall safe by carefully taking the wall pillar apart without further damage to each stone, which is then stored
17:00 Second large van arrives to deliver central bay units and small fittings which is unloaded
17:45 Remaining waste loaded into van
17:45 to 18:15 Clean up of route
19:30 Evening private hire event starts
Team of 6 empty all shop products and move to holding location
Old fittings e.g. shelving removed to storage or for recycling
The shop just hours before the refit to rip out the stockroom, install new bays and maximise the space
This is an interview with Tom Marshman about an alternative audio tour available at M Shed
Q: Can you describe the new resource you have created?
A: Working together with Rowan Evans (sound artist) we have created an alternative audio tour of the M-shed.
The tour connects up some of the stories I have collected for my performance work within the exhibition about Bristol, sharing stories I heard when interviewing older LGBT people in Bristol about the stories that lie at the roots of their LGBT identity.
The stories are funny and touching, and I’ve presented them very lyrically so the tour almost becomes a long poem that moves you around the first and ground floors of the M-shed.
If you would like to do the tour, the audio devices are kept behind the information desk on the ground floor; all you need to do is ask a member of staff for one. The audio devices are encased in vintage matchboxes, so you collect your headphones and matchbox and move around the space.
The piece was originally a live performance walk around the old city, around St Nicholas Market, so a lot of the stories are based there, most significantly the Radnor Hotel, which was a known gay venue from the 1930s onwards.
Q: What is it about audio that made you decide to use this medium?
A: Each story is represented by the sound of a match striking; the stories burn brightly and quickly like a match, sharing a story before you move on to the next story. The idea for this came from one particular story where a man met his life partner by being asked for a light.
I really wanted people to feel like they were heading back in time with this work and that there was a retro vibe going on. I didn’t want them walking around the galleries with cutting edge technology I wanted something more tactile and evocative of stories people tell, this is why I chose the matchbox.
Q: How does your product differ from a usual museum audio guide?
A: In my work I am not so concerned with facts and figures, what I want to do is tell a good story and in particular the stories of older LGBT people which could soon be lost.
I think they add a new texture to the exhibits in the M Shed, bringing out the human stories within the objects and focusing on LGBT stories. LGBT stories are often whitewashed in museum versions of history, where we are told the stories of the ‘powerful white upper class men’ instead. This work, I think, helps address this imbalance, and adds a new range of stories so that M Shed represents the diverse and exciting Bristol we live in.
These are stories I think everyone will enjoy hearing, although some of the language is a bit racy so over-16s only!
Q: Do you think the technology presents any barriers to access?
A: As an artist I’m based at the Pervasive Media Studio within Watershed Cinema where many artists and technologists are exploring ways to work with technology in new and exciting ways.
Amusingly, I am a technophobe, so for me to understand it, it has to be very simple. Because of this, what we have created is super easy to use, the only thing you have to do is turn it on, find the right volume, and follow the directions of where to move to within the audio tour. If people have smartphones they can also request a link or scan a QR code, to find the tour online. So technically they don’t need to have the matchbox, but I feel that spoils the fun slightly!
The important thing for me, when I am working with technology, is that it doesn’t get in the way of the stories and that the technology supports it, rather than presenting a barrier. And if anyone finds any teething problems, then I hope they’d mention it to the information desk so we can improve accessibility.
Q: How do you think the museum could learn from this project when developing their own audio resources?
A: The M-shed is not just about Bristol as a place, it’s also about the people of Bristol. And I love that it places importance on a wide-range of people too, not just people that are deemed to be ‘the great and the good’. I think our project reinforces that and tells us about a group of people whom you don’t often hear about.
I hope adding this will bring new LGBT audiences into museums to connect them to our history, as well as introducing non-LGBT museum-goers to it, all in an engaging and fun way.
As an artist I love working in museums because they are rich in stories, and I think it’s important to find new ways to share and celebrate within the museums.
Move Over Darling talks about people’s lives, deaths, loves, friendships and sex lives in a way that many museums don’t. The way our society treated LGBT people up until very recently has become a shocking and shameful secret history, and projects like this one can help museums tackle these difficult issues as well making sure the positive stories of LGBT people are not lost.
There’s a personable quality to the work I make too. All the people I tell my stories about on the audio tour I have met, I know them and we have exchanged our stories in face-to-face conversation. Though you don’t get to hear my stories on the tour, the human exchange during this research has indelibly influenced and shaped how I tell these stories. Sadly a big contributor to the content passed away last year, it is nice that his stories are present in the museum in this way.
Q: How can people access the content?
A: You can collect the matchboxes from the front desk at the M-shed anytime it is open; you can also find the tour online here and listen as you walk around the museum.
This is an ongoing part of the exhibition so hopefully my voice will be in the museum forever or at least until it doesn’t feel relevant anymore. Perhaps in a few years I will add more stories, we’ll see!
Between 25-29th June 2018 we’ll be closing our shop to gut the space and build a new and improved customer offer. I thought I’d take the time to explain the details of the project just ahead of the actual build.
The shop was last refitted in the early 1990s and in the past 18-24 months it has been a daily struggle to grow the business within those dated constraints, which are primarily:
Space isn’t used effectively either behind the scenes (stockroom) or in the public area of the shop, and cannot be optimised further
the fittings are very dated and the super wood effect weakens our brand
a partial 2016 refit saw improvements to sales by introducing LED lighting, dedicated nesting tables and a bookshelf area which increased sales by over 100% for those categories
although the ceiling lighting has dramatically improved the general vibe, the majority of products are still not lit well which doesn’t show products in the best way
the bays are all slatwall, which constrains our options for displaying products, limits the visual merchandising and has poor space/density
We went out to tender and successfully secured the design-and-build expertise of ARJ CRE8. Originally we hoped to complete the project earlier this year but we missed the narrow window. As the exhibition exits through the shop we can only do the work between exhibitions, so a June date was set.
We had a reasonable budget, a contractor and a GO date. As with all my collaborative projects we use Basecamp to communicate with all the project team and to keep other interested parties in the loop. I love tools like this as they cut down on meetings and keep a full history of questions and decisions that we can refer back to. It means when we do meet face to face it is super productive. Between February and April we worked together on the design, staff feedback and drawings. In total we’ve had five evolutions of the original design. Each iteration is an incremental improvement to the previous direction and catching missed constraints.
I was keen to completely remove the traditional “till” area as I believe this isn’t a productive use of space and the future of retail will be till free. However we’re not quite into the future so my colleagues successfully convinced me that being an early adopter isn’t always best. We will test a till free approach in the near future!
Now that the design is in the final build phase we know that the refit will:
remove the stockroom to give us 20% more shopfloor space and 31 total bays with under unit storage
allow us to provide a better customer experience with a shop designed and built for a heritage customer
use the removal of the stockroom to properly implement an effective buying and stockholding procedure – hold less stock to keep as much cash free as possible and not own risky products
increase serving capacity from one cashier to up to two at the same time, which has long been an issue
improve category management by having clearly defined zones
allow us to introduce improved security measures [redacted]
introduce a shop that is aligned to our brand with new colour ways and point of sale
improve flow from the exhibition area and give a better connected interaction of the exhibition and its related products
increase high-price-point products with lockable units
allow us to study what we can maximise in this space to inform Project Alfred, our project which seeks to redevelop the building eg should we move the shop in that project or leave it by the exhibition space
We have been busy with lots of small but important detail such as moving key infrastructure, planning how to run a pop up shop in the front hall during the work and how to work with the exhibition team who will be in derig mode.
We expect a significant increase in sales and the hard work begins once the build is complete. We’ll have transformed the space which is, in effect, our foundation, and we can now set about building a very successful retail offer from these strong beginnings.
At Bristol Culture we aim to collect, preserve and create access to our collections for use by present and future generations. We are increasingly dealing with digital assets amongst these collections – from photographs of our objects, to scans of the historical and unique maps and plans of Bristol, to born-digital creations such as 3D scans of our Pliosaurus fossil. We are also collecting new digital creations in the form of video artwork.
One day we won’t be able to open these books because they are too fragile – digital will be the only way we can access this unique record of Bristol’s history, so digital helps us preserve the physical and provides access. Inside are original plans of Bristol’s most historic and well-known buildings, including the Bristol Hippodrome, which require careful unfolding and digital stitching to reproduce the image of the full drawing inside.
With new technology comes new opportunities to explore our specimens and this often means having to work with new file types and new applications to view them.
This 3D scan of our Pliosaurus jaw allows us to gain new insights into the behavior and biology of this long-extinct marine reptile.
So digital assets are helping us conserve our archives, explore our collections and experience new forms of art, but how do we look after those assets for future generations?
It might seem like we don’t need to worry about that now, but as time goes by there is constant technological change; hardware becomes unusable or non-existent, software changes and the very 1s and 0s that make up our digital assets can be prone to deterioration by a process known as bit rot! Additionally, just as is the case for physical artefacts, the information we know about them, including provenance and rights, can become dissociated. What’s more, the digital assets can and must multiply, move and adapt to new situations, new storage facilities and new methods of presentation. Digital preservation is the combination of procedures, technology and policy that we can use to help us prevent these risks from rendering our digital repository obsolete. We are currently in the process of upskilling staff and reviewing how we do things so that we can be sure our digital assets are safe and accessible.
It is clear we need to develop and improve our strategy for dealing with these potential problems, and that this strategy should underpin all digital activity where the result of that activity produces output we wish to preserve and keep. To address this, staff at Bristol Archives, alongside Team Digital and Collections, got together to write a digital preservation policy and roadmap to ensure that preserved digital content can be located, rendered (opened) and trusted well into the future.
Our approach to digital preservation is informed by guidance from national organisations and professional bodies including The National Archives, the Archives & Records Association, the Museums Association, the Collections Trust, the Digital Preservation Coalition, the Government Digital Service and the British Library. We will aim to conform to the Open Archival Information System (OAIS) reference model for digital preservation (ISO 14721:2012). We will also measure progress against the National Digital Stewardship Alliance (NSDA) levels of digital preservation.
A safe digital repository
We use EMu for our digital asset management and collections management systems. Any multimedia uploaded to EMu is automatically given a checksum, and this is stored in the database record for that asset. What this means is that if for any reason that file should change or deteriorate (which is unlikely, but the whole point of digital preservation is to have a mechanism to detect if this should happen) the new checksum won’t match the old one and so we can identify a changed file.
Due to the size of the repository, which is currently approaching 10TB, it would not be practical to do this manually, so we use a scheduled script to pass through each record and generate a new checksum to compare with the original. The trick here is to make sure that the whole repo gets scanned in time for the next backup period, because otherwise any missing or degraded files would become the backup and therefore obscure the original. We also need a working relationship with our IT providers and an agreed procedure to rescue any lost files if this happens.
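EMu generates and stores its checksums internally, but the fixity idea behind the scheduled scan can be sketched in a few lines. This is illustrative only – it assumes SHA-256 and a simple path-to-checksum mapping standing in for the database records:

```python
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """Hash a file in chunks so large assets don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(assets: dict, root: Path) -> list:
    """Return relative paths whose current hash no longer matches the stored one."""
    return [
        rel for rel, stored in assets.items()
        if checksum(root / rel) != stored
    ]
```

Any path returned by `verify` is a candidate for restoring from backup before the corrupted copy propagates into the next backup cycle.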
With all this in place, we know that what goes in can come back out in the same state – so far so good. But what we can’t control is the constant change in technology for rendering files – how do we know that the files we are archiving now will be readable in the future? The answer is that we don’t, unless we can migrate from out-of-date file types to new ones. A quick analysis of all records tagged as ‘video’ shows the following diversity of file types:
(See the stats for images and audio here). The majority are mpeg or avi, but there is a tail end of various files which may be less common and we’ll need to consider if these should remain in this format or if we need to arrange for them to be converted to a new video format.
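That kind of analysis is easy to reproduce for any set of records. A minimal sketch (the filenames below are invented examples, not our actual records):

```python
from collections import Counter

def tally_extensions(filenames):
    """Count file extensions (lower-cased) to expose the long tail of formats."""
    return Counter(
        name.rsplit(".", 1)[-1].lower() if "." in name else "(none)"
        for name in filenames
    )

# Invented examples standing in for multimedia records tagged as 'video'
tally = tally_extensions(["clip1.MPG", "clip2.mpg", "doc_film.avi", "rawcapture"])
```

Sorting the tally by count surfaces the rare formats at the tail, which are the candidates for migration.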
Our plan is to make gradual improvements in our documentation and systems in line with the NDSA to achieve level 2 by 2022:
The following dashboard gives an idea of where we are currently in terms of file types and the rate of growth:
Herding digital sheep
It’s all very well having digital preservation systems in place, but the staff culture and working practices must also change and integrate with them.
In theory, all digital assets should line up and enter the digital repository in an orderly and systematic manner. However, we all know that in practice things aren’t so straightforward.
Staff involved in digitisation and quality control need the freedom to work with files in the applications and hardware they are used to, without being hindered by rules and convoluted ingestion processes. They should be allowed to work in a messy (to outsiders) environment, at least until the assets are finalised. There are also many other environmental factors that affect working practices, including rights issues, time pressures from exhibition development, and the skills and tools available to get the job done. By layering on new limitations based on digital preservation we are at risk of designing a system that won’t be adopted, as illustrated in the following tweet by @steube:
So we’ll need to think carefully about how we implement any new procedures that may increase the workload of staff. Ideally, we’ll be able to reduce the time staff take in moving files around by using designated folders for multimedia ingestion – these would be visible to the digital repository and act as “dropbox” areas which automatically get scanned, with any files automatically uploaded and then deleted. For this process to work, we’ll need to name files carefully so that once uploaded they can be digitally associated with the corresponding catalogue records that are created as part of any inventory project. Having a 24-hour ingestion routine would solve many of the complaints we hear from staff about waiting for files to upload to the system.
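The drop-folder scan might look something like the sketch below. Everything in it is hypothetical – the filename convention, the accession-number pattern and the `upload` callable all stand in for whatever the real ingestion route into the repository would be:

```python
import re
from pathlib import Path

# Hypothetical convention: files are named like "BRSMG-2018-0042_view1.tif",
# where the prefix before the underscore is the catalogue accession number.
ACCESSION = re.compile(r"^(?P<accession>[A-Z]+-\d{4}-\d{4})_")

def scan_drop_folder(folder: Path, upload):
    """Upload well-named files and delete them; skip anything we can't match."""
    skipped = []
    for path in sorted(folder.iterdir()):
        match = ACCESSION.match(path.name)
        if match is None:
            skipped.append(path.name)   # leave in place for a human to rename
            continue
        upload(match.group("accession"), path)
        path.unlink()                   # remove only after a successful upload
    return skipped
```

Reporting the skipped files back to staff matters as much as the uploads themselves – silently stranded files would recreate exactly the frustration the drop folder is meant to remove.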
Providing user-friendly, online services is a principle we strive for at Bristol Culture – and access to our digital repository for researchers, commercial companies and the public is something we need to address.
We want to be able to recreate the experience of browsing an old photo album using gallery technology. This interactive uses the Turn JS open source software to simulate page turning on a touchscreen, featured in Empire Through the Lens at Bristol Museum.
Visitors to the search room at Bristol Archives have access to the online catalogue as well as knowledgeable staff to help them access the digital material. This system relies on having structured data in the catalogue and scripts which can extract the data and multimedia and package them up for the page-turning application.
But we receive enquiries and requests from people all over the world, in some cases from different time zones which makes communication difficult. We are planning to improve the online catalogue to allow better access to the digital repository, and to link this up to systems for requesting digital replicas. There are so many potential uses and users of the material that we’ll need to undertake user research into how we should best make it available and in what form.
There are various versions of a common saying that ‘if you don’t measure it you can’t manage it’. See Zak Mensah’s (Head of Transformation at Bristol Culture) tweet below. As we’ll explain below we’re doing a good job of collecting a significant amount of Key Performance Indicator data; however, there remain areas of our service that don’t have KPIs and are not being ‘inspected’ (which usually means they’re not being celebrated). This blog is about our recent sprint to improve how we do KPI data collection and reporting.
The most public face of Bristol Culture is the five museums we run (including Bristol Museum & Art Gallery and M Shed), but the service is much more than its museums. Our teams include, among others: the arts and events team (who are responsible for the annual Harbour Festival as well as the Cultural Investment Programme, which funds over 100 local arts and cultural organisations in Bristol); Bristol Archives; the Modern Records Office; Bristol Film Office; and the Bristol Regional Environmental Recording Centre, who are responsible for wildlife and geological data for the region.
Like most organisations we have KPIs and other performance data that we need to collect every year in order to meet funding requirements e.g. the ACE NPO Annual Return. We also collect lots of performance data which goes beyond this, but we don’t necessarily have a joined up picture of how each team is performing and how we are performing as a whole service.
The first thing to say is that they’re not a cynical tool to catch out teams for poor performance. The operative word in KPI is ‘indicator’; the data should be a litmus test of overall performance. The second thing is that KPIs should not be viewed in a vacuum. They make sense only in a given context; typically comparing KPIs month by month, quarter by quarter, etc. to track growth or to look for patterns over time such as busy periods.
A great resource we’ve been using for a few years is the Service Manual produced by the Government Digital Service (GDS) https://www.gov.uk/service-manual. They provide really focused advice on performance data. Under the heading ‘what to measure’, the service manual specifies four mandatory metrics to understand how a service is performing:
cost per transaction – how much it costs … each time someone completes the task your service provides
user satisfaction – what percentage of users are satisfied with their experience of using your service
completion rate – what percentage of transactions users successfully complete
digital take-up – what percentage of users choose … digital services to complete their task
Added to this, the service manual advises that:
You must collect data for the 4 mandatory key performance indicators (KPIs), but you’ll also need your own KPIs to fully understand whether your service is working for users and communicate its performance to your organisation.
Up until this week we were collecting the data for the mandatory KPIs, but they have been somewhat buried in very large Excel spreadsheets or in different locations. For example, our satisfaction data lives on a SurveyMonkey dashboard. Of course, spreadsheets have their place, but to get more of our colleagues in the service taking an interest in our KPI data we need to present it in a way they can understand more intuitively. Again, not wanting to reinvent the wheel, we turned to the GDS to see what they were doing. The service dashboard they publish online has two headline KPI figures followed by a list of the departments, which you can click into to see KPIs at a department level.
Achieving a new KPI dashboard
As a general rule, we prefer to use open source and openly available tools to do our work, and this means not being locked into any single product. This also allows us to be more modular in our approach to data, giving us the ability to switch tools or upgrade various elements without affecting the whole system. When it comes to analysing data across platforms, the challenge is how to get the data from the point of data capture to the analysis and presentation tech – and when to automate vs doing manual data manipulations. Having spent the last year shifting away from using Excel as a data store and moving our main KPIs to an online database, we now have a system which can integrate with Google Sheets in various ways to extract and aggregate the raw data into meaningful metrics. Here’s a quick summary of the various integrations involved:
Data capture from staff using online forms: Staff across the service are required to log performance data, at their desks, and on the move via tablets over wifi. Our online performance data system provides customised data entry forms for specific figures such as exhibition visits. These forms also capture metadata around the figures such as who logged the figure and any comments about it – this is useful when we come to test and inspect any anomalies. We’ve also overcome the risk of saving raw data in spreadsheets, and the bottleneck often caused when two people need to log data at the same time on the same spreadsheet.
Data capture directly from visitors: A while back we moved to online, self-completed visitor surveys using SurveyMonkey and these prompt visitors to rate their satisfaction. We wanted the daily % of satisfied feedback entries to make its way to our dashboard, and to be aggregated (both combined with data across sites and then condensed into a single representative figure). This proved subtly challenging and had the whole team scratching their heads at various points, thinking about whether an average of averages actually meant something, and furthermore how this could be filtered by a date range, if at all.
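The head-scratching is easier to see with numbers. A quick sketch (figures invented for illustration) of why a simple average of per-site satisfaction averages differs from pooling the raw responses:

```python
def satisfaction(per_site):
    """Overall % satisfied, pooling (satisfied, total) response counts per site."""
    satisfied = sum(s for s, _ in per_site)
    total = sum(t for _, t in per_site)
    return 100 * satisfied / total

# Invented figures: a quiet site with 10 responses, a busy site with 990
sites = [(5, 10), (940, 990)]
naive = 100 * (5 / 10 + 940 / 990) / 2   # average of the two site averages
pooled = satisfaction(sites)             # weights every response equally
```

The naive figure lands around 72% while pooling gives 94.5% – the quiet site dominates the naive average far beyond its share of responses, which is why deciding what the aggregated figure should mean matters before wiring it into a dashboard.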
Google Analytics: Quietly ticking away in the background of all our websites.
Google sheets as a place to join and validate data: It is a piece of cake to suck up data from Google Sheets into Data Studio, provided it’s in the right format. We needed to use a few tricks to bring data into Google Sheets, however, including Zapier, Google Apps Script, and sheets Add-ons.
Zapier: gives us the power to integrate visitor satisfaction from SurveyMonkey into Google Sheets.
Google apps script: We use this to query the API on our data platform and then perform some extra calculations such as working out conversion rates of exhibition visits vs museum visits. We also really like the record macro feature which we can use to automate any calculations after bringing in the data. Technically it is possible to push or pull data into Google Sheets – we opted for a pull because this gives us control via Google Sheets rather than waiting for a scheduled push from the data server.
Google Sheets formulae: We can join museum visits and exhibition visits in one sheet by using the SUMIFS function, and then use this to work out a daily conversion rate. This can then be aggregated in Data Studio to get an overall conversion rate, filtered by date.
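The join that the SUMIFS formula performs can be sketched outside of Sheets too – here in Python, with invented figures, just to show the shape of the calculation:

```python
def conversion_rates(museum_visits, exhibition_visits):
    """Daily exhibition visits as a fraction of museum visits, keyed by date."""
    return {
        day: exhibition_visits.get(day, 0) / total
        for day, total in museum_visits.items()
        if total > 0
    }

# Invented daily figures: two museum days, one with exhibition visits logged
museum = {"2018-06-25": 1200, "2018-06-26": 800}
exhibition = {"2018-06-25": 300}
rates = conversion_rates(museum, exhibition)
```

Keeping the rate daily means Data Studio can still aggregate it over any chosen date range rather than being locked to a single pre-computed figure.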
Sheets Add-Ons: We found a nifty add-on for integrating sheets with Google Analytics. Whilst it’s fairly simple to connect Analytics to Data Studio, we wanted to combine the stats across our various websites, and so we needed a preliminary data ‘munging’ stage first.
Joining the dots…
1.) Zapier pushes the satisfaction score from SurveyMonkey to Sheets.
2.) A Google Sheets Add On pulls in Google Analytics data into Sheets, combining figures across many websites in one place.
3.) Online data forms save data directly to a web database (MongoDB).
4.) The performance platform displays raw and aggregated data to staff using ChartJS.
5.) Google Apps Script pulls in performance data to Google Sheets.
6.) Google Data Studio brings in data from Google Sheets, and provides both aggregation and calculated fields.
7.) The dashboard can be embedded back into other websites including our performance platform via an iframe.
8.) Good old Excel and some VBA programming can harness data from the performance platform.
We’ve been testing out Google Data Studio over the last few months to get a feel for how it might work for us. It’s definitely the cleanest way to visualise our KPIs, even if what’s going on behind the scenes isn’t quite as simple as it looks on the outside.
There are a number of integrations for Data Studio, including lots of third party ones, but so far we’ve found Google’s own Sheets and Analytics integrations cover us for everything we need. Within Data Studio you’re somewhat limited to what you can do in terms of manipulating or ‘munging’ the data (there’s been a lot of munging talk this week), and we’re finding the balance between how much we want Sheets to do and how much we want Data Studio to do.
At the beginning of the sprint we set about looking at Bristol Culture’s structure and listing five KPIs each for 1.) the service as a whole; 2.) the 3 ‘departments’ (Collections, Engagement and Transformation) and 3.) each team underneath them. We then listed what the data for each of the KPIs for each team would be. Our five KPIs are:
Cost per transaction
Each team won’t necessarily have all five KPIs but actually the data we already collect covers most of these for all teams.
Using this structure we can then create a Data Studio report for each team, department and the service as a whole. So far we’ve cracked the service-wide dashboard and have made a start on department and team-level dashboards, which *should* mean we can roll out in a more seamless way. Although those could be famous last words, couldn’t they?
Any questions, let us know.
Darren Roberts (User Researcher), Mark Pajak (Head of Digital) & Fay Curtis (User Researcher)
By Tanja Aminata Bah, MA Curator-in-training at M Shed / Social History Team
Discover Black History in St. Paul’s via a story map and walks
Always wanted to find out more about your local area? Ever wondered where the Bamboo Club was or where the St. Paul’s riots started? St. Paul’s is full of exciting stories waiting to be discovered with this new handy introduction to Black History in the area.
Over the course of the last year, I have been placed with Bristol Culture’s Social History Team at M Shed and Blaise Castle House Museum as part of my MA Curating at UWE Bristol. My interest in Black History, engagement and innovation through digital media in museum spaces led me to create, as my final project, a story map reimagining, preserving and documenting key Black Bristolian stories. The map offers not just stories, which I gathered via a call-out for information, but also showcases some unique, not-yet-published archival imagery of St. Paul’s and people in the area.
The map is fully integrated with Google Maps for Android and iPhones and can be used here in your browser.
How to use the map?
The map has different layers, which can be navigated by clicking (this icon). The map works best on mobile devices such as Android phones and iPhones. Simply open this blog post in your browser and click the enlarge icon in the right corner. This will take you to the Google Maps integration, where you can scroll through the tours and layers of the map on the go.
Walking tours online
I have designed three unique walking tours, giving you insights while you explore the area. If you enable your GPS signal on your phone the tours will even lead you from stop to stop.
Only have an hour to spare? Essential St. Paul’s is your brief 101 to St. Paul’s African Caribbean history since the 1950s. The hour-long stroll follows a leisurely flat course around the heart of St. Paul’s, Grosvenor Road and City Road and offers plenty to see in a short time. If you haven’t got internet on the go, you can also download and print out a leaflet here.
If you want to explore for a bit longer you can try out the walk Before The Riots. The walk is flat and will lead you from the Bamboo Club near Portland Square to the Empire Sports Club near St. Agnes, exploring St. Paul’s between 1950 and 1980.
Want it all? The Full Walk will lead you from the Bamboo Club to Ashley Parade on a two-hour uphill course. You will learn all about the African Caribbean community in St. Paul’s and Montpelier before heading to St. Werburghs to learn about two Victorian and Edwardian Black Bristolian families.
St. Paul’s Vibes
While you are out and about exploring, you can listen to a selection of my favourite tracks that remind me of St. Paul’s, including Bristolian artists such as Massive Attack alongside classics of Calypso and Roots Reggae, which enjoyed a popular following in St. Paul’s.
Finding out more
Curious and want to find out more about some of the stories? Here is a handy reading list on Black History in and around St. Paul’s.
Dresser, M. and Fleming, P. (2007) Bristol. Ethnic Minorities and the City 1000 – 2001 (England’s past for Everyone, Bristol). Stroud: Phillimore & Co Ltd.
Stephenson, P. and Morrison, L. (2011) Memoirs of a Black Englishman. Bristol: Tangent Books.
Dresser, M. (2013) Black and White on the Buses: The 1963 Colour Bar Dispute in Bristol. London: Bookmarks Publications.
The project would not have been possible without my mentor Catherine Littlejohns, curator of Social History, as well as the kind support of Bristol Museum, M Shed, Bristol Archives and UWE staff alongside local stakeholders. Thank you!
Tanja Aminata Bah (Twitter: @jakumata, email@example.com) is an MA Curating student at UWE Bristol, placed as curator-in-training with M Shed and the Social History Team. In her studies, she is interested in the crossroads between history, representation and digital developments in the heritage field. She holds a B.A. in History and African Studies from the University of Cologne. Her studies are supported by the Rosa Luxemburg Foundation.
As a place to translate our in-house exhibitions for an online audience, we worked with Mike and Luke at Thirty8 Digital to create a narrative structure with scroll-through content and click-through chapters on WordPress. They built in lovely features such as object grids, timelines, slideshows, maps and quotes.
(For the What is Bristol Music? exhibition opening in May 2018, we’re using the WordPress plugin Gravity Forms to collate people’s experiences and pictures of the Bristol music scene to be featured in the physical exhibition. Chip in if you have a story to tell.)
So far, we’ve found the content and arrangement really depends on the exhibition. The idea isn’t to simply put the physical exhibition online (I say ‘simply’, as if it would be) but instead to use the format and content of the exhibition to engage with people in a different environment: albeit one where we’re competing with a thousand other things for people’s attention. Exhibitions which have been and gone have been slightly more challenging, as the content was never intended for this use and has needed some wrangling. The more we use it, though, the smoother the process is getting, now that we know what we need and it’s on teams’ plans as something to consider.
We’re still in the early stages of reviewing analytics to see how people are using it. Initial results are heartening, though, with a few thousand visits despite minimal promotion. At the moment most people are finding it from our what’s on pages (where most of our traffic to the main website goes anyway) and we’re thinking about what campaigns we can run to get it out there more.
Hi! I am Tanja, a current MA Curating student at UWE, placed as a curator-in-training with the Social History Team in Bristol Culture since January 2017. I am interested in engagement work, Black History and innovation through digital media in museums. Aside from assisting the Social History Team, I have become involved mainly with digital developments: as part of my course I wrote a project proposal to redevelop the “Big Question Displays” in M Shed to address Brexit on a limited budget, and I wrote a new online collection highlight on “Green Bristol”. For my final project I aimed to contribute to the service by piloting something new and innovative, yet budget-friendly, that crosscuts my interests.
I decided to develop a customised Google map to document Black History in St. Paul’s, capturing some key stories of prominent Black Bristolians who were and are active in the area. Initially planned as a walking tour, one motivation for me was to preserve these stories in an ever-changing St. Paul’s and reimagine them for an online audience, who might want to access the map remotely and as a “gateway” to first insights into Black History. While focussing on the African Caribbean community from the 1950s, I wanted to design something speaking to digital natives and older generations alike. One of my inspirations was “Black Histories London”, a research project capturing the Black presence in London from 1958 to 1981 by Rob Waters, who works at the University of Sussex and the Sussex Humanities Lab.
From June to September I contacted local stakeholders, reached out via our Bristol Museums blog and researched intensively in the archives, while tracing old material from other service-affiliated projects such as the Black Bristolian Learning Resource and the Bristol Black Archives Partnership, to combine the information into this new digital offering.
Over the last months I have developed a prototype, which I want to share with you as an early beta test to gather feedback. The prototype will also be trialled with some members of the Bristol Culture Youth Panel in a feedback workshop on Wednesday 8th November 2017. Although this prototype is fully functional, its size and scope are not yet final. Texts for the stations are still early drafts, some pictures will change, and some stations will not end up in the final version.
At the moment I am looking at the following questions:
How is the layout and design working?
Should I use multiple layers sorting stories after themes, instead of one full layer?
Should I do a second map for possible walking routes or work in one map with layers?
How are the texts and the stations? Are they fully understandable? Do they contain unnecessary information?
The map is already fully integrated with, and tested on, the Google Maps app for Android and iPhones. To access it on the go, open this blog post (and later the final blog post) in your browser and then click the “fullscreen”/enlarge icon. This should automatically open the map in the Google Maps app.
The end product will be offered to the public via a blog post in late November and will hopefully be supplemented by a “Discover and Walk Your Own” guide/booklet as a PDF download. I am also currently looking at ways to extend the legacy of the project by adding a QR code label in M Shed linking to the map.
It would be great to hear your feedback and ideas for improvement, as well as general thoughts on this project. I have created a Google survey here to fill with your impressions and ideas. The form is completely anonymous and does not require any personal data!
After the Youth Panel Workshop I will try to start systematically evaluating the different tools and map types I discovered and how this pilot is proceeding.
Losing survey data is a pain – unfortunately the events team lost six events’ worth of survey data collected using offline surveys. The team used iPads (c.£320 per iPad) to conduct surveys on software sourced outside our team (I’m not sure what system it was). They used the software on the basis that it claimed to offer offline surveys, i.e. without an internet connection/wi-fi. The idea was that the data could then be uploaded once the iPad was connected to the internet. When they came to do so, however, the data was simply not there and they had lost it all.
The events team came to the digital team this year to ask if we could help them with the public surveys for the 2017 Harbour Festival. The festival is held across much of Bristol City Centre, so in order to conduct surveys digitally using iPads we would need to do so without relying on a wifi connection. Of course, one option would be to conduct the surveys with good old pen and paper, but as a digital-first service we were happy to accept the challenge.
One of the main reasons we want to avoid paper surveys is that digitising the results is time-consuming and difficult: it requires someone to sit at a computer and manually input them. Staff resources are often limited and this is a job we’d rather not give ourselves. Practically, paper can also be unruly, handwriting legibility is an issue, and forms are easy to lose when relying on volunteers to collect them, so a digital solution is very desirable.
The challenge came down to finding the right software that I could install on the iPad and test, and that didn’t cost too much. Our usual platform for conducting surveys on iPads where we do have an internet connection is SurveyMonkey (we pay for the gold subscription £230 per year). Unfortunately, off-line surveys are not a feature available on SurveyMonkey.
These are a few apps I tried that weren’t right for one reason or another:
Qualtrics – poor trialling options and expensive for full features £65 for one month or £435 for one year
iSurveys (harvestyourdata.com) – the free account is limited, their main website is difficult to use and I couldn’t work out how much the full-featured product cost
SurveyPocket by QuestionPro – the trial is difficult to use and full-feature pricing is only available by contacting the company
The one I almost went for: QuickTap Survey & Form Builder – good pricing options from $16 per month and the trial is OK
So, after trawling the internet and the App Store for options, the one we went for is an app called Feed2go (www.feed2go.com).
Quick note: before I speak about the virtues of Feed2go, I have to make it clear that it is currently only available on the Apple App Store; it is not available on Android devices in the Play Store (the QuickTap Surveys app is available on Android).
I downloaded the Feed2go app onto my iPad and it was ready to go with pretty much all features available – certainly enough to get a feel of whether it was right. Most crucially, on the basic/trial version you can conduct offline surveys and test whether the data is secure and can be successfully uploaded – I did, and it worked. A major advantage of the Feed2go app is that access to all the app’s features (Pro) is a very reasonable subscription: £2.49 for 1 month; £4.99 for 3 months; or £12.49 for 1 year. At these costs there is virtually no risk in trying the Pro subscription.
If anyone is interested in trying the App, I would suggest going ahead and downloading and having an explore. There are just a couple of things I will highlight:
The user interface is nice and clean and easy to use
The options for question structures are OK and cover most bases, but they are more limited than something like SurveyMonkey
Some of the navigation in the App can be a bit clunky especially when designing survey forms, but once you get used to it then it’s fine
Probably the most significant feature of Feed2go to mention is using the same survey across multiple devices. This is not a particularly strong suit of Feed2go, but it does work. Basically, you need to download Feed2go on each device and then share the survey between them using a cloud storage service – the best one in my experience is Dropbox. The app has an export/import function to share survey forms between devices. This also means you will need to collate the results from the different devices at the end.
As noted above, the feed2go app needs to be downloaded on each iPad. In our case all our service iPads are registered to one email address. This means we can use the one subscription across all of our devices. This is not the case if iPads are registered to different email addresses – a subscription will need to be paid for each.
Overall, yes, the experience of using the app could be improved a little. But the main feature we wanted it for – saving the results and successfully uploading them – worked 100%. I think what distinguishes Feed2go from the previously (unsuccessfully) used software is that the latter operated through a web browser and relied on a cache of temporary internet files, whereas Feed2go is an app which stores the data securely in a folder, in the same way the camera stores photos on the iPad. Finally, the FAQ on the Feed2go website and the email support for the app are great; the developer is really responsive.
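That browser-cache-versus-local-storage distinction is the whole ball game for offline surveys. This isn’t Feed2go’s actual code, but the persist-first, upload-later pattern it relies on can be sketched like this (the store and upload functions are stand-ins):

```javascript
// Offline-first survey collection: every response is written to durable
// storage immediately; uploading is a separate step that only removes a
// response once the server has confirmed receipt.
class SurveyQueue {
  constructor(store, upload) {
    this.store = store;   // durable key/value store (stand-in for app storage)
    this.upload = upload; // async function resolving true on confirmed receipt
  }
  record(response) {
    const pending = JSON.parse(this.store.get("pending") || "[]");
    pending.push(response);
    // Saved to durable storage BEFORE any upload is attempted.
    this.store.set("pending", JSON.stringify(pending));
  }
  async flush() {
    const pending = JSON.parse(this.store.get("pending") || "[]");
    const failed = [];
    for (const r of pending) {
      if (!(await this.upload(r).catch(() => false))) failed.push(r);
    }
    // Keep anything the server has not confirmed; nothing is silently dropped.
    this.store.set("pending", JSON.stringify(failed));
    return pending.length - failed.length;
  }
}
```

The older tool effectively skipped the durable “pending” step, so anything that cleared the browser’s temporary files meant losing everything collected offline.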
We have now used the app to conduct surveys in the estate around our Blaise Castle House Museum site, and we are planning to replace paper exit surveys at our houses (where we don’t have wifi) with the offline app.
If you have any comments or questions about doing offline surveys, or surveys in the cultural sector generally, please get in touch – I’m happy to have a chat. firstname.lastname@example.org
Having a visual representation of upcoming exhibitions, works, and major events is important in the exhibition planning process. Rather than relying on spotting dates that clash using lists of data, having a horizontal timeline spread out visually allows for faster cross-checking and helps collaboratively decide on how to plan for exhibition installs and derigs.
Until recently we used Excel to plan out this timeline: by merging cells and colouring horizontally it was possible to manually construct a timeline. Apart from the pure joy that comes from printing anything from Excel, this method had a number of limitations:
When dates changed the whole thing needed to be rejigged
Everyone who received a printed copy at meetings stuck that to the wall and so date changes were hard to communicate.
We needed to see the timeline over different scales – short term and long term – which meant two separate Excel tabs, hence duplication of effort.
We were unable to apply any permissions
The data was not interoperable with other systems
TIMELINE SOFTWARE (vis.js)
Thanks to Almende B.V. there is an open source timeline code library available at visjs.org/docs/timeline, which offers a neat solution to the manual task of recasting the timeline with creative Excel skills each time. We already have a database of exhibition dates following our digital signage project, so this was the perfect opportunity to reuse that data, which should be the most up-to-date version of planned events as it is what we display to the public in our venues.
The digital timeline was implemented using MEAN stack technology and combines data feeds from a variety of sources. In addition to bringing in data for agreed exhibitions, we wanted a flexible way to add installations, derigs, and other notes and so a new database on the node server combines these dates with exhibitions data. We can assign permissions to different user groups using some open source authentication libraries and this means we can now release the timeline for staff not involved in exhibitions, but also let various teams add and edit their own specific timeline data.
The great thing about vis is the ease of manipulating the timeline: users can zoom in and out, and move backwards and forwards in time, using mouse, arrow keys or touch/pinch gestures.
The management of information surrounding object conservation, loans and movements is fundamental to successful exhibition development and installation. As such we maintain a record of exhibition dates in EMu, our collections management software. The EMu events module is used to record when exhibitions take place, and also holds the object list where curators select and deselect objects for exhibition. Using the EMu API we are able to extract a structured list of exhibitions information for publishing to the digital timeline.
HOW OUR TIMELINE WORKS
Each gallery or public space has its own horizontal track where exhibitions are published as blocks. These are grouped into our 5 museums and archives buildings, which can be selected/deselected from the timeline for cross-referencing. Once logged in, a user is able to manually add new blocks to the timeline, with pre-set types of “install”, “derig” and “provisional date”. Once a block is added, our exhibitions team can attach notes that are accessible by clicking the block. It is also possible to reorder and adjust dates by clicking and dragging.
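Under the hood, a timeline like this is just two arrays handed to vis.js: groups (one per gallery track) and items (one per block). A simplified sketch of the data shaping — the field names on the exhibition records here are illustrative, not our exact EMu export:

```javascript
// Shape exhibition records into the groups/items arrays vis.js expects.
function toTimelineData(records) {
  // One group per distinct gallery gives each space its own horizontal track.
  const groups = [...new Set(records.map((r) => r.gallery))]
    .map((g) => ({ id: g, content: g }));
  // One item per block; className lets CSS colour by block type.
  const items = records.map((r, i) => ({
    id: i,
    group: r.gallery,
    content: r.title,
    start: r.start,
    end: r.end,
    className: r.type // "exhibition", "install", "derig" or "provisional"
  }));
  return { groups, items };
}

// In the browser this feeds straight into the library, roughly:
//   new vis.Timeline(container, new vis.DataSet(items),
//                    new vis.DataSet(groups), options);
```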
The timeline now means everyone has access to an up-to-date picture of upcoming exhibition installations, so no one is out of date. The timeline is on a public platform and is mobile accessible, so staff can access it on the move, in galleries or at home. Less time is spent on creative Excel manipulation and more on spotting errors. It has also made scheduling meetings more dynamic, allowing better cross-referencing and moving to different positions in time. An unexpected effect is that we are spotting more uses for the solution and are currently investigating using it for booking rooms and resources. There are some really neat things we can do, such as importing a data feed from the timeline back into our MS Outlook calendars (“oooooh!”). The addition of the thumbnail pictures used to advertise exhibitions has been a favourite feature among staff and really helps give an instant impression of current events, since it reinforces the exhibition branding which people are already familiar with.
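The Outlook trick works because Outlook can subscribe to any iCalendar feed, so the timeline only needs to serve its blocks as VEVENTs. A rough sketch of that serialisation — titles and dates are made up, and a production feed would also need to escape special characters per the iCalendar spec:

```javascript
// Serialise timeline blocks as a minimal iCalendar feed that Outlook
// (or any calendar client) can subscribe to.
function toICal(blocks) {
  const lines = ["BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//timeline//EN"];
  for (const b of blocks) {
    lines.push(
      "BEGIN:VEVENT",
      `UID:${b.id}@timeline`,
      // All-day events: dates as YYYYMMDD with no time component.
      `DTSTART;VALUE=DATE:${b.start.replace(/-/g, "")}`,
      `DTEND;VALUE=DATE:${b.end.replace(/-/g, "")}`,
      `SUMMARY:${b.title}`,
      "END:VEVENT"
    );
  }
  lines.push("END:VCALENDAR");
  return lines.join("\r\n"); // iCalendar requires CRLF line endings
}
```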
It is far from perfect! Several iterations were needed to develop the drag-and-drop feature for adding events. We are also reaching diminishing returns in terms of performance: with more and more data available to plot, the web app is performing slowly and could do with further optimisation to improve speed. And due to our IT infrastructure many staff use Internet Explorer; whilst the timeline works OK there, many features are broken on this browser without changes to compatibility and caching settings in IE.
Hopefully optimisation will improve performance and then it is full steam ahead with developing our resource booking system using the same framework.