Category Archives: Digital skills

How to get rid of VGA after 30 years!

Here at the M Shed in Bristol, we have amazing views of the harbour from our lovely events suite, where we hold all sorts of events, from large annual AGMs for corporations to weddings and some really great community events.

 

We have a fully automated, integrated audio visual system. With AMX and Crestron control systems, you can walk around the function rooms holding a smart touch-screen control panel and control just about everything: you can power up the projectors, lower the screens, open and shut the blinds, control volumes, select what to display from Sky TV, Blu-ray players and laptops, and even change the lighting to any colour scheme you want.

 

It’s all pretty smart. Pretty smart, that is, apart from the main interface: the dreaded Video Graphics Array, more commonly referred to as the VGA connector. For all this advanced technology, presenters still have to connect their devices with a cable.
The VGA standard was introduced by IBM in 1987, and its dreaded 15-pin D-sub connector still refuses to go away to this day.
Until now…

 
There’s something amiss when a presenter asks to use their nice, brand new iPad to run their presentation and you then have to use a Lightning-to-VGA adapter connected to a 10-metre VGA cable. VGA connectors were designed for permanent installation, so when they are swapped between laptops and other devices several times a day, the 15 tiny pins take a battering, and it only takes one bent pin for the screen to go pink, turn blue or stop working altogether.

Here comes the ingenious solution: take advantage of the wireless capabilities that are now standard on just about every device.

The solution comes in the form of readily available, off-the-shelf technology, combined in such a way that a device’s screen can be transmitted to our projection system without any wires. We needed this to be added to our current system without affecting its existing capabilities. It is already a great integrated AV system; it just needs to be brought into the future without losing the ability to use the old VGA inputs. They may be old, but they work so well as a backup and last resort.

Apple products long ago ditched VGA in favour of Mini DisplayPort and Lightning connectors. A quick trip to any Apple store and an assistant will enthusiastically show you how, with a flick of the device, a display can be “thrown” to another screen. It’s called AirPlay and is Apple’s secure version of Wi-Fi streaming.

Google, with their ever-innovative developments, have developed a technology called Chromecast to the same effect, which is also based on Wi-Fi streaming.

With delegates at our events bringing Apple products, PCs and Android devices, we needed an all-in-one system, so we purchased the products to enable this streaming. I ordered an Apple TV and a Chromecast, both of which work by connecting to a Wi-Fi network and looking for compatible devices; between them they cover everything. The Chromecast is much cheaper than the Apple TV and can support Apple products too, but the ease of use and reliability of Apple on Apple seemed worth the extra investment. I calculated the cost of replacement VGA cables and, at the current rate we replace them, these new items would pay for themselves in just three years!

The main issue I faced in integrating these was how to patch them into a fully automated, closed AV system without affecting its capabilities. In essence, how to “retrofit” an Apple TV and Chromecast and get them to talk over M Shed’s Wi-Fi – a public network, effectively part of the council’s IT network and heavily locked down.
To solve the first issue, I had to literally climb into the AV racking system to find a suitable part that interfaced with an HDMI connector (both the Chromecast and Apple TV use HDMI). I chose our Sky TV box and unplugged its HDMI cable. Onto this cable I placed an HDMI switcher, which allows four inputs to share one output. The switcher is the sort of device you would buy if your TV at home only had one HDMI port and you had multiple devices you wanted to connect: a DVD player, a games console and a Freeview box. I then connected the Sky box to the switcher along with the Apple TV and the Chromecast. Then, after finding power outlets, whilst still inside the AV rack, I carefully slid the switcher unit so its control switch faced out of the front of the rack. A few cable ties and some Velcro later, the hardware was installed; all that was left to do was climb out and check it all worked.

Going back to the Crestron AV touch panel, I selected Sky TV and, sure enough, it appeared on the projection screen as it should. Then, using the controls on the switcher unit, I was able to toggle between Sky, the Apple TV and the Chromecast.
It then occurred to me that both the Apple and Chrome devices also output their audio over HDMI. However, the HDMI feeds the projector, which only handles the image, so the audio would be lost. Climbing back into the AV rack, I noticed that the Sky box was using analogue RCA connectors to send its audio to the integrated ceiling speaker system. Fortunately the switcher also had a 3.5mm TRS output (headphone socket), so by setting the Sky box to output audio over its HDMI, all three devices were now feeding both audio and video to the switcher. Then, by reusing the Sky box’s RCA connection with a TRS adapter, all three devices were feeding the ceiling speaker system. I climbed back out of the rack and started to create a new, independent Wi-Fi network for the devices to communicate over.


The new Wi-Fi network was actually the simpler part.
I purchased an ASUS RT-AC3200 Tri-Band Gigabit Wi-Fi router. This router is enormous, with six aerials, and looks like the Batmobile. I figured that it would have to be reliable and able to cope with large amounts of data traffic, so I got the most powerful yet cost-effective router I could find.

The idea behind the router was to have all the devices (the Apple TV, the Chromecast and whichever device is streaming) on the same network – a network I could manage. Once on the same network, it was just a matter of connecting. The Apple system was really straightforward: you join the same Wi-Fi network as the Apple TV (I named the network “presentations”), choose the AirPlay option on the device and, as easy as that, the screen is mirrored on the projector. The Chromecast setup was a little more involved. With an Android device, you have to install the Chromecast app; once installed, it’s quite straightforward to pair with the Chromecast receiver and the screen can then be mirrored on the projector. With a Windows laptop, I had to install the latest version of Chrome, which comes with the option to cast either just the browser tab you’re using or the whole desktop. This works well, but compared to the Apple TV there is a slight lag. In some instances you may also have to install the Chromecast extension for Chrome.

I also connected the Wi-Fi router to our open Wi-Fi system with an RJ45 cable. This allows people on the presentations Wi-Fi to still access the internet.
We are still trialling the system before we officially offer it as part of a package, but so far so good. It has been received very positively by users. We’ve had people walking around with iPads, controlling their presentations and not being tied to the lectern with an old PC. We’ve even had the best man at a wedding wirelessly control the music playlist from his iPhone at the top table! PCs are still being used at the lectern as normal, but without the need to trail VGA cable everywhere. The only thing left to work out is wireless power… I suppose batteries will have to do for now.

How to make two 120FT cranes talk to each other

Here at M Shed Bristol, we have some great working exhibits from the bygone era of Bristol Harbour’s industrial past: steam engines, steam boats, steam cranes and more. But the most recognisable and iconic are the four great towering electric cranes standing over 120 feet above the old docks.

As the Industrial Museum was being transformed into the present-day M Shed, two of the cranes would strike up conversations with each other, entertaining and informing passers-by about what they could look forward to seeing inside the new museum. However, due to renovations and movement of the cranes, they fell silent again…

A few years later, due to popular demand, I was tasked with bringing the cranes back to life!

To get these cranes talking was going to require rebuilding the whole audio and lighting system and recording new scripts. We were fortunate enough to have Alex Rankin, from our M Shed team, lend his penning abilities for the new scripts and Jacqui and Heather to voice the new crane characters.

To record the dialogue, we arranged to meet in a nice quiet corner of the L Shed store room. It’s a vast store, full of so many objects that there isn’t enough space to have them all on permanent display. With Jacqui and Heather sat at opposite ends of a table, I set up a pair of good-quality condenser microphones, each plugged into its own channel on my external sound card, an Akai EIE four-channel USB interface with great preamps and phantom power for the mics. This in turn was hooked up to my MacBook and a copy of Logic Pro. I recorded through each script a few times and was able to compile a seamless recording from the various takes. Once finished, I hard-panned the channels left and right so that on playback each voice would have its own speaker, left or right – crane 1 or crane 2.

To start building the new AV system, I searched around the vast L Shed stores and work rooms to find what was left of the old system, then decided what could be reused and what new equipment would be needed. I had been informed by our volunteer team for the working exhibits that everything had been removed from the cranes themselves; this meant starting from scratch.

The cranes themselves would need a loudspeaker system for the voices, and the crane cabs would need different coloured lights to flash in time with the talking, as this helps to animate the cranes. That part was relatively easy: it meant scaling the cranes, bolting speakers to their undersides and mounting lamps inside the cabs. I’ll be honest, I was helped by the volunteer team and a huge mobile, diesel-powered cherry picker!

 

The hard part was how to feed the power and audio cables to the cranes. After some investigation, it turned out that below the surface of the dockside is a network of underground pipes which lead to the base of each crane to feed their power. The great volunteer team once again worked miracles and fed over 600 metres of combined audio and lighting cable for me, all leading back to the clean room in their ground-floor workshop. With all the cabling done, I just needed to build a lighting control and audio playback system.

My design solution, using what kit I could find plus a few new bits, was to use a solid-state CompactFlash media player, a graphic equaliser, an audio mixing desk and a power amplifier for the audio. To have the lights flash in time with the dialogue, I used a two-channel light controller with a sound-to-light module, similar to what a DJ might use to have their disco lights flash to the music!

By having the audio go through the mixing desk, I was able to take an audio feed for each channel and direct them to the lighting controllers. Recording the two voices in stereo, with each voice on its own left or right channel, meant I only needed one media player and could easily control each channel on the sound desk. The graphic equaliser allowed me to tweak the speakers to acoustically fit their environment.

I looked at randomising the audio or having it triggered by people walking past, but with the number of people who pass outside M Shed, the cranes would be chatting away non-stop all day! Instead, I decided to create a long audio file of about three hours, with the different recorded scripts separated by random intervals of silence ranging from 5 to 20 minutes, so it always comes as a surprise when they start talking to each other.

The results are really effective. It is always fun to see people being caught by surprise as the cranes light up and start a conversation and to see them stop and listen in on what they have to say.


How we did it: automating the retail order forms using Shopify.

*explicit content warning* this post makes reference to APIs.

THE PROBLEM: Having set ourselves the challenge of improving the buying process, our task in Team Digital was to figure out where we could do things more efficiently and smartly. Thanks to our implementation of Shopify, we have no shortage of data on sales to help with this; however, the process of gathering the information needed to place an order for more stock is time-consuming – retail staff have to manually copy and paste machine-like product codes, look up supplier details and compile fresh order forms each time, all the while taking attention away from what really matters, i.e. which products are currently selling, and which are not.

In a nutshell, the problem can be addressed by creating a specific view of our shop data – one that combines the cost of goods with the inventory quantity (the amount of stock left), factors in a specific period of time and can be combined with supplier information, so we know who to order each top-selling product from without having to look anything up. We were keen to get into the world of Shopify development, and thanks to the handy Shopify developer programme documentation & API help it was fairly painless to get a prototype up and running.

SETTING UP: We first had to understand the difference between public and private apps with Shopify. A private app lets you hard-code it to speak to a specific shop, whereas public apps need to be able to authenticate on the fly to any shop. With this in mind, we felt a private app was the way to go, at least until we know it works!

Following this, and armed with the various passwords and keys needed to programmatically interact with our store, the next step was to find a way to develop a query to give us the data we need, and then to automate the process and present it in a meaningful way. By default Shopify provides its data as JSON, which is nice, if you are a computer.
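To give a flavour, here is a heavily trimmed, illustrative sketch of the sort of product record the Shopify API returns (the values are made up; the field names are the ones we rely on later):

  {
    "products": [{
      "id": 1234567,
      "title": "Example product",
      "vendor": "Example Supplier Ltd",
      "product_type": "Books",
      "variants": [{
        "id": 7654321,
        "price": "9.99",
        "sku": "BK-001",
        "barcode": "9781234567897",
        "inventory_quantity": 12
      }]
    }]
  }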

TECHNICAL DETAILS: We set up a cron job on an AWS virtual machine running Node and MongoDB, using the MEAN stack framework and some open source libraries to integrate with Google Sheets and, notably, to handle asynchronous processes in a tidy way. If you’d like to explore the code, that’s all here. In addition to the scheduled tasks, we also built an AngularJS web client which allows staff to run reports manually and to change some settings.

Which translates as: In order to process the data automatically, we needed a database and computer setup that would allow us to talk to Shopify and Google Docs, and to run at a set time each day without human intervention.

The way that Shopify works means we couldn’t develop a single query to do the job in one go as you might in SQL (a traditional database language). There are also limits on how many times you can query the store. What emerged from our testing was a series of steps, and an algorithm which did multiple data extractions and recombinations, which I’ll attempt to describe here. P.S. do shout if there is an easier way to do this ;).

STEP 1: Get a list of all products in the store. We’ll need these to know which supplier each product comes from, and the product types might help in further analysis.

STEP 2: Combine results of step one with the cost of goods. This information lives in a separate app and needs to be imported from a csv file. We’ll need this when we come to build our supplier order form.

STEP 3: Get a list of all orders within a certain period. This bit is the crucial factor in understanding what is currently selling. Whilst we do this, we’ll add in the data from the steps above so we can generate a table with all the information we need to make an order.

STEP 4: Count how many sales of each product type have taken place. This converts our list of individual transactions into a list of products with a count of sales. This uses the MongoDB aggregation pipeline and is what turns our raw data into something more meaningful. It looks a bit like this (just so you know):
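A minimal sketch of that kind of aggregation (the collection and field names here are illustrative rather than our exact schema):

  // group the raw order line items by product variant and count how many were sold
  var startDate = new Date(Date.now() - 60 * 24 * 60 * 60 * 1000);  // e.g. the last 60 days
  db.lineItems.aggregate([
    { $match: { created_at: { $gte: startDate } } },                 // only orders in the chosen period
    { $group: { _id: "$variant_id", amount_sold: { $sum: "$quantity" } } },
    { $sort: { amount_sold: -1 } }                                   // best sellers first
  ]);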

STEP 5: Add the data to a Google Sheet. What luck: there is some open source code which we can use to hook our Shopify data up to Google. There are a few steps needed for the Google Sheet to talk to our data – we basically have our server act as a Google user and share editing access with him, or her? And since we are beginning to personify this system, we are calling it ‘Stockify’, the latest member of Team Digital – however, Zak prefers the lofty moniker Dave.
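As a rough sketch of that hand-off – assuming the google-spreadsheet npm package (v3-style API) and a Google service account, with the sheet ID and credentials file as placeholders:

  const { GoogleSpreadsheet } = require('google-spreadsheet');
  const creds = require('./service-account.json');      // placeholder credentials file

  async function appendReportRow(row) {
    const doc = new GoogleSpreadsheet('SHEET_ID_HERE'); // placeholder sheet id
    await doc.useServiceAccountAuth(creds);             // the server acting as a "Google user"
    await doc.loadInfo();
    const sheet = doc.sheetsByIndex[0];
    await sheet.addRow(row);                            // e.g. { name: 'Example product', amount_sold: 3 }
  }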

The result is a table of top-selling products in the last x number of days, with x being a variable we can control. The whole process takes quite a few minutes, especially if x > 60, due to limitations with each integration – you can only add a new line to a Google Sheet once per second, and there are over 500 lines. The great thing about our app is that he/she doesn’t mind working at night, early in the morning, at weekends or at other times when retail managers probably shouldn’t be looking at sales stats, but probably are. With Stockify/Dave scheduled for 7am each morning, we know that when staff look at the data to do the ordering, it will be an up-to-date assessment of the last 60 days’ worth of sales.

We now have the following columns in our Google Sheet. Some have come directly from their corresponding Shopify table, whereas others have been calculated on the fly to give us a unique view of our data from which we can gain new insights.

  • product_type: (from the product table)
  • variant_id: (one product can have many variants)
  • price: (from the product table)
  • cost_of_goods: (imported from a csv)
  • order_cost: (cost_of_goods * amount sold)
  • sales_value: (price * amount sold)
  • name: (from the product table)
  • amount sold: (transaction table compared to product table / time)
  • inventory_quantity: (from the product table)
  • order_status: (if inventory_quantity < amount sold /time)
  • barcode: (from the product table)
  • sku: (from the product table)
  • vendor: (from the product table)
  • date_report_run: (so we know if the scheduled task failed)

TEST, ITERATE, REFINE: For the first few iterations it failed some basic sense-checking – not enough data was coming through. This turned out to be because we were running queries faster than the Shopify API would supply the data, so transactions were missing. We fixed this with some loopy code, and now we are in the process of tweaking the period of time we wish to analyse – too short and we miss some important items; for example, if a popular book hasn’t sold in the last x days, it might not be picked up in the sales report. We also need to factor in things like half term, Christmas and other festivals such as Chinese New Year, which Stockify/Dave can’t predict. Yet.
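For the curious, the “loopy code” boils down to paging through the orders endpoint slowly enough to stay under the rate limit – something along these lines (a sketch, not our exact implementation; the shop URL and credentials are placeholders, and the page/limit parameters follow the Shopify REST API as it was at the time):

  const fetch = require('node-fetch');   // any HTTP client would do

  const SHOP = 'https://example.myshopify.com';
  const AUTH = 'Basic ' + Buffer.from('API_KEY:PASSWORD').toString('base64'); // private app credentials (placeholders)
  const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

  async function getAllOrders(createdAtMin) {
    let page = 1;
    let all = [];
    while (true) {
      const res = await fetch(
        `${SHOP}/admin/orders.json?limit=250&page=${page}&created_at_min=${createdAtMin}`,
        { headers: { Authorization: AUTH } }
      );
      const { orders } = await res.json();
      if (!orders || orders.length === 0) break;   // no more pages
      all = all.concat(orders);
      page += 1;
      await sleep(600);   // wait between calls so we never outrun the API
    }
    return all;
  }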

AUTOMATIC ORDER FORMS: To help staff compile the order form, we used our latest Google-Sheet-fu: a combination of pick lists, named ranges and the QUERY function to look up all products tagged with a status of “Re-order”.

A list of suppliers appears on the order form template:

and then this formula looks up the products for the chosen supplier and populates the order table:

=QUERY(indirect("last_60_days"&"!"&"11:685"), "select G where M='"&$B2&"' and J='re-order'")

The trick is for our app to check whether the inventory quantity is less than the amount sold in the last x days – if it is, the product goes on the order form.

NEXT STEPS: Oh, we’re not done yet! With each step into automation we take, another possibility appears on the horizon… here are some questions we’ll be asking our system in the coming weeks:

  • How many products have not sold in the last x days?
  • If the product type is books, can we order more if the inventory quantity goes below a certain threshold?
  • Even if a particular product has not sold in the last 60 days, can we flag that product type anyway so it gets added to our automatic order form?
  • While we are at it, do we need to look up supplier email addresses each time – can’t we just have them appear by magic?

…furthermore, we need to integrate this data with our CRM… looks like we will be busy for a while longer.


Update from the Bristol University development team:

Since October we have been working with Computer Science students from the University of Bristol to redesign the interface for our digital asset management system.

After we initially outlined what we wanted from the new design, there have been frequent meetings, and the students have now reached a stage where they are happy to share their project so far.
Quick, appealing and easy to use, this potential new interface looks very promising!

The week our office turned into a photography studio!

With one of the main aims this year being to improve our online shop, Darren and I decided to improve and update some stock photos. We enrolled in a crash course with resident photographer David Emeney and by the end of the session thought we’d be able to do it, easy.

However, we came to find that photography is not as easy as it seems! First came the issue of space. Although David kindly allowed us to use his studio in the basement, with no computer nearby to check pictures and in fear of messing with any of his equipment, we thought it might be best to set up a studio a little closer to home.

In true Blue Peter style, Darren and I set about creating our own in-office photography studio by collecting bits and pieces from around the museum to mirror the one in the basement. Cardboard tubes were stapled together to act as a rod holding the white background in place, which was held up by string wrapped multiple times around our window bars; counter tops were cleaned so as not to make the paper dirty, and even a staff noticeboard was used behind the paper to block out any natural light. Of course, our office had to be rearranged first to fit such a project inside – a move which had me non-stop sneezing for a few days as the settled dust was disturbed!

After a while spent playing with the camera’s settings trying to find the right ones, we set to work photographing stock. With thanks to Debs for letting us borrow geology’s light, the products came out well and the online shop now looks a lot smarter for it. Having this type of light was key to taking a good image; the close proximity between the product and the light source, and changing the camera’s white balance when needed, added extra quality.

It was a really good experience getting to know the manual settings of a camera and how each product requires a slight adjustment, also to be up to date with what products we currently have in store. I look forward to doing more stock photo shoots in the future and hope, at some point, to have all products photographed like this to keep a consistent look for the online shop.


My Digital Apprenticeship with Bristol Museums

My name is Lacey Trotman and I am currently in the fifth week of my Digital Apprenticeship with Bristol Museums. Having left college this June after completing a two-year A-level course in History, Art History, Sociology and Film Studies, I spent the summer searching for the right role for me. Despite college pushing for students to attend university – and many of my friends doing so – I felt the pressures of study and exams to degree level were not for me at this time. I chose instead to look at apprenticeships, as they gave me a chance to put my skills into practical use in a real-world setting.

Since starting on October 4th I have already begun to work on various projects broadening my range of skills and understanding: tackling the Discovery Pens, writing ‘How to’ guides, resizing images, composing surveys, working on the online shop, diving into the fast paced world of social media and editing blogs for the Museum website.

My first impression is that it’s an amazing place to work, with many opportunities to take on new things and progress. It’s also clear to see that there is a lot of work going into such an institution, with many more departments behind the scenes than I could possibly have imagined.

I have always loved visiting museums and galleries. As a proud Bristolian, I feel Bristol Museums provide some of the best in the country. Growing up, family holidays were full of excursions to castles and places of historical interest. Most recently, we visited St Michael’s Mount in Cornwall. Our seaside cottage faced the historic site, making for picturesque views at all times. With Poldark-loving parents, we also visited the historic mines and ruins of workhouses on the Cornish coast. Cornwall was also home to the legendary artist Barbara Hepworth, one of my key artists to feature in the Art History exam I completed this year, so I was thrilled to see an original piece by her on our day trip to St Ives. Even better, a few weeks after starting this apprenticeship, Winged Figure was newly installed in the gallery, confirming this is definitely the best place to work!

Throughout my childhood I visited all the venues that come under the Bristol Museums canopy. My first trip to The Red Lodge Museum was with primary school. I remember being asked by the staff if I wanted to dress as Queen Elizabeth I for the class picture, but afraid of the spotlight, I volunteered my best friend instead! Blaise Castle was always a childhood favourite of mine, and I can also remember visiting the old Industrial Museum with its variety of transport, planes and trucks. However, I was delighted when the new M Shed opened, offering fun and interactive features for free. I have not yet got over missing the iconic Banksy vs Bristol Museum exhibition, or Dismaland, just 40 minutes away in Weston. With such strong links to Bristol, Banksy is a favourite artist of mine. Recently he paid my old primary school a visit, leaving a large mural on their classroom wall.

The next two years fill me with excitement and expectation. The addition of a marketing qualification will add to my growing CV. I hope to excel in my role, growing in both confidence and ability; I am keen to make the most of this experience and hope that all I have to offer will be seen as a positive addition to the hardworking Digital Team.

Digital Curating Internship – an update

By David Wright (Digital Curating Intern, Bristol Culture)

Both Macauley Bridgman and I are now into week six of our internship as Digital Curating Assistants here at Bristol Culture (Bristol Museums). At this stage we have taken part in a wide array of projects which have provided us with invaluable experience as History and Heritage students (a discipline that combines the study of history with its digital interpretation) at the University of the West of England. We have now been on several different tours of the museum, both front of house and behind the scenes. Most notable was our store tour with Head of Collections Ray Barnett, which gave us an insight into issues facing curators nationwide, such as conservation techniques, museum pests and the different methods of both utilisation and presentation of objects within the entirety of the museum’s collection.


In addition, we were invited to a presentation by the International Training Programme, in which Bristol Museums is a partner alongside the British Museum. Presentations were given by Ntombovuyo Tywakadi, Collections Assistant at Ditsong Museum (South Africa), followed by Wanghuan Shi, Project Co-ordinator at Art Exhibitions China, and Ana Sverko, Research Associate at the Institute of Art History (Croatia). All three visitors discussed their roles within their respective institutions and provided us with a unique insight into curating around the world. We found the presentations both insightful and thought-provoking as we entered a Q&A centred on the restrictions and limitations of historical presentation in different nations.

Alongside these experiences, we have also taken on multiple projects for various departments around the museum as part of our cross-disciplinary approach to digital curating.

Our first project involved working with Natural Sciences Collections Officer Bonnie Griffin to photograph, catalogue and conserve natural history specimens in the store. This was a privileged assignment, and perhaps the one we have found most enjoyable: the first-hand curating experience and close access to both highly experienced staff and noteworthy artefacts were inspiring in relation to our future careers.

David Wright – Digital Curating Intern

Following on from this, we undertook a project assigned by Lisa Graves, Curator for World Cultures, to digitise the outdated card index system for India. The digital outcome of this will hopefully see use in an exhibition next year to celebrate the seventieth anniversary of Indian independence in a UK-India Year of Culture. At times we found this work somewhat tedious and frustrating; however, upon completion we have come to recognise the immense significance of digitising museum records, both for the preservation of information for future generations and for the increased potential such records provide for future utilisation and accessibility.

We have now fully immersed ourselves in our main Bristol Parks project, which aims to explore processes by which the museum’s collections can be recorded and presented through geo-location technology. For the purposes of this project we have limited our exploration to well-known local parks, namely Clifton and Durdham Downs, with the aim of creating a comprehensive catalogue of records geo-referenced to precise sites within the area. With the proliferation of online mapping tools, this is an important time for the museum to analyse how it records object provenance, and having mappable collections makes them suitable for inclusion in a variety of new and exciting platforms – watch this space! As part of this, we have established standardised procedures for object georeferencing which can be replicated for future ventures and areas. Our previous projects for other departments have provided the foundation for us to explore and critically analyse current processes and experiment with new ways to create links between objects within the museum’s collections.


As the saying goes, “time flies when you are having fun”, and this has certainly been true of our experience to date. We are now in our final two weeks here at the museum and our focus is firmly on completing our Bristol Parks project.

Google Drive for Publishing to Digital Signage

Having taken an agile development approach to our digital screen technology, it has been interesting to watch the various elements emerge based on our current needs. Lately there has been a need for quick ways to push posters and images to the screens for private events and one-off occasions.

Due to the complexity of the various modes, and the intricacies of events-based data and automatic scheduling, it has been difficult to incorporate these needs into the system. Our solution was to use Google Drive as a means of overriding the screens with temporary content. This means our staff can manage content for private events using tablets and mobile devices, and watch the updates push through in real time.

The pathway of routes now looks like this:


HOW?

There are two main elements to the override process. Firstly, we are using BackboneJS as the application framework because it provides a routing structure that controls the various signage modes. We added a new route at the beginning of the process to check for content added to Google Drive – if there is no content, the signs follow their normal modes of operation.
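In Backbone terms, the check looks roughly like this (a sketch – the route names and the DRIVE_SCRIPT_URL constant are our own naming, not part of the released code):

  var SignageRouter = Backbone.Router.extend({
    routes: {
      '':         'checkDrive',    // every screen lands here first
      'override': 'showOverride',  // temporary Google Drive content
      'posters':  'showPosters'    // ...and the normal signage modes
    },

    checkDrive: function () {
      var router = this;
      // DRIVE_SCRIPT_URL = the published Apps Script endpoint (see below),
      // which returns a JSON list of files for this venue/screen
      $.getJSON(DRIVE_SCRIPT_URL, function (files) {
        if (files && files.length) {
          router.navigate('override', { trigger: true });
        } else {
          router.navigate('posters', { trigger: true }); // nothing in Drive: business as usual
        }
      });
    }

    // showOverride / showPosters defined elsewhere
  });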

Google Drive Integration

Google provide a nice way to publish web services, hidden away in the script editor inside Google Sheets. We created a script that loops through a Drive directory and publishes a list of its contents as JSON – you can see the result of that script here. By making the directory public, any images we load into the drive are picked up by the script, and the screens check the script regularly for new content. The good thing about this is that we can add content to specially named folders – if a folder name matches either the venue or the specific machine name, all targeted screens will start showing that content.
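The script itself is only a few lines of Apps Script – roughly this shape (the folder ID is a placeholder, and the real version also looks inside sub-folders named after venues and machines):

  // Published as a web app: returns the contents of the shared Drive folder as JSON
  function doGet() {
    var folder = DriveApp.getFolderById('FOLDER_ID_HERE');   // placeholder folder id
    var files = folder.getFiles();
    var output = [];
    while (files.hasNext()) {
      var file = files.next();
      output.push({
        name: file.getName(),
        url: file.getUrl(),
        type: file.getMimeType()
      });
    }
    return ContentService.createTextOutput(JSON.stringify(output))
      .setMimeType(ContentService.MimeType.JSON);
  }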


It seems that this form of web hosting will be deprecated in Google Drive at the end of August 2016, but the links we are using to get the images might still work. If not, we can find a workaround – possibly by listing URLs to content hosted elsewhere in the Google Sheet and looking that up.

The main benefit of this solution is being able to override the normal mode of operation using Google Drive on a mobile device. This even works with video – we added some more overrides so that poster mode doesn’t loop on to the next slide until the video has finished, as video brings in several issues when considering timings for digital signage. One problem with hosting via Google Drive is that files over 25MB don’t work, due to Google’s antivirus-check warning which prevents the files from being released.

We’ll wait and see if this new functionality gets used – and whether it is still reliable after August 2016. In fact, this mode might be usable on its own to manage other screens around the various venues which until now were not updatable. If successful, it will vastly reduce the need to run around with memory sticks before private events – and hopefully let us spend more time generating the wonderful content that the technology is designed to publish for our visitors.

You can download the latest release and try it for yourself here.


Getting an archival tree-view to sort properly online

The digital team at Bristol Culture face new challenges every day, and with diverse collections comes a diverse range of problems when it comes to publishing online. One particularly taxing issue we encountered recently was how to represent, and navigate through, an archives collection appropriately on the web.

Here’s what Jayne Pucknell, an archivist at the Bristol Record Office, has to say:

“To an archivist, individual items such as photographs are important but it is critical that we are able to see them within their context. When we catalogue a collection, we try to group records into series to reflect their provenance, and the original order in which they were created. These series or groups are displayed as a hierarchical ‘tree view’ which shows that arrangement.”

So far so good – we needed to display this tree view online, and it just so happens there is a useful open-source jQuery plugin to help us achieve that, called jsTree.


The problem we found when we implemented this online was that the tree view did not display the archive records in the correct order. The default sort was the order in which the records had been created, and although we were able to apply a sort to the records in our source database (EMu), we were unable to find a satisfactory method that returned a numerical sort based on the records’ archival reference numbers. This is because an archival reference number is made up of a series of sub-numbers reflecting sub-collections.

So this gave us a challenge to fix, and the opportunity to fix it came from the EMu API and the programming layer that sits between the source database and Collections Online. The trick was to write a PHP function that could reorder the archive tree before it was displayed.

Well, we did that and here’s a breakdown of what that function does:

The function takes two arguments: the archival number as a text string, and the level in the archive as an integer.

1.) split the reference number into its sub-numbers
2.) construct a new array from the sub-numbers
3.) perform a special sort on the new array that takes into account each sub-number in turn

In theory that’s it – but looking at the code in hindsight, there are a whole heap of complexities that would take longer to articulate here than simply to paste in the code, so let’s make it open source and leave you to delve if you wish – here’s the code on GitHub.
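The heart of it is the comparison: split each reference on its separator and compare sub-number by sub-number, numerically where possible. Sketched here in JavaScript for brevity (the production function is PHP – see the GitHub link above – and this sketch assumes '/' as the separator):

  function compareReferences(a, b) {
    var pa = a.split('/');
    var pb = b.split('/');
    for (var i = 0; i < Math.max(pa.length, pb.length); i++) {
      if (pa[i] === undefined) return -1;   // shorter reference sorts first (higher up the tree)
      if (pb[i] === undefined) return 1;
      var na = parseInt(pa[i], 10);
      var nb = parseInt(pb[i], 10);
      var diff = (!isNaN(na) && !isNaN(nb)) ? na - nb : pa[i].localeCompare(pb[i]);
      if (diff !== 0) return diff;          // the first differing sub-number decides the order
    }
    return 0;
  }

  // e.g. references.sort(compareReferences);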

Another subtle complexity in this work is described further by Jayne:

“You may search and find an individual photograph and its catalogue entry will explain the specific content of that image, but to understand its wider context it is helpful to be able to consider the collection as a whole. Or you may search and find one photograph of interest but then want to explore other items which came in with that photograph. By displaying the hierarchy, you are more easily able to navigate your way through the whole collection.”

Because of the way our Collections Online record pages are built, a record does not immediately contain links to all of its parents or children. This is problematic when building the archive tree, as ideally we want each node to link to the parent or child it depicts. We therefore needed a way to get the link for each related record whilst constructing the tree. Luckily we maintain the tree structure in EMu via the parent field.

The solution was to query the parent field and get the children of that parent, then loop through each child record and add a node to the tree. This process could be repeated up through the parents until a record with no parent was reached, which would then become the root node. Because the HTML markup was the same for each node, the process could be written as a small set of functions:

1.) has_parent: take a record number and perform a search to see if it has a parent; if it does, return the parent id.

2.) return_children: take a record number, search for its child records and return them as an array.

3.) child_html: take an array of child records and construct the links for each in HTML.

Taking advice from Jonathan Ainsworth from the University of Leeds Special Collections, who went through similar issues when building their online pages, we decided not to perform this recursively due to the chance of entering an infinite loop or incurring too much processing time. Instead I decided to call the functions for a set number of levels in the tree – this works as we did not expect more than seven levels. The thing to point out is that when you land on a particular record, the hierarchical level could be anything, but the programmed function to build the tree remains the same.
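As a rough sketch of that fixed-level walk (again in JavaScript for brevity; has_parent, return_children and child_html stand in for the real EMu-backed functions described above):

  var MAX_LEVELS = 7;   // we did not expect more than seven levels in practice

  // Climb at most MAX_LEVELS parents rather than recursing, so a badly linked
  // record can never send the build into an infinite loop.
  function buildAncestorChain(recordId) {
    var chain = [recordId];
    for (var level = 0; level < MAX_LEVELS; level++) {
      var parentId = has_parent(chain[0]);   // parent id, or nothing once we hit the root
      if (!parentId) break;
      chain.unshift(parentId);               // the root ends up first in the chain
    }
    // each ancestor is then rendered with return_children() and child_html(),
    // nesting every level's list inside its parent's node
    return chain;
  }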

Here’s the result – using some css and the customisable features in jsTree we can indicate which is the selected record by highlighting. We also had to play around with the jsTree settings to enable the selected record to appear, by expanding each of its parent nodes in turn – to be honest it all got a bit loopy!


Here’s the link to this record on our Collections Online.

Hope this is of use to anyone going through similar issues – on the face of it the problem is a simple one, but as we are coming to learn in Team Digital, nothing is ever really just simple.


A week in the Bristol Museums digital team

Hello! My name’s Rachel and I’m a Heritage Lottery Fund Skills for the Future graduate trainee. I am usually based in Worcester as part of the Worcestershire’s Treasures project, with my traineeship focused on audience development and events. As part of the traineeship I’m able to do a week’s secondary placement at another museum or heritage venue, and this week I joined the Bristol Museums digital team to get an insight into what they do, and generally learn some new stuff. I got in touch with Zak and Fay as I knew I wanted to spend my week elsewhere learning more about museums and digital. I had seen both of them speak at conferences – Zak at the Museums Association’s annual conference in Cardiff, and Fay at Culture 24’s Digital Change: Seizing The Opportunity Online in Birmingham – and thought Bristol seemed like the place to be for museums and digital!

I’ve been involved with some really interesting and useful things since the start of the week. On Monday I did some content management on the development site in preparation for user testing later in the week. On Tuesday I sat in on a meeting with fffunction, and then joined the museum’s new digital marketing intern, Olivia, in creating some content for social media. As the Shaun the Sheep trail started this week, we had fun coming up with some awful sheep-related puns – keep an eye out for these on @bristolmuseum! On Wednesday I visited The Georgian House Museum and The Red Lodge Museum and conducted some visitor surveys down at M Shed, and then yesterday I sat in on some user testing sessions with teachers for the new learning pages of the website. They were given a number of scenarios to work through, and it was really fascinating to see how users interact with the site and the different ways people navigate through it.

Some of the other useful things I’ve been introduced to this week are the organisation’s Audience Development Strategic Plan and its social media guidelines, and how data collected from users is collated and reported. I also sat in on a meeting with some of the team involved with the upcoming exhibition death: the human experience, to discuss the digital engagement that will run alongside the physical exhibition and programme. This is just one example of the collaborative nature of the digital offer, and it came across to me that digital is viewed as an integral part of the exhibition, as opposed to just an add-on, which is really positive.

It’s also been great seeing how a different museum works. The museum I work at is quite different, in terms of size, staffing, collection and audience, and so coming to a large local authority museums service with seven physical sites has been a valuable experience in itself.

Overall I have had a brilliant week, I think it’s been a good overview of the team’s work, with lots of variety and things to get involved with. I have felt really welcome and included, and everyone at the museum has been so friendly. Thanks so much to the team for hosting me this week, and especially to Fay for letting me follow her round for most of it. My traineeship comes to an end shortly, so hopefully you’ll see me on a digital team soon!