What do 5 years of Library Reviews tell us about library management?

We have conducted 13 comprehensive library reviews in the past 5 years. Now, for the first time – and I mean that, because I haven’t even looked at the pooled data until now – it’s time to see which elements of library management and service delivery are typically performed well, and which ones seem to be the most challenging and problematic.

Even with a bit of a lull during the COVID years, we are engaged by two to four Councils every year to conduct a comprehensive review of their library service. These are not reviews with a pre-determined intent to cut library services or staffing. The aim is generally to understand what the library is doing well in serving its community, how efficiently it is operating and what it could do better.

Our approach uses the framework defined in the APLA/ALIA Standards and Guidelines for Australian Public Libraries, December 2020 (the Guidelines – https://read.alia.org.au/apla-alia-standards-and-guidelines-australian-public-libraries-may-2021).

The Guidelines have 14 key elements, examining library service management, offering and delivery within the context of the local community setting. When we first did one of these reviews back in 2017 we gave each element a single rating, but that didn’t acknowledge that management of library collections (for example) could be going well in some areas and not so well in others. Now we give a 4-part colour-coded rating to illustrate performance against each of the 14 elements. So a review report might look something like this.

The 12 reviews we have completed since 2018 have been done in four different states, for libraries in metropolitan, regional and rural areas with populations ranging from 20,000 to 500,000 people. That means we have a real mix of different libraries in different settings facing different challenges. So if we pool all of these ratings, what do we learn about contemporary library management?

First, let’s look at the overall review scores (and for ease of presentation I have converted these to scores out of 100). What we’ve found is that most libraries reviewed ended up with an overall score somewhere between 64% and 72%, which, when I went to school, meant a solid C to a low B. Only one of the libraries got a mark above 80% (an A grade), there was one B+, and only one fell into the D range. No one failed.

I don’t think this is a result of us subconsciously normalising the scores. I think we treat each library service independently and judge them fairly against the APLA/ALIA Guidelines. Jacqui and I each separately come up with our own assessment against each element and then sit down and (politely) argue out a final rating. However, I do think this outcome reflects the fact that most public libraries (as with most organisations) do some things really well and don’t perform so highly in other areas, which (when you aggregate scores across 14 elements) leads to averaging that ends up with a bell curve centred around the 70% mark. (Oooh – aggregating data produces a bell curve. Stop the presses!)

Second, let’s look at the performance against different elements of the Guidelines (again adjusted to scores out of 10). And this is where it gets interesting, because there are a few scores in here that I didn’t expect.

  • The two areas where all libraries consistently perform highly are under the Service Offering heading – Information and Reference Services (9.4) and Technology Access (8.8). None of the libraries we have reviewed have had a red mark against Information and Reference Services, and there is only one red against Technology Access.

  • Funding got an average score of 7.4. There was a fair bit of orange where we saw room for improvement, but only two Councils got called out for significantly below-par funding of library services. Both are now working to address this shortfall. It’s also fair to say that if a Council employs us to do a full review of their library service they are probably predisposed to support the library as a community asset.

  • The next four elements I am going to lump together, as they all have about 60% green ratings, 20% orange and 20% red, with average overall scores from 6.7 to 7.1 out of 10. They are: Governance; Management; Programs; and Customer Service.

    • What we have tended to find here is that most library services sit reasonably well within the overall operations of Council and are well-managed at a local level. BUT there is often one niggling thing that’s not quite right, and this comes through as an orange and/or red rating: failure to be recognised in Council strategies and plans, difficulties getting the website updated when it’s managed by the IT Department, social media being controlled by Marketing and Communications. You know the story.

    • Library programming and customer service are much the same – done well by most libraries most of the time. BUT again there is room for improvement: expanded adult and youth programming, increased focus on social inclusion activities, getting out from behind the desk to connect with customers, and eliminating those few fake smiles and ‘Hello’s that are completely transparent to the customer.

  • The next three elements stand out because they have 60-70% orange ratings – adequate rather than good, and almost never poor. They are: Partnerships and Collaboration; Strategic Community Focus; and Individual and Community Outcomes.

    • Partnerships is one of those areas where every library has a few solid connections with some groups or sectors in Council and the community (e.g. kindergartens, schools), but there is significant untapped potential. Invariably this comes down to resourcing and having staff time to commit to identifying, establishing and nurturing productive partnerships. It’s difficult, but it’s one of the most effective ways of strengthening wide community engagement.

    • Which is why it’s no surprise to see partnerships lumped with the two community-focused elements of the framework. Ultimately, if libraries want to maximise their impact on community outcomes they need to make sure that, up front, they have a real, meaningful and deep understanding of and connection to their unique community. Periodic demographic analysis, regular community surveys, feedback forums and evaluations – if you don’t ask, can you really be sure that you know your community? Almost every library review produces one green and three oranges on community focus. The one library that I’ve seen do this really well had usage and customer satisfaction levels that were consistently well above benchmarks.

  • Sitting with average scores of 6.0 out of 10, and an almost identical distribution of colour ratings (45% green, 35% orange, 20% red), are Content and Collections and Places and Spaces. There’s no logical reason why they should appear similar, but they do.

    • We consistently find that libraries manage their collections pretty well (which you’d hope would be the situation), but there’s almost always a red flag too: the collection is too large or too old (stop hoarding), the LMS is out of date, there’s no local history collection, eResources are light on, or high humidity is causing damp. I sometimes wonder if the state of the collection is one of those signs, to the untrained management eye in Council, that maybe it’s time for a library review.

    • Places and spaces go the same way: mostly OK most of the time, but the children’s area is dull and uninviting, one branch hasn’t seen fresh paint in 40 years, the layout is all wrong and no-one has ever realised it, or the website sucks. It can be hard to get the right mix of quiet and active spaces, and single and group spaces, especially in small libraries, but there is no excuse for dull, dark library spaces.

  • I am surprised that Staffing came out so low (average score of 5.6), but again there is a lot of orange there. In most cases we find that staffing levels, structures and portfolios require a little tweaking to get things running more smoothly (maybe extra resources in programming, partnerships and community engagement). And professional development is (sad to say) almost always a little underdone, and tends to be concentrated on upper-level staff.

  • Finally, Service Points (average 5.4). Another one where most reviewed libraries walked away with one red marker against the service network: too many or too few library branches, legacy locations and infrastructure, opening hours that don’t align with the profile of the community (evenings, weekends). The curious thing here is that the ratings were mostly green or red – good or poor. This suggests that this is one area where it’s hard to turn around a problem quickly. You know you’ve got an issue, but fixing it requires some significant investment, and that might take a little while to come through.

So what does all of that tell us?

  1. There are a lot of moving parts to running a good library service. And you need to be across all of them all of the time (which is why having standards and guidelines is a good thing).

  2. Most libraries do most things pretty well most of the time. But by the same token, most libraries have a few things that they don’t do so well. The benefit of reviewing performance on a regular basis is that you know what is working well and what needs to improve.

  3. The areas where most libraries are likely to find their sticking points are in the service network, places and spaces, collections and staffing.

  4. All libraries could benefit from adopting a strategic focus on their local community. And to do this – you must know your community well.
