Halloween Games for Library Programs

A few months ago an old colleague of mine asked me about games for her library to integrate into a Halloween program. After some back and forth I recommended five games. These games were chosen to fit her (slight) budget, not require too steep a learning curve, not have too complicated a setup, accommodate families, be resilient to repeated use (and survive a piece or two getting lost along the way) and yet satisfy a desire for contemporary games. The list that resulted from so many requirements is more a work of compromises than a platonic ideal. But I think it works.

So, first up: games that she could get for roughly $100 and re-use for later programs. Without a firm cost limit I went with three commercial and two non-commercial options. The three boxed games have a combined MSRP of about $125 plus a few additional costs, but with online discounts or by partnering with a local retailer you can get much closer to $100, or just drop one of the boxed games off the list.

It’s far too late for most libraries to plan programs for this Halloween, but I figured it would be easier to clean up the list I made for her now than to remember to write it up in six months. When that time comes I do intend to update this list for next year’s planning. This list isn’t going to be a revelation to those who already follow tabletop games, but hopefully it can be a jumping off point for library game programs looking for a quick entry point. I recommend setting up and playing all of these once or twice before you use them in a program. For the boxed games, setup can be one of the most intimidating parts, so don’t expect patrons to be involved with that. Volunteers who have learned the rules in advance can be great to sit and play with the patrons, which is part of why I focused on cooperative games. Plus, cooperative games are great at creating a bonding experience, win or lose. Most of these have tutorials on YouTube you can use to save time if you find the rule books cryptic.

Betrayal at House on the Hill

Premise: A group of people wanders into a creepy house and, based on a random event, one discovers they are the bad guy and tries to kill the others.

Halloween Themes:  This game was an auto-include because it drips with classic horror movie themes – the creepy house, werewolves, ghosts, demons, and so on. And it’s fun, just plain fun.

Anything Objectionable: I’m presuming that if you’re doing Halloween events you’re prepared for any objections to references to the supernatural. Nonetheless, although not vulgar, there are some endgame scenarios that reference things like demons more strongly than others. Know your community, as always. There is implied gore but it’s not strongly presented.

Target Audience: Tweens and up.

MSRP: $40, and there is a lot of value in the box for that.

Library Tie Ins: Nearly any horror writer or film is an indirect connection but some more than others, especially the “in the creepy house overnight” trope.

Game Style: This is a tile-laying cooperative game with a defector. The characters and lots of options make it a kind of role playing game in a box. The laying of house tiles as you explore creates randomness, as do card draws and dice rolls. This is very much an American style game that places style over strategy. A game mechanic causes a random member of your party to be revealed as the traitor (up until then that player is also unaware of it) and then selects one of many possible endgame scenarios. How the survivors and the defector are supposed to handle that is a secret to each side, detailed in separate booklets.

Accessibility: Most of the symbols are easily visible and clearly laid out. Text is decently sized and easy to read. The math and logic of the game are pretty easy for various cognitive levels. Some visual disabilities will cause struggles with text and part recognition, but not severe ones.

Learning the Game: At first glance the game can appear complicated; however, it’s simpler than it seems, so don’t let it scare you.

Patron Interaction: I’m not usually a big fan of defector-style games, but if ever there was a premise the mechanic was appropriate for, this is it. It is still mostly a cooperative game, and one where even losing can be fun.

Overall Opinion:  If there was a perfect game for Halloween this would be it. Even at $50 I would have recommended it, but at $40 it’s one of the cheapest modern board games with lots of replay value that you can get.

Werewolf

Different published versions are called things like Werewolf By Night and Ultimate Werewolf, sometimes with variant roles and rules.

Premise: The game alternates between night, when werewolves work in silence to kill villagers, and day, when the villagers try to find the werewolves in their midst.

Halloween Themes: Werewolves hunting villagers in the night. It would only be more Halloween if it had candy corn in it, and someone somewhere probably has rules for that.

Anything Objectionable: Some groups like to play up the heavy drama and gore of it, but the tone is really set by the players, so for a library program, keep it light.

Target Audience: All. Because it’s very rules-light this is the broadest of the games. Even little kids can play this game.

Cost: Variable ~ $10 / Free

Library Tie Ins: Anything with werewolves. Except Twilight. Because vampires don’t sparkle! The werewolves were decent in it though. They’re guilty by association however, so no Twilight.

Game Style: This is a party game, and as such its cornerstones are social interaction and the ability to handle large groups. The focus is on having fun through social interaction with minimal preparation and effort on the part of the organizer. That, combined with being free, sounds like the perfect library program, doesn’t it?

Accessibility: This is a perfect ten in terms of accessibility. All you have to be able to do is listen to a description of roles and see one card, and in a pinch you can even be told what that card is. A small amount of visual ability is needed, but I’ve played this with the legally (but not completely) blind and deaf with no issues.

Learning the Game: Less than five minutes.

Patron Interaction: This game is all about interaction. The key is to not take it too seriously. When villagers are eliminated I like to give them things to do in the background, like becoming a Greek chorus of howling wolves. Keeping the eliminated players involved is the biggest challenge.

More Information: Werewolf is a game that goes by a number of names, with decks printed by different people that you can usually buy for around $10. It was invented decades ago and has been played for free ever since with various hacks. I’ve played it on the fly with nothing more than scraps of paper with ‘villager’ or ‘werewolf’ hastily written on them. I’m going to give you a link to a page by a fellow named Max that covers the game’s history, some great information about the game and a free-to-print version of some nice cards with graphics if you want them:

http://maxistentialism.com/werewolf/

Next ….

Elder Sign

Premise: Using dice to represent the actions of heroic investigators, the players try to stop cultists and monsters from bringing the elder gods back into the world.

Halloween Themes: Otherworldly monsters, cultists, Elder Gods.

Anything Objectionable: Stay away from implying that the elder gods are somehow reflective of a real religion. Bring in the themes of the mythology as literature.

Target Audience: Tweens and up.

MSRP: $35

Library Tie Ins: Cthulhu Mythos stories, from Lovecraft to the approximately 17,492 self-published Cthulhu ebooks on Amazon.

Game Style: Cooperative dice game.

Accessibility: The main meat of the game is using dice to resolve challenges on cards, and while there is descriptive text, the dice and cards have easy-to-recognize symbols. The logic behind the dice prioritization is fairly easy to grok. The midnight doom cards are probably the biggest challenge for someone with limited vision.

Learning the Game: The game has a few awkward rules so it’s worth reading through the rule book a few times. It also has a mobile app that’s great for learning the game. Elder Sign has the most “gotcha” rules of any on this list.

Patron Interaction: As a cooperative game it encourages discussion and coordinated play.

Summation: I felt a moral obligation to include a Lovecraft mythos game in the list given its pop culture popularity. Elder Sign isn’t my favorite; I actually like Eldritch Horror more because of its RPG elements, but it has a higher price point, and for diversity’s sake there is an audience that loves dice games. And a lot of people do love Elder Sign. But if you have a bit more money available in your budget and your audience doesn’t include those who love dice games, I would consider Eldritch Horror instead.

 

Mysterium

Premise: 19th century psychics band together to try to get visions from a ghost and solve a murder mystery. Only one of the players knows who did what but they play the ghost and can’t talk to the other players.

Halloween Themes: Ghost Stories

Anything Objectionable: If spiritualism offends people this may be an issue but I assume that in that case Halloween may be an unhappy time for them in general.

Target Audience: Tweens and up.

MSRP: $50

Library Tie Ins: Ghost stories and murder mysteries. Think The Lovely Bones.

Game Style: Cooperative, but with a single information-isolated player.

Accessibility: Like the other games here, Mysterium does a good job of using large, clearly recognizable symbols, but some colors may be an issue for some kinds of color blindness.

Learning the Game: Mysterium is a fairly straightforward game. Most of the complicated elements are in the setup stage. I definitely recommend playing through two or three games before doing it with patrons.

Patron Interaction: A fully cooperative game with communication challenges can make this a great social experience. It’s like cooperative Clue with visions from ghosts. It’s awesome.

Soundtrack: The company that created Mysterium has a soundtrack available for download. Many of these games would benefit from a good soundtrack, of course.

http://www.libellud.com/actualites/mysterium-decouvrez-la-bande-son

Dread

Premise: An RPG with a wooden block tower instead of dice, leading to an increasing feeling of dread as the tower gets less stable. When you cause it to fall, you’re doomed.

Halloween Themes: Whatever you make them.

Anything Objectionable: Only if you create trouble for yourself. I know there are still anti-RPG people out there who think RPGs are tied to Satanism but fortunately most of those are obsessed with D&D and their numbers have dwindled to somewhere less than moon landing deniers and more than flat earthers. Remember, while causing the tower to fall over is described as how a character dies you can change this to something less violent to just remove them from the story.

Target Audience: The default audience of Dread is more mature, but thanks to it being an RPG you can modify the style of the stories and your presentation to make it appropriate for any age, and change the tone from high … well, dread, to whatever you want.

Cost: Free / $12 / $24 + s/h | + ~$10 for the “tower of dread”. The link below is where you can get the free PDF of the rules, buy a full PDF with scenarios and more information for $12 or order a $24 copy of the book.  Free is good.

About Dread the Game

Mechanics: One important thing to get out of the way related to cost is the “tower of dread.” Unlike games that use dice or cards for randomness, Dread uses a tower of wooden blocks that a Jenga tower happens to work perfectly for (though non-Jenga-trademarked ones work also). Chances are you might already have one around for other library programs, so that may or may not be a cost. One nice thing is that it lowers the barrier of entry in terms of teaching people and pretty much guarantees people stay in the story until the tower starts getting depleted.

Library Tie Ins: Whatever you make them.

Game Style: This is an RPG. However, Dread channels you into a storytelling-heavy environment, as it has very few mechanics. I encourage RPG storytellers to really involve the players and throw scenarios back to them with prompts like “how do you want this to resolve?” and then let them pull from the tower if necessary.

Accessibility: There might be some reading to do, but for someone very visually impaired you could do away with character sheets. The bigger problem, for someone with motor impairments or very low vision, will be pulling from the tower.

Learning the Game: You can get everything you need to know in four pages of light reading and teach it to others in about thirty seconds.

Patron Interaction: The good and bad of an RPG is that there isn’t a mechanical structure constraining the players. In a library program I would put a disclaimer in the setup that there is a certain social contract and that for this purpose they are cooperating.

Staying Under $100, aka Dropping a Game

If one game is to be dropped based on cost, I recommend Mysterium: it has a lot of cards to keep track of and one of the higher price points, though it is crazy stylistic and fun. If instead the goal is to keep the highest quality games and one needs to be dropped, I recommend dropping Elder Sign, but you may enjoy dice games more than I do.

Honorable Mention – King of Tokyo

King of Tokyo is a great kaiju-themed game that lends itself to silliness and fun tossing dice around. It’s very accessible and has few barriers to entry, including logical planning that children can do. It’s very thematic. It’s an American style open information board game, so older or more experienced players can help others and a wide variety of kinds of players can easily play together. It even has a Halloween-specific expansion. Unfortunately, it also needs the Power Up expansion to really be complete, and recently a second edition of the game came out with less cartoony graphics and without the Power Up expansion (yet). If you can get the first edition with the expansion it’s great for library programs. There is also a variant called King of New York, but it includes unnecessary additional rules that I think hurt it for teaching in a programming setting (plus the monsters aren’t as iconic, so not quite as cool). If you’re willing to play without the Power Up expansion this is definitely one to consider.

Further Playing 

There are so many Halloween appropriate games it’s impossible to list them all.

If you like tile laying and classic horror movie vibes, the Castle Ravenloft Board Game is very cool. It has a ton of setup though and is priced at about $65. I think the rules are also awkward at times, so be prepared to play it a few times and improvise occasionally. Dead of Winter is a great survival zombie game with a defector that can also be played fully cooperatively. A Touch of Evil is a competitive game with some wonky rules, but it is thematically perfect for Halloween and good with some rule hacking. I’ve not played either yet, but Fury of Dracula and Letters From Whitechapel are both defector games that look good too and are well reviewed in the board game community. As both are sitting on my shelf downstairs they are ones I’m likely to add to a future list.

Just on the Cthulhian game front: I previously noted I like Eldritch Horror as an RPG in a box. Mansions of Madness 2nd edition is a very cool game but pricey. Arkham Horror provides a good big box experience but it is pricey and time consuming with a huge number of parts. Unspeakable Words is a good Cthulhian word game. I can think of six more Cthulhu card games that are decent, and more RPGs than that. Honestly, that could be a post in its own right: good Cthulhian games for library programs.

A Partial History of SCLENDS

A few weeks ago Equinox Software published a blog post I wrote about Evergreen in 2009. My first draft and my final draft were very different. Draft by draft I stripped out the history of how SCLENDS started, not because I didn’t want to tell it but because in the larger Evergreen context it wasn’t what I wanted to say. The very fact that some remained, though, and that I started with so much, tells me something. It is a story I want to tell, and while that post wasn’t the place, this is. Why now? Honestly, during that first year we did a lot of “make it work and fix it later.” Document? If there’s time. It’s easy to be critical of that approach, but we had tight deadlines, and if it hadn’t been done the way it was it might never have happened. Now I have a little time to write it, and I want to do so while my memory is clear, at least for the elements of 2009 that stand out.

I’m not going to claim this is a complete history. Beyond the fallibility of memory, I doubt I know the whole story, and it’s naturally biased towards the events I was present for. SCLENDS was started by many people: library directors, circ managers, systems librarians and more. I worked with most of them, but some only tangentially. No single person was present for every conversation and no one person could know the whole story. Since I’ve admitted that this will be an incomplete telling, I will also offer that I’m going to try to keep it brief. The story begins properly with the development of writing in ancient Mesopotamia and Egypt … just kidding.

In 2008 I was the Systems Librarian in Florence County, South Carolina. The library’s director, Ray McBride, and I had been deeply involved in the process of re-evaluating our technology plan. One thing we were not concerned about was our ILS. We were very happy Horizon users and had assumed that we would upgrade to Horizon 8 when it was released. It had already been delayed, but why would we consider other options? Going out for an RFP is a process to be avoided like an invasive, unnecessary medical procedure. Plus, we were happy with Horizon: it was user friendly, it fit our needs and it was stable. Sure, it had gotten a little long in the tooth, but the upgrade would give it the refresh it needed.

Then one day I was reading through my daily mail and there was a correspondence from SirsiDynix. Horizon 8, Rome, was being canceled. Instead they would take the modern code base of their other product and merge it with the user friendliness of Horizon, and like tunes being played together it would be Symphony. It was the kind of over-the-top marketing speak that made it clear they were trying to make users feel positive about news they knew we would be unhappy with. They were right about the unhappy part.

Fast forward and we had a meeting. I had compiled a list of possible ILSes we could upgrade to. Polaris was a strong contender. We seriously looked at Symphony, hoping for the potential of an easy migration. There were others we dismissed due to expense or lack of features. There might have been another we considered that I can’t remember now. And I threw Evergreen onto the stack for consideration.

Why did I suggest Evergreen? Florence was an almost pure Windows server environment, and this was a radical departure. Despite my preferences I hadn’t tried to convert the Florence environment to Linux; given the library’s staffing limitations and its investment in applications that ran within a Windows environment, Microsoft made sense. Migrating to a mission critical application on Linux was a big departure. But when I looked at the growth of open source, what I saw happening in the Evergreen community and my own opinions about the relationship between open source and library philosophies, I was of the conviction that we should consider it. Not go to it, just consider it. Frankly, with my time limitations an easy upgrade to Symphony sounded pretty good to me.

We formed a committee of public service staff and administrators. We invited in representatives from companies to talk about their ILSes. Evergreen was open source, so I distributed a fact sheet. We had reps from Polaris and SirsiDynix come in. We talked to other libraries. One library referred to recent updates to Symphony in …. unflattering terms and told us they were migrating to Polaris as soon as they could. Others were only slightly kinder. Polaris looked good but didn’t blow us away. A SirsiDynix representative made it clear that migrating to Symphony would not be like an upgrade and that some Horizon functionality did not have one-for-one parity in Symphony.

Discussions were lively, but in the end we selected an ILS: Evergreen. At that point Evergreen was about version 1.2 and rough. As we talked about it one theme came up again and again: we believed that whatever shortcomings Evergreen had at that point in mid 2008, it was the right long term choice for us. We believed that in time it would match and exceed the other options we had to pick from. We also wanted a choice that we felt would last us ten years. I think it was Ray who said later that this would be the last ILS a library would ever need to migrate to. He may well be proven right; only time will tell.

It can be strange what you remember. It was a Thursday afternoon in November when I was having coffee with Ray. We were discussing Evergreen and forming our plans for the migration. One of my concerns was long term support, especially if I left. We began discussing approaching an external company for support of our servers. That would give me more time to spend in the community and give the library continuity of support regardless of staff turnover. As we looked we also began to discuss moving to remote hosting, and we increasingly liked the idea, though it meant moving nearly all technical management outside the library, not something we had traditionally done. While we had put a lot of value on internal staff management of technology, we also had increasing needs without an increasing budget, so a remote hosting option made sense.

All of this, especially the budget concerns, was in my head when I threw out another idea. In one sense, this was the start of SCLENDS: what if we invited others to join us to start a consortium and reduce costs? Ray liked the idea and threw it out to the South Carolina library directors’ listserv. From there I became a peripheral part of the story until January. From the periphery I was aware that the offer was made and interest returned. I was tasked with inviting a vendor who could run servers for us.  The clear option was Equinox, having been founded by the original developers and administrators of Evergreen at Georgia PINES.  Additionally, they had a lot of experience with startup consortiums, so they would understand what we were embarking on.

December passed and January of 2009 arrived. I found myself in the large meeting room at the Florence Library. The interested libraries were arriving. Eleven libraries in total attended that meeting, interested in sharing costs and materials in a new consortium. That meeting brought together not only the directors but also the systems administrators and circulation managers of the libraries.

Eleven libraries were present, and ten of them went on to form SCLENDS. Honestly, that day was a blur of faces and voices. One person whose name I don’t hear mentioned much in connection to SCLENDS, but should be, is Catherine Buck-Morgan. Although I don’t know this for a fact, I suspect she is the one who created the name (had it been left to me I probably would have chosen something tree related). Additionally, she was a critical part of this happening. It may have happened without her involvement, it may not have, I don’t know. I do know it wouldn’t have happened as quickly or the way that it did.

Catherine was the head of IT at the State Library and closely involved with the distribution of LSTA money in the state. I later discovered that she had already written a concept paper for creating a resource sharing consortium in South Carolina. I don’t believe her idea was inherently based on open source, but she did cite PINES as an example of what she was thinking of in terms of resource sharing. Her idea hadn’t been circulated outside the State Library, but ours dovetailed with it perfectly. She was critical to getting us the LSTA funding to kickstart the migrations.

SCLENDS would quickly move to a self-sufficient model independent of LSTA and State Library money, but those funds paid for the first two years of hosting and many of the migration expenses over two fiscal years that included our first three waves of libraries. Partial funds also helped one later wave.

Honestly, I thought the idea would be a much tougher sell than it was. Eleven libraries attended that first meeting and I had imagined half would back out. In the end only one, Greenville County, chose not to join SCLENDS, objecting to sharing their videos with other libraries. Most of these discussions happened in January and early February. Then we got to work. In less than five months, driven in large part by a window of opportunity for grant monies, we went from a first meeting to go live.

Wave one went live in late May 2009 and consisted of the State Library itself, the Union County Library and the Beaufort County Library System. I later went to the State Library myself for a tenure as the IT Director there, where I ironically ended up working with the Union County director, Nancy Rosenwald. We had both taken positions there and had offices next to each other. I really enjoyed working with her, both within SCLENDS and at the State Library. She also had good taste in tea. Beaufort had one of the most dramatic go-lives when a construction crew cut their fiber line on the first day. The story the local newspaper printed was essentially “Evergreen Fails” instead of “No Internet at Library.” I understand they later printed a retraction in small print in an obscure text box. Ray McBride, after a stint as a museum director, even took over the library system there, proving that it is a very small world. I discovered that Beaufort had been investigating Evergreen in 2008 as well, though not as far along nor with plans as definite as ours in Florence.

Wave two was in October of 2009 and included Fairfield County, Dorchester County, Chesterfield County and Calhoun County. I think I fought with Frank Bruno of Dorchester as much as I agreed with him. I remember his staff loved him because he supported them. He passed away last year and the world is poorer for losing him. Drusilla Carter left Chesterfield for Virginia, where she helped start talks that may have led to their own Evergreen consortium, and eventually landed in Connecticut, where she is a part of Bibliomation, another Evergreen consortium. Kristen Simensen is still at the Calhoun County library and fighting the good fight. Sarah McMaster of Fairfield retired right around the same time I left South Carolina, and her last SCLENDS meeting was, I believe, my last one as well. Aside from liking Sarah personally, professionally speaking there isn’t a library in the country that would not benefit from having a copy of Sarah on staff.

Finally, wave three went live in December and included my own library, Florence. Shasta Brewer of the York County library became a close co-worker of mine over those months and became the leader of the early cataloging discussions. Faith Line of Anderson had previous consortium startup experience and long continued to be a voice that people looked to for leadership on the executive board. I believe it was Faith who suggested the creation of the working groups to aid in the migration that eventually became the main functional staff bodies of the consortium. Even when there were later attempts to expand or redefine them, the original ones persisted as the main ones. In Florence, Ray served as the chairman of the board during the infancy of the consortium and, after leaving, came back to another SCLENDS library.

And there were others – other staff, other stories and later other libraries, which brought yet more staff and stories. SCLENDS grew over the next few years. But those stories belong to other years. I may or may not write about them some day, but I think they’re better documented, so there is probably little need. Did I leave some things out? Sure. The Thanksgiving Day Massacre. The Networked Man Incident. The Impossible Script Mystery. Probably others as well, and they make for fun stories, but they aren’t core to the history, I think.

– Rogan


Sound and Fury: Choosing an ILS

I published this article a few years ago in Computers in Libraries.  Nothing in it will be revelatory for most open source advocates but at the time I got a lot of feedback from librarians that it was useful.  CiL’s exclusive publication window is long since past and I just thought of it the other day so I thought I would re-publish it here.  Four years later I still think it’s spot on though I would probably make some changes to either shorten it more or make it longer with practical examples.  


Few decisions cause a library director to fret more than choosing a new integrated library system (ILS).  A new ILS is expensive in money, staff, time and stress no matter what you acquire.  Additionally, the wrong choice can bear costs in morale with lasting consequences.  Sometimes it is easy to identify which ILS is wrong for you – the contract costs are too high or maybe the features you need are not present.  But, too often, selecting the right one is like going to a car dealership where everyone speaks in tongues and the price lists are encrypted.

This is the result of a decade of market disruption.  Once upon a time proprietary ILS vendors were not optional.  Picking the right ILS was fraught with danger but not conceptually difficult.  Two changes in the market have had an enormous impact.  One of these, the growth of applications as services, has added new options to the ILS selection process.  However, it has been the growth of open source ILSes such as Koha and Evergreen that has made it necessary to rethink the selection process.

Choosing between an open source and a proprietary solution is not a choice between peaches and pineapples.  Frequently it is assumed that the two types of ILSes cannot be evaluated by the same criteria.  In fact, they can be.  Although they result from radically different economic models and divergent philosophies, in the end both are products and services that can be defined by a library’s needs and resources for the purposes of acquisition.  Four major criteria must be compared – product cost, features, communities and support.  Until open source disrupted the ILS market one could safely ignore communities.  The community of a proprietary ILS product might have added value, but it was unlikely to either make or break the selection of an ILS.  Now community plays a much more important role, but that will come after we look at the other criteria.

Perhaps the first thing to dispel is the myth that open source should be discussed as the cheap option.  The wise library administrator will realize that while many of the best things in life are free, your ILS isn’t going to be one of them.  Your cost won’t always be in legal currency.  I have met staff so traumatized by a bad migration that they are still visibly shaken years later by what is now a stable and reliable tool.  The library has paid an ongoing price in terms of post-traumatic stress, and that cost can be too high.  The best migration is pointless if the library’s experience falls apart within a year or two, causing the whole thing to happen again – an experience I’ve seen with both open source and proprietary systems.

Each of the four criteria can have multiple metrics as well as multiple vectors to plot them on, for both a migration and ongoing support.  In the end a data set for ILS selection should probably look more like a scatter chart than a report card.  Now, can we simplify the process?  The answer is yes.  A detailed consideration would be worthy of its own book, but by taking a few conservative shortcuts we can sketch a road map for your selection process in a period of time that isn’t comparable to earning another master’s degree.  For example, we will assume that you have the same vendor handle both the migration and ongoing support.  This will not be a road map for the adventurous.  This is for those whose boards will require that all core functions work on the day of go live with minimal surprises.  And being conservative does not exclude you from an open source solution.

First, do a needs assessment.  This is the point at which many upgrade processes fail.  Rather than say something like “we need acquisitions” or “we need EDI” do use cases and narratives.  These should create an unambiguous picture of your needs.  Be careful to not attempt to recreate your existing ILS.  This is the point at which libraries realize how deeply embedded their current ILS is in their operations.  The documents you produce at this stage will be used extensively in working with vendors.  Be honest about what you need and what is merely on a wish list.

Now, find your vendors.  Don’t even worry about the ILS itself yet.  That may sound heretical in an ILS selection process, but you need to safeguard against fixating on a single product and not evaluating honestly.  Fixate on your needs instead.  Some vendors will support multiple ILSes, and at this stage you are looking at who can provide support during a migration and on an ongoing basis.  Look at each vendor’s ability to support hardware and provide reliable access, their expertise with the ILS, their ability to find solutions, their training resources and their expertise at setting a system up.  Do yourself a favor and look in depth at their experience with data migration – it is surprisingly hard to do well.  Do not let a vendor make vague promises about your data.  Looking at vendors before solutions may seem to be putting the cart before the horse, but in the long run the greatest frustration most libraries have with an ILS doesn’t stem from software but support.  At this point you should also rank yourself as a vendor to see if you want to fill some of these support roles yourself.  Be honest about your ability to sustain support.  Many libraries begin projects that falter when key personnel leave because the skills are not part of the institution.

Many open source advocates argue that support is an inherent advantage of open source.  Some libraries delay leaving an ILS whose support they are unsatisfied with because of the stresses of migration.  If you use a vendor for support of an open source ILS, they cannot lock you into the ILS itself.  Once your contract is up, if you leave a support vendor you can extract your data and import rather than migrate it into a new system.  That ease of changing support vendors without changing software means that open source support companies have to compete on the basis of support, because the threshold of difficulty for the library leaving is reduced by orders of magnitude.

The next step is to define what kind of support contract you want.  Do you want a local install with minimal support, local with remote administration or perhaps an application as service where you sign a check and everything is just made to happen?  At this point evaluating yourself as a potential vendor will help you determine if you want to exclude yourself.  A product supported fully by a reputable vendor with skilled support staff is what you’re looking for.  Increasingly, the choice most libraries make is buying an application as service.  Vendors can take advantage of high capacity Internet connections and big virtualization systems to achieve economies of scale and offer remote hosted ILS services much more cheaply than a library can offer locally.  But you may have factors, such as needed response times, which make a local installation more attractive.  Knowing what kind of support contract you need, you can begin looking at the packages offered by the vendors and dramatically simplify the rest of the process.

Next, make two lists to look at support and features separately.  Vendors need their feet put to the fire to answer if they can fill your needs, which is why the use cases and narratives are critical.  Find out what the vendors’ uptime guarantees are, what their response times are and what tiers of support they offer.  For example, do they handle user interface level troubleshooting, will they do custom development to solve issues, or do they simply do systems administration?  What services do they offer during the migration?  Can they extract your old data?  Can they offer project management or training?  Will they offer documentation?  Now, some of these resources may originate in part or whole from a community but at this point worry about the availability through the vendor and their obligation to you to make it happen.  List the support levels of each vendor.  Go back to Buying An ILS 101, call references and do every other thing you would do with any big-ticket purchase.  

Parallel to support, review the ILSes themselves and isolate what software will be viable for you.  Make sure the vendors support those features and how you want to use them.  I’ve had clients spend a lot of time preparing to move to a system only to have their vendor say, “we don’t support serials” even though the ILS has the functionality.  Return to those use cases and narratives your staff developed earlier.  While sharing the use cases, get a detailed analysis of what your experience using the ILS(es) will be.  Having the vendor build a comparatively scaled system (numbers of patron, bib, copy records, etc.) for you to test against is critical for applications as services.  More than one library has been burned by not seeing their data run at scale and not doing hands-on features testing.

Think about future features too.  Will there be things you can’t anticipate or can’t live without?  Will the new social network that everyone gushes over be critical two years from now?  Can the company you are working with provide you with development options?  If not, then open source may provide you with other kinds of development paths depending upon the community surrounding it.  What about wish list features?  Maybe, like William Henley, you want to be the captain of your own soul, or at least your own ILS.  Ask yourself if you want to make changes to the software and control those changes in the future.  If the answer is a firm yes, then you probably want an open source ILS and will need to allocate resources for development.  Don’t automatically discount a proprietary vendor, but giving you that control is not usually a part of their business model.

It is also worth asking if you want to be part of a consortium.  Although really large resource sharing consortiums aren’t unique to open source, they do seem to be more common with the growth of the Evergreen community – SCLENDS, for example.  Materials sharing may or may not be on your agenda, but adding to an existing installation has a lot of advantages, including a built-in local community to draw on.

Since applications as services are delivered over Internet connections, it is important to know the impact they will have on your connection.  Prolonged profiling will tell you when you may have interruptions in service and what delays in response time you may have.  Map obscure phrases like “ping times” and “drop rates” to real measurements like “it will take 2 seconds to check out an item.”  Often, workflows can be adjusted to handle the increased latency from moving an ILS from inside your network to remote hosting, but an unexpected impact like that can heavily damage morale.  This is a time to bring in heavy-duty network expertise and make sure they go over issues with a fine-toothed comb.

Finally, we get to every library’s least favorite topic that isn’t protected by confidentiality laws: budgets.  Take those support options and the ILSes by the vendors you find acceptable and map them against how much you have to spend.  Any that you can’t afford, toss.  What you’re left with is ILSes that will work for you, companies you can trust to support you and an experience you can afford.  Be wary of rushing into support contracts for applications as services though.  Compare your costs across the lifespan of the longest contract you would have to sign – which should be three years.  A longer contract than that which locks you in should be a concern.  Make sure there are guarantees about maximum rate increases and reasonable rates for extracting data. 

Many an ILS has been chosen because the library administrator feels overwhelmed.  An implementation by the current ILS vendor can seem like an easy and safe choice. That is a poor assumption to make.  In the course of development or corporate acquisitions sometimes the upgrade path defined by a vendor is actually to a whole new product.  When that happens an upgrade is really a migration.  So, don’t be deceived by the potential level of difficulty of the project.  Vendors like to define upgrade paths because they know many local governments provide clauses that allow organizations to upgrade without going through a competitive bid process.  That’s also how you can get stuck doing two migrations in two years – something no one wants to do.

At this point you may be ready to select an ILS, but you should take one more step.  So far we have flattened out the modern twists to ILS selection and used a model built on common sense.  The next criterion is not a leap into uncommon sense, but it is much harder to define.  Evaluating community requires the administrator to understand how their staff, as professionals, will interact with a larger community rather than just perform workflows.  Community is not an open source specific criterion, though it might be more central to those ILSes.  The communities of proprietary ILSes can be hampered or facilitated by the corporation linked to the ILS.  Open source ILSes are built by their communities but may still have large corporate presences.  When evaluating those dynamics don’t frame the discussion as business versus community, as that’s a false comparison.  Evaluate the businesses as members of the community by their actions and consider that when developing a picture of the whole community.

As you investigate vendors, how they interact with communities might tell you something about the character of the company.  Does it allow for independent user groups and conferences?  Are there email lists and public forums?  Are there places to share and ask questions of others who use the ILS?  Some proprietary ILS vendors have encouraged these things and allowed outside repositories for documents.  In open source communities these are the norm.  Do not underestimate the value of community.  Not only do ILS communities help you make the most of one of the largest pieces of your infrastructure, but an active, engaged community can also be invaluable for the professional development of your staff.

Look at your resources and ask if you are the kind of organization that is ready to be part of a larger community or if you prefer to play alone on your own ball field.  Sometimes it is the larger libraries that are less prepared to be strong community members because they are accustomed to making decisions as an independent entity.   In the end you may choose to leave community out of your considerations for an ILS selection. However, at least some awareness of the larger community should always be there, to compare experiences with a vendor at the very least.  If you are using an open source ILS vendor and you successfully vetted them they should be involved in the community and may be a gateway to you becoming involved in the future if your priorities change.

At this point you have decided on the viability of a given ILS migration and looked at communities as added value.  Do you need a tiebreaker query?  If you do, look at what your gut tells you.  The truth is that for all of our development as a species we still sometimes process information subconsciously and have gut instincts that lead us well.  Do you have a philosophical leaning towards open source?  Does one vendor click as a partner?
 
If you are willing to endure some hardships you can play fast and loose with this process.  Risks can pay off, but it’s a luxury most boards don’t give their directors.  Open source succeeds where it is the best solution, not because of philosophical biases, just as commercial software succeeds when it does so on quality, not on spreading fear, uncertainty and doubt about the competition.  As we look critically at these solutions, their vendors and communities, we also have to look to the future.  Mark Twain said a sure way to look a fool is to try to predict the future.  But we need a sense of how the future of these ILSes will unfold, since we will be tied to one once we make that selection.  Communities and companies can be filled with amazing people who can make all the difference, and they can fall apart.  Engagement with partners in companies and communities is where we will see the future unfolding.  We need to remain aware of these dynamics – they are often why we end up moving to a new ILS after all.

Sound and Fury

Well, I got home from a road trip to find my comp copies of the July/August Computers in Libraries waiting for me, and some emails!  I sat down to re-read it because, frankly, I wrote it long enough ago that I don’t remember much of what I wrote.

http://www.infotoday.com/cilmag/jul13/index.shtml

The article is about open source, including Evergreen, and selecting an ILS.  A few quick things:

1) They gave it a nice attractive spread.  That’s vanity on my part but I like it. 

Front spread of the article.

2) I’m still happy with my opening paragraph.  “Few decisions cause a library director to fret more than choosing a new integrated library system (ILS).  No matter what you acquire, a new ILS is expensive in terms of money, staff, time and stress.  Additionally, the wrong choice can damage morale and have lasting consequences.  Sometimes it is easy to identify which ILS is wrong for you – the contract costs are too high or maybe the features that you need aren’t present.  But, too often, selecting the right one is like going to a car dealership where everyone speaks in tongues and the price lists are encrypted.”

3) They re-used an old bio bit for me from my days working at the State Library which is wrong.  I’m at the York County Library System now.

Now, for the email I got and my response:  

From Greg, full name withheld to protect the guilty 🙂  :

I just received my copy of the publication “Computers In Libraries”, July/August 2013. I thought your article “Sound and Fury” was an excellent guide for libraries considering a migration of their library systems, but I was a bit surprised that you cited “LibLime Koha and Evergreen” as examples of open source ILSs. I rather suspect that many open source people would regard LibLime Koha as open source only by the letter of the law, and not by spirit or community. Evergreen is indeed an excellent example of open source software, but I wonder if it suffers by its apparent close association in this context with LibLime Koha.

Koha (!= LibLime Koha) is a much more openly developed and community supported example of an open source application than the LibLime fork. Your article deals very well with the subject of selecting vendors; the paid-support page for Koha (http://koha-community.org/support/paid-support) lists 37 vendors world-wide (if my quick count is correct and deducting two entries for PTFS). I’m under the impression that only PTFS supports LibLime Koha, but perhaps there are others. Many of the listed Koha service providers provide hosted application (ASP) solutions as you mentioned in your article.

A quick count of my Koha mailing list messages for July 24-31 shows 86 entries (sorry, I got tired of counting after going backwards for one week), that probably extrapolates to about 350 messages per month. I don’t follow the free support for LibLime, but I’ve been told that it’s more questions than meaningful answers. Link of possible interest: https://listserv.nd.edu/cgi-bin/wa?A2=ind1308&L=web4lib&D=0&P=15401

Code contributions to the Koha development process are encouraged, with contributions and downloads available on a git code management system, and packages are available for Debian-based operating systems. Koha also has an IRC channel where developers discuss issues, and where users can <mostly> ask questions and get answers to problems they are experiencing. I’m not aware that LibLime Koha is as openly developed or freely supported. 

Again, I thought your article was excellent, but have misgivings about your citation of LibLime Koha instead of Koha as an example of open source software.

My ill-thought-out but honest response:

“Hi Greg,

I appreciate the feedback.  Looking back at the article I’m a bit chagrined about that.  I admit I’m an outsider in the Koha community though I have a fondness for any open source library project.

Just last week I got a chance to chat at length with a gentleman [name and association redacted to protect those who didn’t give permission to be used].  He actually reached out to me because of an upcoming talk I’m doing.  I was aware of some community conflict with LibLime, but he gave me a lot of context on the Koha vs. KOHA issues.  Suffice it to say that if I had known I would have mentioned Koha differently.  Technically what I said is correct, but it obviously doesn’t address the serious community concerns there, and looking at community is central to the issue I wanted to discuss.

Maybe on some level it’s best to not have written about that there.  It really is an issue that deserves discussion in more depth.  I’ve thrown out the idea to the editors of CiL of doing an all open source issue (it’s been about four years since they’ve done one).  If that happens I would love to work with someone to write about the Koha community issues in more depth. Still, whether it was the place for it or not I think I would have written that bit a wee bit differently.  I’m always glad to get opportunities to trigger discussion (even if the price I pay is putting my foot in my mouth occasionally).   “

 

Looking back and getting to read my article again, the reference doesn’t really detract from it.  It’s just a quick reference at the beginning, but I do regret it and feel that I should write something about communities in open source projects as a follow up, which gets me thinking about projects beyond Koha and Evergreen, failed and successful, to look at.

Inventories With Evergreen

Recently I’ve gotten a lot of questions about doing inventories in Evergreen because I’ve proposed, and am looking for funding partners for, a full-fledged inventory component for Evergreen.  I’ve heard folks complain about this being missing for the past five years, both inside and outside my consortium.

Inventories themselves can be complicated or not but follow a fairly simple recipe:

1) Find out what you have.

2) Compare it to what you should have. 

3) Correct the exceptions. 

Within that you can have a lot of diversity.  What are your controls on what you have?  Libraries have items moving around all the time, so there are a number of variables to control for.  And what level of correction do you perform?  This can range from the simple to the very, very complicated.  Additionally, are you doing this in item buckets or via SQL?  That determines a lot of the scale at which you can perform operations.  What is your project management like, and are there functions you want to have more segregated?  All of these are things I think baked-in functionality will help with.

Still, folks have managed to do their own inventories.   

Indiana has a great detailed writeup of their process here:   

http://www.in.gov/library/files/How_to_do_an_Inventory_Using_Evergreen.pdf

There was also a presentation done at the 2012 conference, but the site’s gone now and I can’t remember who gave it:  http://evergreen-ils.org/dokuwiki/doku.php?id=conference:2012

However, I get a lot of questions to the effect of “that’s all nice and good and the development would be nice, but I need to do an inventory now and we need something simple.”  So, here is the barebones process I set up for an SCLENDS member library and how I helped a Windows-based admin run it via SQL.  The admin can do it without SQL using smaller batches in buckets, but several steps become more difficult.

Mind you, where this can get most complicated is in correcting for human error, the kind of thing that, once it’s built into Evergreen, we can have the computer do more of for us.  For now we have to manually tell our tools to do these checks each time rather than having a programmer tell the computer in advance to do so in a stored manner.

This process isn’t perfect but can be done on large or small inventories and asynchronously by multiple groups. 

Step 1) Set everything in the set of items to be inventoried to trace if its current status is checked in.
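For those working via SQL, here is a minimal sketch of what step 1 can look like. It assumes a custom ‘Trace’ status has been added to config.copy_status (it isn’t a stock Evergreen status), and the circ lib and shelving location IDs are placeholders to swap for your own:

-- Flag every checked-in copy in the inventory scope as Trace.
-- Assumes a locally added 'Trace' status in config.copy_status;
-- 101 and 205 are placeholder circ lib and shelving location IDs.
UPDATE asset.copy
SET status = (SELECT id FROM config.copy_status WHERE name = 'Trace')
WHERE circ_lib = 101
  AND location = 205
  AND status = 0 -- 0 = Available ("checked in") in stock Evergreen
  AND NOT deleted;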

Step 2) Set up carts with scanners and laptops.  Ideally assign two people to each cart: one moves materials while the other scans.  Scan into the item status screen using the barcode.  Make sure that all item screens are set up the same way to show barcodes.  Turn up the volume on the error noise for a mis-scanned or un-barcoded item.  Pull items that aren’t in the catalog.  When done scanning, save the work out to a delimited text file named to indicate branch and shelving location.  All you really need to display is the barcode.  If you’re willing to pass on finding items that shouldn’t even be there during the initial pass, you don’t even need an Evergreen client.  With some libraries having spotty wifi, being able to just scan into Notepad was really nice.  It also made things more reliable: I found that the Evergreen staff client would occasionally crash, causing people to lose work.

If you do have a very high volume of errors in your collection, you will want to work in smaller chunks and not have groups work asynchronously.  In that case you may want to display status and correct as you go, especially checking in checked-out and lost items.  I don’t recommend correcting checked-out items by batch, as there are a lot of potential variables that impact customer service, unless you’re doing something like blanket wiping out the associated charges in patrons’ favor.  You may also want to show shelving location and branch to resolve those as you go.  An item’s call number will usually show you where something is out of place, but not always, especially if it’s at the wrong branch or is material from another library.

Step 3) Now, combine the text files.  The same functionality can be had with buckets, but I found relying on buckets was too slow for doing large updates.  If you can work via SQL, having everything in text files is convenient.  However, to avoid losing work in the staff client this meant a lot of small text files.  I would sort them by shelving location into different folders and then combine them.  These commands work on Mac and Unix, and on Windows with Cygwin, I believe.

ls > ../filestocat.txt

xargs < ../filestocat.txt cat > ../barcodes.txt

Essentially you’re making a list of everything in the directory and then using that list to concatenate the files into one big data file.  (Writing the list and the combined file to the parent directory keeps them out of the listing, so the file list doesn’t get concatenated into your barcode data.)

In a perfect world this step is done.  However, in the real world folks will invariably not follow some part of step 2 correctly (another reason for baked-in functionality).  So the list will probably need to be brought into Excel (or another tool for working with delimited data) to have mismatched columns corrected.

Step 4) Do reports and corrections.  This is the point at which you can get fancy or keep it simple.  Reports should be used to find things out of position.  You can do this manually or even have staff go through a list and find them.  If there are a huge number out of order, you may be better off just doing a shelf reading.  At a minimum, run an update statement to move anything in the list currently marked as trace back to checked in.  You want to check for the current status of trace in case an item was checked out in the interim.  You may want to run a list of the trace items for staff to look for.  You may want to do updates to correct branch and shelving location where those are wrong.  You may want to batch delete everything still listed as trace.  Whether or not you want to do a second inventory pass will depend on how many exceptions you found.
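For the SQL-inclined, the minimum correction might look something like this sketch. The scratch table and file names are made up for the example, and ‘Trace’ is again assumed to be a locally added status:

-- Load the combined scan file into a scratch table (names are illustrative).
CREATE TABLE inventory_scan (barcode TEXT);
-- From psql: \copy inventory_scan FROM 'barcodes.txt'

-- Anything scanned that is still in Trace was found on the shelf, so set it
-- back to checked in. Matching on the current Trace status guards against
-- items that circulated in the interim.
UPDATE asset.copy ac
SET status = 0 -- 0 = Available
FROM inventory_scan s
WHERE ac.barcode = s.barcode
  AND ac.status = (SELECT id FROM config.copy_status WHERE name = 'Trace');

-- Whatever is left in Trace is your exception list for staff to hunt down.
SELECT ac.barcode, acn.label
FROM asset.copy ac
JOIN asset.call_number acn ON acn.id = ac.call_number
WHERE ac.status = (SELECT id FROM config.copy_status WHERE name = 'Trace')
  AND NOT ac.deleted;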

Step 5) Rest and bask in the relaxed feeling of a perfectly shelved collection.  This usually lasts minutes rather than hours.  🙂

———————————– 

None of these steps are sacrosanct.  Each organization will probably adjust them, and the needs of your organization will determine much of that.  But this process is full of repeatable tasks that computers can do better and that, right now, we have to manage manually.  Instead of just adjusting a few preferences or org unit settings, we have to significantly rework the workflow and documentation for each change and trust humans to be accurate and precise every time, instead of letting the computer do the work for us.


Evergreen in Library and Book Trade Almanac 2012


There is an article in this year’s Library and Book Trade Almanac (57th Edition) about the consortial effect that only large scale ILSes like Evergreen can provide.  In fact, data from one Evergreen consortium, SC LENDS, is used as the basis.  I co-wrote the article with Bob Molyneux … well, it might be more accurate to say that I provided data and helped revise while Bob really wrote it, but I was honored to work with him on it.  I’m hoping next year to do a follow-up with both more SC LENDS data and data from another Evergreen consortium.  At least one other has already promised me data!

http://www.amazon.com/Library-Trade-Almanac-Bowker-Annual/dp/1573874396

A Practical Approach to Bibliographic De-duplication

De-Duplication Project

[ This is a duplicate of what I posted on sclends.net/projects/ ]

Early in the days of SC LENDS we faced the challenge of strict bibliographic de-duplication methods leaving our catalog messy for both staff and public. The issue wasn’t merely aesthetic; it affected the services and workflows we offer. Unwilling to accept the common wisdom that we had to live with it, we developed our own solution. The documents below give our history with the project and the code developed for it.

This presentation, given at the Evergreen International Conference, tells the story of the project:

10% Wrong, 90% Done: A Practical Approach to Bibliographic De-duplication
http://sclends.net/wp-content/uploads/2011/09/10PercentWrong.pdf

This PDF describes the technical aspects of the project:

Bibliographic De-duplication Based on Narrow Data Element Matches Between Records
http://sclends.net/wp-content/uploads/2011/09/SC_LENDS_De-duplication.pdf

This is the actual SQL code developed by Equinox for us:

http://sclends.net/wp-content/uploads/2011/09/sclends_dedupe.txt

10% Wrong, 90% Done: A Practical Approach to Bibliographic De-duplication and Bibliographic De-duplication Based on Narrow Data Element Matches Between Records are both licensed by Rogan Hamby and Shasta Brewer under:

Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Open Source Reality Check

Library Journal just published an article they interviewed me for, entitled Open Source Reality Check.

http://www.libraryjournal.com/lj/home/891350-264/open_source_reality_check.html.csp

Overall, I thought it was one of the more balanced and fair articles looking at open source from a high level perspective that I’ve read in mainstream library journals.  Open source isn’t perfect (it’s a human endeavor, after all), but it is a proven model, though individual projects can fail.  The article spends a lot of time looking at KCLS, which is to be expected.  The bit they quoted from me was me discussing, and dismissing, the idea that open source is somehow magically less stable than commercial software.  The factors that make an open source project stable have to do with scale, which is also true of commercial software; it is just the metrics that vary.

Oh, and they got my place of employment wrong.  At the time of my LJ Movers & Shakers award I was with the Florence County Library System, but I’m now the Director of IT and Innovation for the State Library in SC.

Conference Presentations

I’m only about a month late, but I’ve finally gotten my presentations for the conference up on SlideShare, linked at the bottom.

The deduping presentation was my favorite to prepare for, and the feedback was both mixed and wonderful. A lot of people had reservations about our approach, as they should. We decided to make trade-offs, and when you do that you should carefully consider the ramifications. We’re still trying to get the final stamps and notaries and DNA samples to sign off on publicly releasing the code. I thought it would be done a good while back, but we haven’t stopped moving forward on it.

I loved seeing interest in new ways of thinking about challenges like record deduping. The kinds of consortiums that Evergreen is fostering are new in many ways and bring with them new twists on old challenges, and with that a need to come at solutions with open eyes. I’m always enthused to watch the development list because we have so many great coders in the Evergreen community who are constantly attacking problems and reinventing things. Coders are (usually) good at being willing to get rid of outdated code and adopt new approaches; the failures to do this have direct consequences and are sometimes pointed out in the open source world as warnings to others, like tragic beached whales. And while our coders are often great people who love libraries, they aren’t the whole community. Too often we look at software as both a tool and a solution, when some solutions come from outside the packaged software. Librarians as a whole need to be better at that, at taking ownership of our own problems.

I’m proud to say that SC LENDS owns plenty of problems, and deduping has been one of those, so we’ve found our own solution. The conference has made me eager to work on it more and refine it further. This is definitely one of my goals before our next big wave of migrations.

The Becoming Our Own Vendor presentation was interesting because I’ve taken a long route in thinking about the issues in it. I’m a details person, and while I’m good at tying details to a big picture, I think in terms of consequences, not theories. Framing these issues in a conceptual model for communicating to others is not my best talent, though I continually work on it.

I originally thought of it as a discussion of governance issues. Then I realized while working on it that governance isn’t really the issue; it’s one of contributed labor, and that is a very open source issue because we’ve adopted a very meritocratic approach to dealing with it. And then at some point after the conference I realized, in one of those “how could I have been so stupid” moments, that of course in open source labor is governance. I had been talking about governance all along and did what I set out to do, but like the vertically challenged and the elephant, I’ve been seeing it differently depending upon where I stand because I’m very close to it.

http://www.slideshare.net/roganhamby/10-wrong

http://www.slideshare.net/roganhamby/becoming-our-own-vendor