A Partial History of SCLENDS

A few weeks ago Equinox Software published a blog post I wrote about Evergreen in 2009. My first draft and my final draft were very different. Draft by draft I stripped out the history of how SCLENDS started, not because I didn’t want to tell it but because in the larger Evergreen context it wasn’t what I wanted to say. The very fact that some of it remained, though, and that I started with so much, tells me something. It is a story I want to tell, and while that post wasn’t the place, this is. Why? Honestly, during that first year we did a lot of “make it work and fix it later.” Document? If there’s time. It’s easy to be critical of that approach, but we had tight deadlines, and if it hadn’t been done the way it was it might never have happened. Now I have a little time to write it, and I want to do so while my memory is clear, at least of the elements that stand out from 2009.

I’m not going to claim this is a complete history. Beyond the fallibility of memory, I doubt I know the whole story, and it’s naturally biased towards the events I was present for. SCLENDS was started by many people: library directors, circ managers, systems librarians and more. I worked with most of them but some only tangentially. No single person was present for every conversation and no person could know the whole story. And since I’ve admitted that this will be an incomplete telling, I will also offer that I’m going to try to keep it brief. The story begins properly with the development of writing in ancient Mesopotamia and Egypt … just kidding.

In 2008 I was the Systems Librarian in Florence County, South Carolina. The library’s director, Ray McBride, and I had been deeply involved in the process of re-evaluating our technology plan. One thing we were not concerned about was our ILS. We were very happy Horizon users and had assumed that we would upgrade to Horizon 8 when it was released. It had already been delayed, but why would we consider other options? Going out for an RFP is a process to be avoided like an invasive, unnecessary medical procedure. Plus, we were happy with Horizon: it was user friendly, it fit our needs and it was stable. Sure, it had gotten a little long in the tooth, but the upgrade would give it the refresh it needed.

Then one day I was reading through my daily mail and there was a letter from SirsiDynix. Horizon 8, Rome, was being canceled. Instead they would take the modern code base of their other product and merge it with the user friendliness of Horizon and, like tunes being played together, it would be Symphony. It was the kind of over-the-top marketing speak that made it clear they were trying to make users feel positive about news they knew we would be unhappy with. They would have been right about the unhappy part.

Fast forward and we had a meeting. I had compiled a list of possible ILSes we could upgrade to. Polaris was a strong contender. We seriously looked at Symphony, hoping for the potential of an easy migration. There were others we dismissed due to expense or lack of features. There might have been another we considered that I can’t remember now. And I threw Evergreen onto the stack for consideration.

Why did I suggest Evergreen? Florence was an almost pure Windows server environment and this was a radical departure. I hadn’t tried to convert the Florence environment to Linux, despite my preferences, because with the staff limitations the library had and the applications they had invested in running within a Windows environment, Microsoft made sense. Moving a mission-critical application to Linux was a big step. But when I looked at the growth of open source, what I saw happening in the Evergreen community and my own opinions about the relationship between open source and library philosophies, I was convinced that we should consider it. Not go to it, just consider it. Frankly, with my time limitations, an easy upgrade to Symphony sounded pretty good to me.

We formed a committee of public service staff and administrators. We invited in representatives from companies to talk about their ILSes. Evergreen was open source so I distributed a fact sheet. We had reps from Polaris and SirsiDynix come in. We talked to other libraries. One library referred to recent updates to Symphony in … unflattering terms and told us they were migrating to Polaris as soon as they could. Others were only slightly kinder. Polaris looked good but didn’t blow us away. A Sirsi representative made it clear that migrating to Symphony would not be like an upgrade and that there was Horizon functionality without one-for-one parity in Symphony.

Discussions were lively but in the end we selected an ILS: Evergreen. At that point Evergreen was about version 1.2 and rough. As we talked about it one theme came up again and again. We believed that, whatever shortcomings Evergreen had in mid 2008, it was the right long-term choice for us. We believed that in time it would match and exceed the other options we had to pick from. We also wanted a choice that we felt would last us ten years. I think it was Ray who said later that this would be the last ILS a library would ever need to migrate to. He may well be proven right; only time will tell.

It can be strange what you remember. It was a Thursday afternoon in November that I was having coffee with Ray. We were discussing Evergreen and forming our plans for the migration. One of my concerns was long-term support, especially if I left. We began discussing approaching an external company for support of our servers. That would give me more time to spend in the community and ensure support regardless of staff turnover. As we looked we also began to discuss moving to remote hosting, and we increasingly liked the idea even though it meant moving nearly all technical management outside the library, not something we had traditionally done. We had put a lot of value on internal staff management of technology, but we also had growing needs without a growing budget, so a remote hosting option made sense.

All of this, especially the budget concerns, was in my head when I threw out another idea. In one sense, this was the start of SCLENDS. What if we invited others to join us to start a consortium and reduce costs? Ray liked the idea and threw it out to the South Carolina library directors’ listserv. From there I became a peripheral part of the story until January. During that time on the periphery I was aware that the offer was made and interest returned. I was tasked with inviting a vendor who could run servers for us. The clear option was Equinox, having been founded by the original developers and administrators of Evergreen at Georgia PINES. Additionally, they had a lot of experience with startup consortiums, so they would understand what we were embarking on.

December passed and January of 2009 arrived. I found myself in the large meeting room at the Florence Library. The interested libraries were arriving. Eleven libraries in total attended that meeting, interested in sharing costs and materials in a new consortium. That meeting brought together not only the directors but systems administrators and circulation managers of the libraries.

Eleven libraries were present and ten of them went on to form SCLENDS. Honestly, that day was a blur of faces and voices. One person whose name I don’t hear mentioned much in connection to SCLENDS, but should be, is Catherine Buck-Morgan. Although I don’t know this for fact, I suspect she is the one who created the name (had it been left to me I probably would have chosen something tree related). She was also a critical part of this happening. It may have happened without her involvement, it may not have, I don’t know. I do know it wouldn’t have happened as quickly or in the way that it did.

Catherine was the head of IT at the State Library and closely involved with the distribution of LSTA money in the state. I later discovered that she had already written a concept paper for creating a resource sharing consortium in South Carolina. I don’t believe her idea was inherently based on open source but she did cite PINES as an example of what she was thinking of in terms of resource sharing. Her idea hadn’t been circulated outside the State Library but this had dovetailed with it perfectly. She was critical to getting us LSTA funding to kickstart the migrations.

SCLENDS would quickly move to a self-sufficient model independent of LSTA and State Library money, but those funds paid for the first two years of hosting and many of the migration expenses over two fiscal years that included our first three waves of libraries. Partial funds also helped one later wave.

Honestly, I thought the idea would be a much tougher sell than it was. Eleven libraries attended that first meeting and I had imagined half would back out. In the end only one, Greenville County, chose not to join SCLENDS, objecting to sharing their videos with other libraries. Most of these discussions happened in January and early February. Then we got to work. In less than five months, driven in large part by a window of opportunity for grant monies, we went from a first meeting to go live.

Wave one went live in late May 2009 and consisted of the State Library itself, the Union County Library and the Beaufort County Library System. I later went to the State Library myself for a tenure as the IT Director there, where I ironically ended up working with the Union County director, Nancy Rosenwald. We had both taken positions there and had offices next to each other. I really enjoyed working with her both within SCLENDS and at the State Library. She also had good taste in tea. Beaufort had one of the most dramatic go lives when a construction crew cut their fiber line on the first day. The story the local newspaper printed was essentially “Evergreen Fails” instead of “No Internet at Library.” I understand they later printed a retraction in small print in an obscure text box. Ray McBride, after a stint as a museum director, even took over the library system there, proving that it is a very small world. I discovered that Beaufort had been investigating Evergreen in 2008 as well, though not as far along nor with plans as definite as ours in Florence.

Wave 2 went live in October of 2009 and included Fairfield County, Dorchester County, Chesterfield County and Calhoun County. I think I fought with Frank Bruno of Dorchester as much as I agreed with him. I remember his staff loved him because he supported them. He passed away last year and the world is poorer for losing him. Drusilla Carter left Chesterfield for Virginia, where she helped start talks that may have led to their own Evergreen consortium, and eventually landed in Connecticut, where she is a part of Bibliomation, another Evergreen consortium. Kristen Simensen is still at the Calhoun County library and fighting the good fight. Sarah McMaster of Fairfield retired right around the same time I left South Carolina, and her last SCLENDS meeting was, I believe, my last one as well. Aside from liking Sarah personally, professionally there isn’t a library in the country that would not benefit from having a copy of Sarah on staff.

Finally, wave three went live in December and included my own library, Florence. Shasta Brewer of the York County library became a close co-worker of mine over those months and the leader of the early cataloging discussions. Faith Line of Anderson had previous consortium startup experience and long continued to be a voice people looked to for leadership on the executive board. I believe it was Faith who suggested the creation of the working groups to aid in the migration, groups that eventually became the main functional staff bodies of the consortium. Even when there were later attempts to expand or redefine them, the original ones persisted as the main ones. In Florence, Ray served as chairman of the board during the infancy of the consortium and, after leaving, came back to another SCLENDS library.

And there were others – other staff, other stories and later other libraries which brought yet more staff and stories. SCLENDS grew over the next few years. But those stories belong in other years. I may or may not write about those stories some day but I think they’re better documented so there is probably little need. Did I leave some things out? Sure. The Thanksgiving Day Massacre. The Networked Man Incident. The Impossible Script Mystery. Probably others as well, and they make for fun stories, but aren’t core to the history I think.

– Rogan


Evergreen Conference 2016 and After Thoughts

Once a year I let some time pass after the Evergreen Conference before I try to capture my thoughts about it.  Finding myself in a contemplative mood this evening, I finally decided to do it.

What should I write about?  The NC Cardinal folks did a great job; it’s an insane amount of work and they tackled it well.  There were a lot of great presentations.  The hospitality staff running the meeting rooms at the Sheraton were wonderful.  The Resistance was the game of the conference and I had a great time playing it.  As a member of the response team I was heartened that I was unneeded.  Honestly, I’ve been to far larger library events that could take the balanced, relaxed environment and professionalism of the Evergreen conference as a model.  I had a great time meeting new folks at breakfasts and dinners.

My SQL pre-conference workshop went well.  One person told me that I really helped them with things they had struggled with.  Another told me they used the notes from my workshop last year during the entire intervening year.  Being told things like that makes all the work worth it.

My statistics-heavy presentation went well and I think I kept everyone awake, even though by the end I had created more questions than I had answered.  I showed some clear relationships and the likelihood that holds data can be predicted if we can get enough data sets to compare and account for the variables influencing holds.  I think the data also clearly shows the value of sharing materials in a consortium.  I have a dozen thoughts on this that will be their own blog post at some point.

The biggest thing that stands out thinking back on it, though, is the lack of surprises.  In the early days of the Evergreen Conference I never quite felt like I knew what to expect.  Enthusiasm and passion for Evergreen are as strong now as they were at the very first Evergreen Conference, but things have changed.  In the early days of the conference we had presentations about things like “How We Made Evergreen Work For Us.”  I stood at the front of the room doing a few of those myself.  Those are long gone.  The experiences, and the presentations that reflect them, have, for lack of a better term, matured.  So has Evergreen.  So has the community.

We don’t have everything figured out but we’re not trying to figure out if we can manage the challenges either.

This weighs heavily on my thoughts because I saw an article that implied that open source software isn’t as mature as proprietary solutions.  I feel the implicit assumptions were numerous and would take more time than I have here to deconstruct, but again, that might be a good future blog post or article.

Obviously, the perception of non-users of the software and non-members of the community doesn’t sync up with that of those who do use it and are members of the community.  I’m not saying my feelings are universal but upon talking to others I know they are widely shared.  So, why?

I believe Evergreen falls into a common pattern of technologies maturing.  Indeed, open source itself does.  Open source is a development methodology but it’s also a shared platform of technologies that build upon each other in a chaotic way more akin to natural selection than design.  Why can people who see the adoption and maturation patterns of something like DVD players not see that it isn’t that different for software?  I don’t know.  Much like my consortial data presentation, I feel like I’m leaving this with more questions created than answered, but maybe that’s a good sign that I’m on the right path.


Sound and Fury: Choosing an ILS

I published this article a few years ago in Computers in Libraries.  Nothing in it will be revelatory for most open source advocates, but at the time I got a lot of feedback from librarians that it was useful.  CiL’s exclusive publication window is long since past and it came to mind the other day, so I thought I would re-publish it here.  Four years later I still think it’s spot on, though I would probably either shorten it further or make it longer with practical examples.


Few decisions cause a library director to fret more than choosing a new integrated library system (ILS).  A new ILS is expensive in money, staff, time and stress no matter what you acquire.  Additionally, the wrong choice can bear costs in morale with lasting consequences.  Sometimes it is easy to identify which ILS is wrong for you – the contract costs are too high or maybe the features are not present that you need.  But, too often selecting the right one is like going to a car dealership where everyone speaks in tongues and the price lists are encrypted.

This is the result of a decade of market disruption.  Once upon a time proprietary ILS vendors were the only option.  Picking the right ILS was fraught with danger but not conceptually difficult.  Two changes in the market have had an enormous impact.  One, the growth of applications as services, has added new options to the ILS selection process.  However, it has been the growth of open source ILSes, such as Koha and Evergreen, that has made it necessary to rethink the selection process.

Choosing between an open source and a proprietary solution is not a choice between peaches and pineapples.  Frequently, it is assumed that the two types of ILSes cannot be evaluated by the same criteria.  In fact, they can be.  Although they result from radically different economic models and divergent philosophies, in the end both are products and services that can be defined by a library’s needs and resources for the purposes of acquisition.  Four major criteria must be compared: product cost, features, communities and support.  Until open source disrupted the ILS market one could safely ignore communities.  The community of a proprietary ILS product might have added value but it was unlikely to either make or break the selection of an ILS.  Now community plays a much more important role, but that will come after we look at the other criteria.

Perhaps the first thing to dispel is the myth that open source should be discussed as the cheap option. The wise library administrator will realize that while many of the best things in life are free, your ILS isn’t going to be one of them.  Your cost won’t always be in legal currency.  I have met staff so traumatized by a bad migration that they are still visibly shaken years later by what is now a stable and reliable tool.  The library has paid an ongoing price in terms of post traumatic stress and that cost can be too high.  The best migration is pointless if the library’s experience falls apart within a year or two causing the whole thing to happen again – an experience I’ve seen happen with both open source and proprietary.

Each of the four criteria can have multiple metrics as well as multiple vectors to plot them on, for both a migration and ongoing support.  In the end a data set for ILS selection should probably look more like a scatter chart than a report card.  Now, can we simplify the process?  The answer is yes.  A detailed consideration would be worthy of its own book, but by taking a few conservative shortcuts we can sketch a road map for your selection process in a period of time that isn’t comparable to earning another master’s degree.  For example, we will assume that you have the same vendor handle a migration as ongoing support.  This will not be a road map for the adventurous.  This is for those whose boards will require that all core functions work on the day of go live with minimal surprises.  And being conservative does not exclude you from an open source solution.

First, do a needs assessment.  This is the point at which many upgrade processes fail.  Rather than say something like “we need acquisitions” or “we need EDI” do use cases and narratives.  These should create an unambiguous picture of your needs.  Be careful to not attempt to recreate your existing ILS.  This is the point at which libraries realize how deeply embedded their current ILS is in their operations.  The documents you produce at this stage will be used extensively in working with vendors.  Be honest about what you need and what is merely on a wish list.

Now, find your vendors.  Don’t even worry about the ILS itself yet.  That may sound heretical in an ILS selection process but you need to safeguard from fixating on a single product and not evaluating honestly.  Fixate on your needs instead. Some vendors will support multiple ILSes and at this stage you are looking at who can provide support during a migration and ongoing.  Look at each vendor’s ability to support hardware, provide reliable access, and expertise with the ILS, their ability to find solutions, their training resources and expertise at setting a system up.  Do yourself a favor and look in depth at their experience with data migration – it is surprisingly hard to do well.  Do not let a vendor make vague promises about your data.    Looking at vendors before solutions may seem to be putting the cart before the horse but in the long run the greatest frustration most libraries have with an ILS doesn’t stem from software but support.  At this point you should also rank yourself as a vendor to see if you want to fill some of these support roles yourself.  Be honest about your ability to sustain support.  Many libraries begin projects that falter when key personnel leave because the skills are not part of the institution.

Many open source advocates argue that support is an inherent advantage of open source.  Some libraries delay leaving an ILS whose support they are unsatisfied with because of the stresses of migration.  If you use a vendor for support of an open source ILS, they cannot lock you into the ILS itself.  Once your contract is up, if you leave a support vendor and extract your data, you can import rather than migrate your data into a new system.  That ease of changing support vendors without changing software means that open source support companies have to compete on the basis of support, because the threshold of difficulty for the library leaving is reduced by orders of magnitude.

The next step is to define what kind of support contract you want.  Do you want a local install with minimal support, local with remote administration, or perhaps an application as a service, where you sign a check and everything is just made to happen?  At this point evaluating yourself as a potential vendor will help you determine whether you want to exclude yourself.  A product supported fully by a reputable vendor with skilled support staff is what you’re looking for.  Increasingly, the choice most libraries make is buying an application as a service.  Vendors can take advantage of high capacity Internet connections and big virtualization systems to achieve economies of scale and offer remote hosted ILS services much more cheaply than a library can locally.  But you may have factors, such as needed response times, which make a local installation more attractive.  Knowing what kind of support contract you need, you can begin looking at the packages offered by the vendors and dramatically simplify the rest of the process.

Next, make two lists to look at support and features separately.  Vendors need their feet put to the fire to answer if they can fill your needs, which is why the use cases and narratives are critical.  Find out what the vendors’ uptime guarantees are, what their response times are and what tiers of support they offer.  For example, do they handle user interface level troubleshooting, will they do custom development to solve issues, or do they simply do systems administration?  What services do they offer during the migration?  Can they extract your old data?  Can they offer project management or training?  Will they offer documentation?  Now, some of these resources may originate in part or whole from a community but at this point worry about the availability through the vendor and their obligation to you to make it happen.  List the support levels of each vendor.  Go back to Buying An ILS 101, call references and do every other thing you would do with any big-ticket purchase.  

Parallel to support, review the ILSes themselves and isolate what software will be viable for you.  Make sure the vendors support those features and how you want to use them.  I’ve had clients spend a lot of time preparing to move to a system only to have their vendor say, “we don’t support serials,” even though the ILS has the functionality.  Return to those use cases and narratives your staff developed earlier.  While sharing the use cases, get a detailed analysis of what your narrative experience using the ILS(es) will be.  Have them build a comparably scaled system (numbers of patron, bib and copy records, etc.) for you to test against; this is critical for applications as services.  More than one library has been burned by not seeing their data run at scale and not doing hands on features testing.

Think about future features too.  Will there be things you can’t anticipate or live without?  Will the new social network that everyone gushes over be critical two years from now?  Can the company you are working with provide you with development options?  If not, then open source may provide you with other kinds of development paths depending upon the community surrounding it.  What about wish list features?  Maybe like William Henley you want to be the captain of your own soul, or at least ILS.  Ask yourself if you want to make changes to the software and control those changes in the future.  If the answer is a firm yes, then you probably want an open source ILS and will need to allocate resources for development.  Don’t automatically discount a proprietary vendor but giving you that control is not usually a part of their business model.

It is also worth asking if you want to be part of a consortium.  Although really large resource sharing consortiums aren’t unique to open source, they do seem to be more common with the growth of the Evergreen community, as with SCLENDS.  Materials sharing may or may not be on your agenda, but adding to an existing installation has a lot of advantages, including a built in local community to draw on.

Since applications as services are delivered over Internet connections it is important to know the impact they will have on your connection.  Prolonged profiling will tell you when you may have interruptions in service and what delays in response time you may have.  Map obscure phrases like “ping times” and “drop rates” to real measurements like “it will take 2 seconds to check out an item.”  Often, workflows can be adjusted to handle the increased latency of moving an ILS from inside your network to remote hosting, but an unexpected impact like that can heavily damage morale.  This is a time to bring in heavy-duty network expertise and make sure they go over issues with a fine-tooth comb.
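For a first-pass translation before the network experts arrive, a few lines of scripting can turn raw round-trip times into a staff-facing estimate. This is a minimal sketch under stated assumptions: the host name is hypothetical, and the number of server round trips per checkout is an illustrative guess, not a figure from any particular ILS — profile your own client to find the real number.

```python
import socket
import time

def measure_latency(host, port=443, samples=10):
    """Average seconds per round trip, using TCP connection setup as a proxy."""
    times = []
    for _ in range(samples):
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=5):
            pass  # connect and immediately close; we only want the timing
        times.append(time.monotonic() - start)
    return sum(times) / len(times)

def estimate_checkout_seconds(latency, round_trips=15):
    """Translate latency into staff terms.

    round_trips is an assumption: count how many server round trips one
    checkout actually makes in your client before trusting this number."""
    return latency * round_trips

# Example (hostname is hypothetical):
# latency = measure_latency("ils.example.org")
# print(f"~{estimate_checkout_seconds(latency):.1f}s per checkout from network alone")
```

Even a rough number like this makes the conversation with staff concrete: "checkouts will feel half a second slower" is actionable in a way that "ping is 40 ms" is not.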

Finally, we get to every library’s least favorite topic that isn’t protected by confidentiality laws: budgets.  Take those support options and the ILSes by the vendors you find acceptable and map them against how much you have to spend.  Any that you can’t afford, toss.  What you’re left with is ILSes that will work for you, companies you can trust to support you and an experience you can afford.  Be wary of rushing into support contracts for applications as services though.  Compare your costs across the lifespan of the longest contract you would have to sign – which should be three years.  A longer contract than that which locks you in should be a concern.  Make sure there are guarantees about maximum rate increases and reasonable rates for extracting data. 
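The lifespan comparison above is simple arithmetic, but writing it down explicitly keeps a permitted rate increase from being forgotten. A sketch, with entirely hypothetical figures:

```python
def three_year_cost(migration_fee, annual_support, annual_increase_pct=0.0):
    """Total cost over a three-year contract, compounding any allowed rate increase."""
    total = migration_fee
    support = annual_support
    for _ in range(3):
        total += support
        support *= 1 + annual_increase_pct  # apply the capped increase for next year
    return total

# Hypothetical bids: a higher migration fee with capped increases versus
# a cheaper migration with higher support and a looser cap.
vendor_a = three_year_cost(40_000, 20_000, annual_increase_pct=0.03)
vendor_b = three_year_cost(25_000, 24_000, annual_increase_pct=0.05)
```

The point is not the specific numbers but that the comparison covers the whole contract term, including increases, rather than just the first-year invoice.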

Many an ILS has been chosen because the library administrator feels overwhelmed.  An implementation by the current ILS vendor can seem like an easy and safe choice. That is a poor assumption to make.  In the course of development or corporate acquisitions sometimes the upgrade path defined by a vendor is actually to a whole new product.  When that happens an upgrade is really a migration.  So, don’t be deceived by the potential level of difficulty of the project.  Vendors like to define upgrade paths because they know many local governments provide clauses that allow organizations to upgrade without going through a competitive bid process.  That’s also how you can get stuck doing two migrations in two years – something no one wants to do.

At this point you may be ready to select an ILS but you should take one more step.  So far we have flattened out the modern twists to ILS selection and used a model built on common sense.  The next criterion is not a leap into uncommon sense but it is much harder to define.  Evaluating community requires the administrator to understand how their staff as professionals will interact with a larger community rather than perform workflows.  Community is not an open source specific criterion, though it might be more central to those ILSes.  The communities of proprietary ILSes can be hampered or facilitated by the corporation linked to the ILS.  Open source ILSes are built by their communities but may still have large corporate presences.  When evaluating those dynamics don’t frame the discussion as business versus community, as that’s a false comparison.  Evaluate the businesses as members of the community by their actions and consider that when developing a picture of the whole community.

As you investigate vendors, how they interact with communities might tell you something about the character of the company.  Does it allow for independent user groups and conferences?  Are there email lists and public forums?  Are there places to share and ask questions of others who use the ILS?  Some proprietary ILS vendors have encouraged these things and allowed outside repositories for documents.  In open source communities these are the norm.  Do not underestimate the value of community.  Not only do ILS communities help you make the most of one of the largest pieces of your infrastructure, but an active, engaged community can be invaluable for the professional development of your staff.

Look at your resources and ask if you are the kind of organization that is ready to be part of a larger community or if you prefer to play alone on your own ball field.  Sometimes it is the larger libraries that are less prepared to be strong community members because they are accustomed to making decisions as an independent entity.   In the end you may choose to leave community out of your considerations for an ILS selection. However, at least some awareness of the larger community should always be there, to compare experiences with a vendor at the very least.  If you are using an open source ILS vendor and you successfully vetted them they should be involved in the community and may be a gateway to you becoming involved in the future if your priorities change.

At this point you have decided on the viability of a given ILS migration and looked at communities as added value.  Do you need a tiebreaker?  If you do, look at what your gut tells you.  The truth is that for all of our development as a species we still sometimes process information subconsciously and have gut instincts that lead us well. Do you have a philosophical leaning towards open source?  Does one vendor click as a partner?
 
If you are willing to endure some hardships you can play fast and loose with this process.  Risks can pay off, but that’s a luxury most boards don’t give their directors.  Open source succeeds where it is the best solution, not because of philosophical biases, just as commercial software succeeds on quality, not by spreading fear, uncertainty and doubt about the competition.  As we look critically at these solutions, their vendors and communities, we also have to look to the future.  Mark Twain said a sure way to look a fool is to try to predict the future.  But we need a sense of how the future of these ILSes will unfold since we will be tied to one once we make that selection.  Communities and companies can be filled with amazing people who can make all the difference, and they can fall apart.  Engagement with partners in companies and communities is where we will see the future unfolding.  We need to remain aware of these dynamics – they are often why we end up moving to a new ILS after all.

Evergreen Conference 2015

Another annual conference has come and gone.  2015’s seemed short.  It wasn’t, in fact, but the time seemed to go quickly.  I admit I’m not as good a traveler as I once was.  Part is due to age, and part is routine.  I always wake up at the same time of day.  Even daylight saving time causes me difficulty adjusting.  Flying to a different coast and telling my body to adjust three hours is nearly impossible.  Then five days later I do it in reverse.  In between I’ve run nonstop for days on end.  I interact with people and burn every ounce of introvert energy I have.  I also run on little sleep, staying up late and getting up early.  And it’s worth it.  By the time I drag home I feel a bit like Toshiro Mifune in Yojimbo, crawling exhausted and battered under the house hoping to just get away from everything and recuperate.

But, I do it each year because it’s worth it.  This year I taught a half day SQL workshop, I served on a panel welcoming new folks to the Evergreen community and I did a presentation on extending data sources in the reporter.  You can see all that from the conference schedule.  But it’s far more than that.  What I gain isn’t entries on a vita.  Even a few years ago attendees of my SQL and Reporter workshops would have been staff in very tech oriented roles.  This year the sessions were full, with 20+ people in each, and they were librarians.  Yes, tech curious, but by no means systems administrators – traditional librarians who want to dig deeper and deeper into the power that Evergreen can provide.  I like to think I helped make some materials more accessible to them, and the fact that this new power user class is growing in the community is a wonderful thing.  That additional depth and breadth makes for a healthier community.  It means that the idea of a tech curious librarian is increasingly irrelevant.  Every year that I use a phrase like that it sounds sillier and sillier and I’m happy for that.

And to paraphrase Billy Shakespeare, the community is the thing.  Attending the reporting interest group I talked about the need for new core reporting features with staff from all over the country (and world), about the needs of existing and new libraries.  I think we need to bring core reports back into Evergreen as something that is expanded and tested with each version, and it’s something I hope to work on this year, starting with going through the ones that were developed for 1.6 and updating them.  I talked with folks from Indiana about homebound services and something vaguely (but not quite) like plans were made.  But talking is a starting point.  I talked about philosophies and their practical import.

Many bad jokes were made (by me) and a few good ones (not by me).  We compared war stories, planned for the future, discussed what ifs and shared discussions about the meaning of life, or at least governmental ethical obligations and spending regulations.  Talking to other consortiums is always illuminating.  So is playing board games late into the night (I won Stone Age but it was a bit unfair since most of the others hadn’t played it before).

I left the Oversight Board having completed a three year tour of duty.  Several folks expressed surprise at my tour ending.  Three years go quickly.  A few also asked why I didn’t run again.  The truth is that we instituted the format of rotating members off the board so that it wouldn’t become stagnant.  Our community is large and diverse.  I want to let new voices in.  I may run again in a year or two.  I may not be able to vote but I’m still around, and I promised to come sit in on meetings when time allows.  I also agreed to remain on the merchandising committee and to assist the board with some special issues if they come up.

I’m also making some changes to the Hack-A-Way.  Submissions are now open for 2015 and will remain open until June 19th.  However, we are moving to an annual model for the Hack-A-Way.  With it now in its fourth year it’s become an institution.  As the kickstarter of it I still think of it as a scrappy little thing that has to prove itself, so seeing folks planning far in advance and competing to host it surprises me.  But, it shouldn’t.  I myself have pointed out the good work that has come out of it each year.  So, a year wraps up and I head home to recuperate.

I had to leave before the developer update was done but I know the gist already.  The new staff client looks amazing.  I would be tempted to say that we should do a second (unusual) upgrade in 2015 but with so many other projects on our plate it’s probably not in our stars.  And maybe it’s best to just go over to the new staff client all at once anyway.  The new infrastructure also opens a lot of new doors I think.  But all that is left behind as I fly back to the east coast and just worry about getting gate to gate.

Today I returned to work, jet lagged and exhausted.  But in a way the conference lingers; its effects reverberate in strange frequencies and conversations will continue on in IRC and by email for weeks and months to come.  Really, we think of the conference as a distinct moment in time but it’s more the peak of a sine wave that goes on and on.

Hack-A-Way 2014 Wrapup and Photos

I’ve uploaded my photos from the Hack-A-Way to a gallery on my site and (more importantly) the Evergreen community Flickr account.  See them along with somewhat lame commentary here:

https://www.flickr.com/photos/evergreen-ils/

You’ll notice in the photos a lot of people quietly typing.  There was discussion, but the nature of a hackfest is a lot of collaboration and coding.  And this hackfest had coding, tutorials, documentation and more.  Indeed, many people came in with things to talk about, things to resolve, things to learn and things to work on together.  It was great.  One person said they wished we could do this several more times a year.  That’s probably not practical, but the fact that it gave that feeling of being useful made me feel good.

It was a good but exhausting week.  I started with picking up materials the Saturday before and it just went on from there.  This isn’t to say that I did it all.  Other staff at the York County Library were critical to pulling this together, and although their roles were sometimes invisible to participants, trust me when I say that everyone appreciates each of their efforts immensely.  For me it wrapped up just a few hours ago, a week later, dropping off a few colleagues at the airport and writing this blog entry.

Several people commented on how productive it was and big progress was made on several fronts. 

http://wiki.evergreen-ils.org/doku.php?id=hack-a-way-2014

The Evergreen wiki page and linked collaborative Google Doc outline a bit of what happened.  I also tried to highlight some of the more offbeat moments on Twitter under the #egils and #hackaway14 hashtags.  Well, at least the PG rated events.  The exact language at a few points may not have been copied verbatim.  I think that would have raised it to PG-13 in one or two cases.  And I do regret not getting the beat boxing on video.

We didn’t fix the entire world’s (or even all of Evergreen’s) problems but we made progress.  We looked at Evergreen issues and compared issues with specific installations.  We talked about big picture issues that affect the future of the community.  We groused, we pontificated and just shared opinions.  And we ate BBQ. 

We talk about community in open source a lot but when we talk abstractly it’s about faceless sources of email and git commits.  Events like this, even more than the conferences, bring home how human that community is.  I’m lucky in that I like these humans.  I like spending time with them and like working with them but it still makes for a very long week. 

I learned a lot of new things this year that I hope to put into practice over the next year, and soon enough #hackaway15 will start its own planning process.

Hack-A-Way 2014 Day 1

I’ve been organizing the Hack-A-Way for three years, since it began, but this year it’s come to my own library in Rock Hill, SC.  SCLENDS has been active in the Evergreen community as much as our resources could allow from the beginning, and this is the first time we’ve had an Evergreen community event in South Carolina.  While I’ve been both happy and proud to host it myself this year, it also reminds me of how much effort past hosts (Equinox, Calvin College) put into it.  We’ve learned from it each year and it’s evolved.

While the Hack-A-Way was originally conceived of as a two day event with a “pre” day like a pre-conference, I think it’s time to simply call it a three day event with the acknowledgment that some people may arrive at various points during the first day.  I’ve also in the past not started looking for hosts until after the annual conference.  As it’s been a fairly low key event with a small group of technical members of the community, I didn’t see it as needing a lot of lead time.  Of course, the number of attendees has grown (though not dramatically) and the standards for hosting have been raised by the first hosts.  Now, I think I will start looking for hosts earlier, maybe as soon as this one is over.

We did a lot on the first day, alternating between group discussion and working together on small projects.  We attempted to extend our remote participation via jit.si but tomorrow will fall back on Google Hangouts.  Tragically, love for FLOSS projects sometimes has to bow to effectiveness.  And, as usual, we use IRC.  Some of the topics can be found on the Evergreen wiki at http://wiki.evergreen-ils.org/doku.php?id=hack-a-way-2014 where you can also find the working Google Doc we are taking notes in, though more happened than was captured there.

You can also follow along on Twitter using the hashtag #hackaway14

And now, the day, in brief, in pictures:

It turns out that a bunch of developers and Linux admins are the wrong people to troubleshoot Windows.  “Charms bar?!?  Is it really called that?” was said at one point.  I didn’t trust the wireless so I provided a gigabit switch with plenty of cables.  Do you trust the future of your ILS to these guys?  Let’s backport that, what could go wrong?

SQL for Librarians

Here it is, SQL for Librarians.  I closed out the Cambridge Evergreen Conference (for good or ill) and actually kept a few folks there until 12.  I had a lot of great comments so I think it was fairly successful despite my being a tad loopy from allergy medication.  And I blame the medication for a few things that made me cringe upon listening to the recording.  In a perfect world I’d love to do this again in a full workshop format.

Slides: http://www.slideshare.net/roganhamby/sql-for-librarians

Youtube: https://www.youtube.com/watch?v=3Iz-HFiDq6E
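For a flavor of the kind of query the workshop walks librarians through, here is a minimal, self-contained sketch using Python’s sqlite3.  The table names are toy stand-ins loosely inspired by Evergreen’s actor.usr and action.circulation tables; the real Evergreen schema (PostgreSQL) has different columns and many more joins, so treat this purely as an illustration, not production Evergreen SQL.

```python
import sqlite3

# Build a tiny in-memory database with toy tables loosely modeled on
# Evergreen's actor.usr (patrons) and action.circulation (checkouts).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE actor_usr (id INTEGER PRIMARY KEY, usrname TEXT);
CREATE TABLE action_circulation (
    id INTEGER PRIMARY KEY,
    usr INTEGER REFERENCES actor_usr(id),
    xact_start TEXT
);
INSERT INTO actor_usr VALUES (1, 'mmouse'), (2, 'dduck');
INSERT INTO action_circulation VALUES
    (1, 1, '2015-05-01'), (2, 1, '2015-05-02'), (3, 2, '2015-05-03');
""")

# A classic first report: circulation counts per patron.  The LEFT JOIN
# ensures patrons with zero checkouts would still appear in the results.
cur.execute("""
    SELECT u.usrname, COUNT(c.id) AS circs
    FROM actor_usr u
    LEFT JOIN action_circulation c ON c.usr = u.id
    GROUP BY u.usrname
    ORDER BY circs DESC
""")
for row in cur.fetchall():
    print(row)
```

The join-then-aggregate pattern above is the core idea: once a librarian can join patrons to transactions and group the results, most everyday reports are variations on that theme.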

Hack-A-Way 2013 Day 2

I should have said this at the outset of yesterday’s post – Hack-A-Way 2013 was hosted by Calvin College and sponsored by Equinox Software.  I have no obligation to mention those in this forum but they both deserve the recognition (and far more).

Priority one for day two was finding out how to hack Hangouts so that my typing didn’t mute the microphone (which they couldn’t hear anyway since I was using an external microphone).  Some quick googling uncovered that this is a common complaint among people who use Hangouts for collaboration and that there was an undocumented tweak that only required minimal terminal comfort.  I’m still tempted to get a second laptop to make it easier to position the camera, though, and I’m definitely bringing the full tripod next time.  But, AV geekery behind me …

We started with reports on the work the day before.  

Ben Shum reported on work on the mobile catalog.  That group was the largest of the working groups and had laid the groundwork by deciding that it should have the full functionality of the TPAC; that was the goal.  The team worked on separate pieces and moved files into a collaborative branch on the working repository.  A lot of the work is CSS copied from work done by Indiana, as well as de-tabling interfaces and using DIVs.

Our table worked on a proof of concept for a web based staff client.  Bill Erickson had previously done a Dojo based patron search interface and checkout of uncataloged items as a proof of concept.  We worked on fleshing that out, discussing platforms for responsive design, what would be needed for baseline functionality (patron search, checkout, see items out, renewals) and then later bills.  This is less a demo at this point than a proof of concept, but one goal is to have something that might, in a very limited way and with some caveats, also help those suffering from staff client memory leaks by handling checkouts without the staff client.  It is also bringing up a lot of conceptual questions about the architecture of such a project.  A working directory and dev server are up.  Most of the work on this is being done by Bill and Jeff Godin with input from the rest of us.

Lebbeous Fogle-Weekley reported for the serials group.  They targeted some specific issues, including how to handle special one-off issues of an ongoing series, and discussed the future direction of serials work.  In fact they already pushed some of their work to master.  However, because of their narrower focus they are going to break up

Jason Stephenson worked on the new MARC export and has a working directory up.  The new script is more configurable.  Unfortunately I missed some of the conversation due to some issues back home I had to deal with, but apparently, in a nod to Dan Scott, MARC will now be MARQUE.

In evaluating the 2.5 release process we spent a lot of time discussing what was a mostly good process and the big challenges the release manager faced.  The community goal has been making more stable releases.  During this release Dan Wells added more structure, which was good; the milestones and the flagging of bugs were good too, but he also wanted feedback, which was really hard for the developers since they were very happy with his work.  But there are challenges, and solutions right now remain elusive.  Kathy Lussier addressed DIG concerns about documentation: ESI does a lot of the documentation work for new features, but work not done by them is often left undone.  We had 380 commits since 2.4, with the biggest committers being Dan Wells, Ben Shum and Mike Rylander.  Is that sustainable?  A rough guess is that those are half bugs and half features, which is an improvement over the past.  Do we need to loosen review requirements?  Do we do leaderboards as a psychological incentive?  There was concern that some would lower standards to increase numbers.  After a discussion that lasted longer than we had planned, the decision about selecting a 2.6 release manager was put off to let folks think about these issues more.

Discussion also wandered into QA and automated testing.  A lot of progress has been made here since the conference.  In regards to unit testing there was a consensus that while it’s a great idea it won’t have a significant impact for a while.  Right now the tests are so minimal that they don’t reflect what real data does in complex real world environments, and it will take time finding those issues and writing more tests before the work has its payoff.

Art.  Kinda looks like a grey alien to me.

I won’t try to re-capture all of the conversation, but maintaining quality and moving releases forward were discussed in great depth.  There was less interest in discussing 2.6 than in really trying to clean up and make sure 2.5 is solid.  The decision about who would be the 2.6 release manager was put off, and the idea of a leaderboard to encourage bug squashing was proposed.  A “whackin’ day” targeting bugs, like Koha does, was also floated.

I spent a lot of the day looking at some great instructions Yamil Suarez put together for installing OpenSRF and Evergreen on Debian for potential new users, and chatting with Jeff and Lebbeous about the need for beefing up the concerto data set with new serials and UPC records.  Other projects included looking at the web site, starting a conversation about users, merchandising, IRC quotes, and so on.

By the evening we had a nice dinner, and a group of us headed out to Founders for a drink and to walk around downtown Grand Rapids looking at ArtPrize installations, which were quite nice.

Evergreen Hack-A-Way 2013 Day 1

Note: this is not a comprehensive report, just my notes from my memory.

I’m writing this as I eat a waffle at breakfast on day 2.  Day 0 was Monday, when folks gathered up at the conference center for dinner, but it was Tuesday that things really started.  Starting at breakfast everyone was immediately in work mode.  Talk was heavily on the future of the staff client and the other big issues that we’ve all been waiting to see each other in person to hash out.  We wrapped up grub and headed as a group to the Calvin College library, which is kindly hosting us in its conference facilities.  Power was at every table and coffee soon appeared.  The wifi wasn’t perfect but we may have been pushing its limits.  And those are really the three critical needs of this crowd – power, wifi and caffeine.  I found myself in a bit of an AV geek role, hosting the Google Hangout and coordinating things with IRC a little (and I certainly wasn’t the only one multitasking back and forth, so remote folks were involved).

As we gathered (after laptops were set up) discussion immediately centered on the future of the staff client.  We discussed the issues with xulrunner.  Dan Scott noted that he had talked to Mozilla folks at a Google event and they were surprised at our use of xulrunner, noting that wasn’t its purpose.  Certainly newer versions cut off critical functionality, and memory leaks are an ongoing concern.  With all of this in mind everyone was firmly in favor of moving forward somewhere.

Ben, Kathy, Bill in the lobby Tuesday night.

As we discussed where to go from xulrunner and what to go to, the discussion was web based client, yea or nay.  Although there were participants with a preference for a local staff client (specifically Java based), the web based arguments won the day, and those who had preferred a local client were willing to support a web based one.  Discussion centered around using modern Dojo, and Chris Sharp worked in the afternoon on seeing how it would work with Evergreen 2.4.  Everyone was concerned about the practical issues of how we could implement a web based staff client in stages and get testing and engagement given the community’s limited resources.  The consensus was that we need to live with the staff client for a while but move away from xul within it, come up with draft standards for the new staff client, and, if we can move to modern Dojo, staff client interfaces might then be largely portable to a web based one.  We discussed concerns about how to handle things that can’t be done in a browser and whether they should be done with a small local app versus a plugin, along with best practices around offline mode, staff client registration and printing, various Windows OS concerns, and authentication, which is maybe the trickiest.  Offline with modern HTML5 was one of the lesser concerns.  Many words were also committed to how many and which browsers should be supported, and although no absolute final answer was given, most folks seemed to agree on the community supporting Chrome and Firefox, with individual Evergreen members possibly supporting others.

Collaborative notes were done by several parties and remote participation was good, both of which I was happy about.

After discussion we broke into groups looking at MARC export, web based staff client proof of concept, serials and mobile OPAC.  In between talking about the staff client I worked on some merchandising and web site issues for Evergreen (as well as handling some SCLENDS and York issues as they popped up).  

We worked until we were getting fairly punchy, then broke to freshen up and head to dinner.  I ended up at a great Thai place with good spicy food.  After that we did what I called the Hack-A-Way Lobby version with eight of us until I ran out of steam at 11:30.  Today is a new day.