
Evergreen Conference 2016 and After Thoughts

Each year I let some time pass after the Evergreen Conference before I try to capture my thoughts about it. Finding myself in a contemplative mood this evening, I finally decided to do it.

What should I write about? The NC Cardinal folks did a great job; running a conference is an insane amount of work and they tackled it well. There were a lot of great presentations. The hospitality staff running the meeting rooms at the Sheraton were wonderful. The Resistance was the game of the conference and I had a great time playing it. As a member of the response team I was heartened that I was unneeded. Honestly, I've been to far larger library events that could take the balanced, relaxed environment and professionalism of the Evergreen Conference as their model. I had a great time meeting new folks at breakfasts and dinners.

My SQL pre-conference workshop went well. One person told me that I really helped them with things they had struggled with. Another told me they used the notes from my workshop last year throughout the entire intervening year. Being told things like that makes all the work worth it.

My statistics-heavy presentation went well and I think I kept everyone awake, even though by the end I had created more questions than I had answered. I showed some clear relationships and the likelihood that holds data can become genuinely predictive if we can gather enough data sets to compare and account for the variables influencing holds. I think the data also clearly shows the value of sharing materials in a consortium. I have a dozen thoughts on this that will be their own blog post at some point.

The biggest thing that stands out thinking back on it, though, is the lack of surprises. In the early days of the Evergreen Conference I never quite felt like I knew what to expect. Enthusiasm and passion for Evergreen are as strong now as they were at the very first Evergreen Conference, but things have changed. In the early days of the conference we had presentations about things like "How We Made Evergreen Work For Us." I stood at the front of the room doing a few of those myself. Those are long gone. The experiences, and the presentations that reflect them, have, for lack of a better term, matured. So has Evergreen. So has the community.

We don’t have everything figured out but we’re not trying to figure out if we can manage the challenges either.

This weighs heavily on my thoughts because I saw an article that implied that open source software isn't as mature as proprietary solutions. The implicit assumptions were numerous and would take more time than I have here to deconstruct, but again, that might be a good future blog post or article.

Obviously, the perception of non-users of the software and non-members of the community doesn’t sync up with that of those who do use it and are members of the community.  I’m not saying my feelings are universal but upon talking to others I know they are widely shared.  So, why?

I believe Evergreen falls into a common pattern of maturing technologies. Indeed, open source itself does. Open source is a development methodology, but it's also a shared platform of technologies that build upon each other in a chaotic way more akin to natural selection than design. Why can't people who see the adoption and maturation patterns of something like DVD players see that it isn't that different for software? I don't know. Much like my consortial data presentation, I feel like I'm leaving this having created more questions than I've answered, but maybe that's a good sign that I'm on the right path.


Sound and Fury: Choosing an ILS

I published this article a few years ago in Computers in Libraries. Nothing in it will be revelatory for most open source advocates, but at the time I got a lot of feedback from librarians that it was useful. CiL's exclusive publication window is long since past and the article came to mind the other day, so I thought I would re-publish it here. Four years later I still think it's spot on, though I would probably make some changes to either shorten it further or lengthen it with practical examples.


Few decisions cause a library director to fret more than choosing a new integrated library system (ILS). A new ILS is expensive in money, staff, time and stress no matter what you acquire. Additionally, the wrong choice can bear costs in morale with lasting consequences. Sometimes it is easy to identify which ILS is wrong for you – the contract costs are too high or the features you need are not present. But too often selecting the right one is like going to a car dealership where everyone speaks in tongues and the price lists are encrypted.

This is the result of a decade of market disruption. Once upon a time proprietary ILS vendors were the only option. Picking the right ILS was fraught with danger but not conceptually difficult. Two changes in the market have had an enormous impact. One of these, the growth of applications as services, has added new options to the ILS selection process. However, it has been the growth of open source ILSes, such as Koha and Evergreen, that has made it necessary to rethink the selection process.

Choosing between an open source and a proprietary solution is not a choice between peaches and pineapples. Frequently, it is assumed that the two types of ILSes cannot be evaluated by the same criteria. In fact, they can be. Although they result from radically different economic models and divergent philosophies, in the end both are products and services that can be defined by a library's needs and resources for the purposes of acquisition. Four major criteria must be compared – product cost, features, communities and support. Until open source disrupted the ILS market one could safely ignore communities. The community around a proprietary ILS product might have added value, but it was unlikely to either make or break the selection of an ILS. Now community plays a much more important role, but that will come after we look at the other criteria.

Perhaps the first thing to dispel is the myth that open source should be discussed as the cheap option. The wise library administrator will realize that while many of the best things in life are free, your ILS isn't going to be one of them. Your cost won't always be in legal currency. I have met staff so traumatized by a bad migration that they are still visibly shaken years later by what is now a stable and reliable tool. The library has paid an ongoing price in post-traumatic stress, and that cost can be too high. The best migration is pointless if the library's experience falls apart within a year or two, forcing the whole thing to happen again – an experience I've seen with both open source and proprietary systems.

Each of the four criteria can have multiple metrics as well as multiple vectors to plot them on, for both a migration and ongoing support. In the end a data set for ILS selection should probably look more like a scatter chart than a report card. Now, can we simplify the process? The answer is yes. A detailed consideration would be worthy of its own book, but by taking a few conservative shortcuts we can sketch a road map for your selection process in a period of time that isn't comparable to earning another master's degree. For example, we will assume that you have the same vendor handle the migration and ongoing support. This will not be a road map for the adventurous. This is for those whose boards will require that all core functions work on the day of go-live with minimal surprises. And being conservative does not exclude you from an open source solution.

First, do a needs assessment. This is the point at which many upgrade processes fail. Rather than saying something like "we need acquisitions" or "we need EDI," write use cases and narratives. These should create an unambiguous picture of your needs. Be careful not to attempt to recreate your existing ILS. This is the point at which libraries realize how deeply embedded their current ILS is in their operations. The documents you produce at this stage will be used extensively in working with vendors. Be honest about what you need and what is merely on a wish list.

Now, find your vendors. Don't even worry about the ILS itself yet. That may sound heretical in an ILS selection process, but you need to safeguard against fixating on a single product and not evaluating honestly. Fixate on your needs instead. Some vendors will support multiple ILSes, and at this stage you are looking at who can provide support during a migration and on an ongoing basis. Look at each vendor's ability to support hardware and provide reliable access, their expertise with the ILS, their ability to find solutions, their training resources and their skill at setting a system up. Do yourself a favor and look in depth at their experience with data migration – it is surprisingly hard to do well. Do not let a vendor make vague promises about your data. Looking at vendors before solutions may seem to be putting the cart before the horse, but in the long run the greatest frustration most libraries have with an ILS doesn't stem from software but from support. At this point you should also rank yourself as a vendor to see if you want to fill some of these support roles yourself. Be honest about your ability to sustain support. Many libraries begin projects that falter when key personnel leave because the skills are not part of the institution.

Many open source advocates argue that support is an inherent advantage of open source. Some libraries delay leaving an ILS whose support they are unsatisfied with because of the stresses of migration. If you use a vendor for support of an open source ILS, they cannot lock you into the ILS itself. Once your contract is up, if you leave a support vendor and extract your data, you can import rather than migrate your data into a new system. That ease of changing support vendors without changing software means that open source support companies have to compete on the basis of support, because the threshold of difficulty for the library leaving is reduced by orders of magnitude.
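
To make "extract your data" concrete: with an open source ILS the database itself is open to you. Below is a minimal sketch, assuming a PostgreSQL-backed system like Evergreen and its stock patron table (actor.usr) – adjust the names for your own installation – of dumping basic patron fields to a CSV file that another system could import.

    -- Export basic patron fields to CSV for import elsewhere.
    -- Table and column names assume Evergreen's stock schema.
    COPY (
        SELECT id, usrname, first_given_name, family_name, email
        FROM   actor.usr
        WHERE  NOT deleted
    ) TO '/tmp/patrons.csv' WITH (FORMAT csv, HEADER true);

Server-side COPY requires database superuser rights; from psql you can use \copy instead to write the file on the client side.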

The next step is to define what kind of support contract you want. Do you want a local install with minimal support, a local install with remote administration, or perhaps an application as a service where you sign a check and everything is just made to happen? At this point evaluating yourself as a potential vendor will help you determine if you want to exclude yourself. A product supported fully by a reputable vendor with skilled support staff is what you're looking for. Increasingly, the choice most libraries make is buying an application as a service. Vendors can take advantage of high-capacity Internet connections and big virtualization systems to achieve economies of scale and offer remotely hosted ILS services much more cheaply than a library can offer them locally. But you may have factors, such as the response times you need, which make a local installation more attractive. Knowing what kind of support contract you need, you can begin looking at the packages offered by the vendors and dramatically simplify the rest of the process.

Next, make two lists to look at support and features separately. Vendors need their feet put to the fire to answer whether they can fill your needs, which is why the use cases and narratives are critical. Find out what the vendors' uptime guarantees are, what their response times are and what tiers of support they offer. For example, do they handle user interface level troubleshooting, will they do custom development to solve issues, or do they simply do systems administration? What services do they offer during the migration? Can they extract your old data? Can they offer project management or training? Will they offer documentation? Now, some of these resources may originate in part or in whole from a community, but at this point worry about their availability through the vendor and the vendor's obligation to you to make it happen. List the support levels of each vendor. Go back to Buying An ILS 101, call references and do every other thing you would do with any big-ticket purchase.

Parallel to support, review the ILSes themselves and isolate what software will be viable for you. Make sure the vendors support those features and the ways you want to use them. I've had clients spend a lot of time preparing to move to a system only to have their vendor say, "we don't support serials" even though the ILS has the functionality. Return to those use cases and narratives your staff developed earlier. While sharing the use cases, get a detailed analysis of what your experience using the ILS(es) will be. Ask whether they can build a comparably scaled system (similar numbers of patron, bib and copy records, etc.) for you to test against; this is critical for applications as services. More than one library has been burned by not seeing their data run at scale and not doing hands-on features testing.
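
Verifying comparable scale doesn't have to be complicated. Here is a rough sketch, again assuming Evergreen-style tables (actor.usr, biblio.record_entry, asset.copy), of the record counts to pull from your production system and from the vendor's test system; if they differ by an order of magnitude, you aren't really testing at scale.

    -- Rough scale check: run on production and on the test system, then compare.
    -- Table names assume Evergreen's stock schema; adjust for your ILS.
    SELECT (SELECT count(*) FROM actor.usr           WHERE NOT deleted) AS patrons,
           (SELECT count(*) FROM biblio.record_entry WHERE NOT deleted) AS bib_records,
           (SELECT count(*) FROM asset.copy          WHERE NOT deleted) AS copies;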

Think about future features too. Will there be things you can't anticipate now but won't be able to live without? Will the new social network that everyone gushes over be critical two years from now? Can the company you are working with provide you with development options? If not, then open source may provide you with other kinds of development paths, depending upon the community surrounding it. What about wish list features? Maybe, like William Ernest Henley, you want to be the captain of your own soul, or at least of your own ILS. Ask yourself if you want to make changes to the software and control those changes in the future. If the answer is a firm yes, then you probably want an open source ILS and will need to allocate resources for development. Don't automatically discount a proprietary vendor, but giving you that control is not usually a part of their business model.

It is also worth asking if you want to be part of a consortium. Although really large resource sharing consortiums aren't unique to open source, they do seem to be more common with the growth of the Evergreen community, with consortiums like SCLENDS. Materials sharing may or may not be on your agenda, but joining an existing installation has a lot of advantages, including a built-in local community to draw on.

Since applications as services are delivered over Internet connections, it is important to know the impact they will have on your connection. Prolonged profiling will tell you when you may have interruptions in service and what delays in response time you may have. Map obscure phrases like "ping times" and "drop rates" to real measurements like "it will take 2 seconds to check out an item." Often, workflows can be adjusted to handle the increased latency from moving an ILS from inside your network to remote hosting, but an unexpected impact like that can heavily damage morale. This is a time to bring in heavy-duty network expertise and make sure they go over issues with a fine-toothed comb.

Finally, we get to every library's least favorite topic that isn't protected by confidentiality laws: budgets. Take the support options and ILSes from the vendors you find acceptable and map them against how much you have to spend. Any you can't afford, toss. What you're left with is ILSes that will work for you, companies you can trust to support you and an experience you can afford. Be wary of rushing into support contracts for applications as services, though. Compare your costs across the lifespan of the longest contract you would have to sign – which should be three years. A contract longer than that which locks you in should be a concern. Make sure there are guarantees about maximum rate increases and reasonable rates for extracting data.

Many an ILS has been chosen because the library administrator feels overwhelmed. An implementation by the current ILS vendor can seem like an easy and safe choice. That is a poor assumption to make. In the course of development or corporate acquisitions sometimes the upgrade path defined by a vendor is actually to a whole new product. When that happens an upgrade is really a migration. So, don't be deceived by the potential level of difficulty of the project. Vendors like to define upgrade paths because they know many local governments provide clauses that allow organizations to upgrade without going through a competitive bid process. That's also how you can get stuck doing two migrations in two years – something no one wants to do.

At this point you may be ready to select an ILS, but you should take one more step. So far we have flattened out the modern twists to ILS selection and used a model built on common sense. The next criterion is not a leap into uncommon sense, but it is much harder to define. Evaluating community requires the administrator to understand how their staff, as professionals, will interact with a larger community rather than simply perform workflows. Community is not an open source-specific criterion, though it might be more central to those ILSes. The communities of proprietary ILSes can be hampered or facilitated by the corporation linked to the ILS. Open source ILSes are built by their communities but may still have large corporate presences. When evaluating those dynamics don't frame the discussion as business versus community, as that's a false comparison. Evaluate the businesses as members of the community by their actions and consider that when developing a picture of the whole community.

As you investigate vendors, how they interact with communities might tell you something about the character of the company. Does it allow for independent user groups and conferences? Are there email lists and public forums? Are there places to share and to ask questions of others who use the ILS? Some proprietary ILS vendors have encouraged these things and allowed outside repositories for documents. In open source communities these are the norm. Do not underestimate the value of community. Not only do ILS communities help you make the most of one of the largest pieces of your infrastructure, but an active, engaged community can be invaluable for the professional development of your staff.

Look at your resources and ask if you are the kind of organization that is ready to be part of a larger community or if you prefer to play alone on your own ball field. Sometimes it is the larger libraries that are less prepared to be strong community members because they are accustomed to making decisions as independent entities. In the end you may choose to leave community out of your considerations for an ILS selection. However, at least some awareness of the larger community should always be there, if only to compare experiences with a vendor. If you are using an open source ILS vendor and you successfully vetted them, they should be involved in the community and may be a gateway to you becoming involved in the future if your priorities change.

At this point you have decided on the viability of a given ILS migration and looked at communities as added value. Do you need a tiebreaker query? If you do, look at what your gut tells you. The truth is that for all of our development as a species we still sometimes process information subconsciously and have gut instincts that lead us well. Do you have a philosophical leaning towards open source? Does one vendor click as a partner?
 
If you are willing to endure some hardships you can play fast and loose with this process. Risks can pay off, but it's a luxury most boards don't give their directors. Open source succeeds where it is the best solution, not because of philosophical biases, just as commercial software succeeds on quality, not on spreading fear, uncertainty and doubt about the competition. As we look critically at these solutions, their vendors and communities, we also have to look to the future. Mark Twain said a sure way to look a fool is to try to predict the future. But we need a sense of how the future of these ILSes will unfold, since we will be tied to one once we make that selection. Communities and companies can be filled with amazing people who can make all the difference, and they can fall apart. Engagement with partners in companies and communities is where we will see the future unfolding. We need to remain aware of these dynamics – they are often why we end up moving to a new ILS after all.

Evergreen Conference 2015

Another annual conference has come and gone. 2015's seemed short. It wasn't, in fact, but the time seemed to go quickly. I admit I'm not as good a traveler as I once was. Part is due to age, and part is routine. I always wake up at the same time of day. Even daylight saving time causes me difficulty adjusting. Flying to a different coast and telling my body to adjust three hours is nearly impossible. Then five days later I do it in reverse. In between I've run nonstop for days on end. I interact with people and burn every ounce of introvert energy I have. I also run on little sleep, staying up late and getting up early. And it's worth it. By the time I drag home I feel a bit like Toshiro Mifune in Yojimbo, crawling exhausted and battered under the house hoping to just get away from everything and recuperate.

But, I do it each year because it's worth it. This year I taught a half-day SQL workshop, I served on a panel welcoming new folks to the Evergreen community and I did a presentation on extending data sources in the reporter. You can see all that from the conference schedule. But it's far more than that. What I gain isn't entries on a vita. Even a few years ago, attendees of my SQL and Reporter workshop would have been staff in very tech-oriented roles. This year the sessions were full, with 20+ people in each, and they were librarians. Yes, tech curious, but by no means systems administrators – traditional librarians who want to dig deeper and deeper into the power that Evergreen can provide. I like to think I helped make some materials more accessible to them, and the fact that this new power user class is growing in the community is a wonderful thing. That additional depth and breadth in the community is a healthy thing. It means that the idea of a tech curious librarian is increasingly irrelevant. Every year that I use a phrase like that it sounds sillier and sillier, and I'm happy for that.

And to paraphrase Billy Shakespeare, the community is the thing. Attending the reporting interest group, I talked with staff from all over the country (and world) about the need for new core reporting features and about the needs of existing and new libraries. I think we need to bring core reports back into Evergreen as something that is expanded and tested with each version, and it's something I hope to work on this year, starting with going through the ones that were developed for 1.6 and updating them. I talked with folks from Indiana about homebound services and something vaguely (but not quite) like plans were made. But talking is a starting point. I talked about philosophies and their practical import.

Many bad jokes were made (by me) and a few good ones (not by me).  We compared war stories, planned for the future, discussed what ifs and shared discussions about the meaning of life, or at least governmental ethical obligations and spending regulations.  Talking to other consortiums is always illuminating.  So is playing board games late into the night (I won Stone Age but it was a bit unfair since most of the others hadn’t played it before).

I left the Oversight Board after three years, having completed my tour of duty. Several folks expressed surprise at my tour ending. Three years go quickly. A few also asked why I didn't run again. The truth is that we instituted the format of rotating members off the board so that it wouldn't become stagnant. Our community is large and diverse. I want to let new voices in. I may run again in a year or two. I may not be able to vote, but I'm still around and I promised to come in and sit in on meetings when time allows. I also agreed to remain on the merchandising committee and to assist the board with some special issues if they come up.

I'm also making some changes to the Hack-A-Way. Submissions are now open for 2015 and will remain open until June 19th. However, we are moving to an annual model for the Hack-A-Way. With it now in its fourth year, it's become an institution. As its kickstarter I still think of it as a scrappy little thing that has to prove itself, so seeing folks planning far in advance and competing to host it surprises me. But it shouldn't. I myself have pointed out the good work that has come out of it each year. So, a year wraps up and I head home to recuperate.

I had to leave before the developer update was done but I know the gist already.  The new staff client looks amazing.  I would be tempted to say that we should do a second (unusual) upgrade in 2015 but with so many other projects on our plate it’s probably not in our stars.  And maybe it’s best to just go over to the new staff client all at once anyway.  The new infrastructure also opens a lot of new doors I think.  But all that is left behind as I fly back to the east coast and just worry about getting gate to gate.

Today I returned to work, jet-lagged and exhausted. But in a way the conference lingers; its effects reverberate in strange frequencies and conversations will continue on in IRC and by email for weeks and months to come. Really, we think of the conference as a distinct moment in time, but it's more like the peak of a sine wave that goes on and on.

Hack-A-Way 2014 Wrapup and Photos

I’ve uploaded my photos from the Hack-A-Way to a gallery on my site and (more importantly) the Evergreen community Flickr account.  See them along with somewhat lame commentary here:

https://www.flickr.com/photos/evergreen-ils/

You'll notice in the photos a lot of people quietly typing. There was discussion, but the nature of a hackfest is a lot of collaboration and coding. And this hackfest had coding, tutorials, documentation and more. Indeed, many people came in with things to talk about, things to resolve, things to learn and things to work on together. It was great. One person said they wished we could do this several more times a year. That's probably not practical, but the fact that it gave that feeling of being useful made me feel good.

It was a good but exhausting week. I started with picking up materials the Saturday before and it just went on from there. This isn't to say that I did it all. Other staff at the York County Library were critical to pulling this together, and although their roles were sometimes invisible to participants, trust me when I say that everyone appreciates their efforts immensely. For me it wrapped up just a few hours ago, a week later, dropping off a few colleagues at the airport and doing this blog entry.

Several people commented on how productive it was and big progress was made on several fronts. 

http://wiki.evergreen-ils.org/doku.php?id=hack-a-way-2014

The Evergreen wiki page and linked collaborative Google Doc outline a bit of what happened. I also tried to highlight some of the more offbeat moments on Twitter under the #egils and #hackaway14 hashtags. Well, at least the PG-rated events. The exact language at a few points may not have been copied verbatim. I think that would have raised it to PG-13 in one or two cases. And I do regret not getting the beat boxing on video.

We didn’t fix the entire world’s (or even all of Evergreen’s) problems but we made progress.  We looked at Evergreen issues and compared issues with specific installations.  We talked about big picture issues that affect the future of the community.  We groused, we pontificated and just shared opinions.  And we ate BBQ. 

We talk about community in open source a lot but when we talk abstractly it’s about faceless sources of email and git commits.  Events like this, even more than the conferences, bring home how human that community is.  I’m lucky in that I like these humans.  I like spending time with them and like working with them but it still makes for a very long week. 

I learned a lot of new things this year that I hope to put into practice over the next year, and soon enough #hackaway15 will start its own planning process.

 

 

Hack-A-Way 2014 Day 1

I've been organizing the Hack-A-Way for three years, since it began, but this year it came to my own library in Rock Hill, SC. SCLENDS has been active in the Evergreen community from the beginning, as much as our resources could allow, and this is the first time we've had an Evergreen community event in South Carolina. While I've been both happy and proud to host it myself this year, it also reminds me of how much effort past hosts (Equinox, Calvin College) put into it. We've learned from it each year and it's evolved.

While the Hack-A-Way was originally conceived of as a two-day event with a "pre" day like a pre-conference, I think it's time to simply call it a three-day event with the acknowledgment that some people may arrive at various points during the first day. I've also in the past not started looking for hosts until after the annual conference. As it's been a fairly low-key event with a small group of technical members of the community, I didn't see it as needing a lot of lead time. Of course, the number of attendees has grown (though not dramatically) and the standards for hosting have been raised by the first hosts. Now I think I will start looking for hosts earlier, maybe as soon as this one is over.

We did a lot on the first day, alternating between group discussion and working together on small projects. We attempted to extend our remote participation via jit.si but tomorrow will fall back on Google Hangouts. Tragically, love for FLOSS projects sometimes has to bow to effectiveness. And, as usual, we use IRC. Some of the topics can be found on the Evergreen wiki at http://wiki.evergreen-ils.org/doku.php?id=hack-a-way-2014 where you can also find the working Google Doc we are taking notes in, though more happened than was captured there.

You can also follow along on Twitter using the hashtag #hackaway14

And now, the day, in brief, in pictures,

It turns out that a bunch of developers and Linux admins are the wrong people to troubleshoot Windows. "Charms bar?!? Is it really called that?" was said at one point.
I didn't trust the wireless so I provided a gigabit switch with plenty of cables.
Do you trust the future of your ILS to these guys?
Let's backport that, what could go wrong?

 


SQL for Librarians

Here it is, SQL for Librarians. I closed out the Cambridge Evergreen Conference (for good or ill) and actually kept a few folks there until 12. I had a lot of great comments, so I think it was fairly successful despite my being a tad loopy from allergy medication. And I blame the medication for a few things that I cringed at upon listening to this again. In a perfect world I'd love to do this again and do it with a full workshop format.

Slides: http://www.slideshare.net/roganhamby/sql-for-librarians

Youtube: https://www.youtube.com/watch?v=3Iz-HFiDq6E
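
If you want a taste of what the session builds toward without watching the whole video, the query below is the general flavor – a minimal sketch, assuming Evergreen's stock circulation tables (action.circulation and actor.org_unit); adjust the names for your own installation.

    -- Checkouts per library per month: a typical first "real" report query.
    -- Table and column names assume Evergreen's stock schema.
    SELECT ou.shortname                               AS library,
           date_trunc('month', circ.xact_start)::date AS month,
           count(*)                                   AS checkouts
    FROM   action.circulation circ
    JOIN   actor.org_unit ou ON ou.id = circ.circ_lib
    GROUP  BY ou.shortname, date_trunc('month', circ.xact_start)
    ORDER  BY library, month;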

Conversations At ALA About ILSes

Normally I save my rabid pro-FLOSS, pro-Evergreen attitude for the web. In person I make a conscious effort to not be so forward, as it's usually a hindrance to meaningful conversations. Today at ALA in Vegas I threw that rule out the window.

I didn’t do it right away but eventually I was worn down.  Worn down by what you ask?  Since this morning, I’ve had six conversations with people bitching about their ILSes.  And their complaints were legitimate.

“I went to Blue ILS years ago and it was great but the founders left and they’re now evil corporate sociopaths who abuse us regularly.”

“I was about to go to Red ILS which is great with great support but they just got bought out by evil sociopaths and I don’t feel good about this anymore.”

Valid concerns. What annoyed me was the fatalism. "Whatcha gonna do?" Go open source. There, I solved it for you. I told the last one that in those terms. I usually say it anyway, but with more respect for the difficulties their situations present. But I'm tired of having those issues used as excuses for why libraries should allow themselves to be abused. The difficulties make things non-trivial, maybe even hard, but not impossible. And it is the answer. If you're not being abused, and you just wish things were better but are willing to live with them because you have higher priorities, then that's fine. But if your voice sounds like you're beaten regularly when you talk about your ILS vendor … yeah, you need an intervention.

So, how do you do this?

Well, you could host yourself in which case you only have to trust yourself.  But, that may not be efficient.  I use hosting from Equinox Software.  My hosting and support provider has the advantage of expertise from hosting many installs and economies of scale.  Why do I use Equinox? Because I trust them.  Why do I use Evergreen and open source?  Because I don’t have to trust them tomorrow.

Implicit in the complaints is lock-in – whoever they go with for support owns the software. Changing support means changing software, which is a huge deal. But when no one owns it, everything is different. My contract allows free access to my data. If the leadership changed at Equinox I would just change service providers. My users won't know.

And yes, that’s why open source is the answer.  There’s no reason for anyone to ask me if I’m happy with my ILS support because if I wasn’t, I’d just change it, year to year if I had to.  And that’s a very good thing for my library.  

Sharing is Good Business

I was thinking today about intellectual property in the library world. Specifically, what prompted my musings was Elon Musk's blog post last week about patents and Tesla Motors.

http://www.teslamotors.com/blog/all-our-patent-are-belong-you

Already the global news cycle has come and gone on it.  But, I think it’s interesting to think about the obligations of those interested in shaping social change and how intellectual property plays a part in that.  Next week I will be at ALA and one of my favorite parts of a large event like ALA is looking at the vendor floor filled with businesses eager for my library to accept an invoice.  But the question is, are they people I want to do business with?

Let me back up a bit.

Elon Musk has stated that his goal isn’t merely to build successful businesses but to push the world forward.  And he’s now realized it has to inform how he as a capitalist interacts with and shares with others.  He wants to help create a common baseline any manufacturer could build a vehicle off of.  Should all businesses have an obligation to help move the world forward rather than just their profit margins?  Is there a possibility of one day developing a core set of freely shared technologies that anyone could build an ILS from?  Oh, wait a minute, that’s already being done …

Stepping back a bit more, the industrial age was dominated by the development of technologies that allowed goods to be made in greater precision and volume than ever before. Often the knowledge of how to do this was freely stolen. Note, I don't say it was shared, but once a competitor acquired the knowledge of how to do something there was little going back. And I'm not saying that this was good, but it was an aspect of an age of expansion. We did socially reap benefits from information being distributed, even illegally.

The information age finds us making goods out of information itself. Never have we been so well prepared to defend intellectual property. As a society we litigate – comprehensively and aggressively. Maybe we instinctively hoard information because we know it is valuable. And libraries, entities who should be at the forefront of sharing, who make our very living by making information available to our patrons, are as guilty of not sharing as anyone.

Fortunately, that is changing.  OCLC is embracing the Open Data licence and is encouraging their members to do the same, which is a wonderful thing.

Open Data Commons Attribution License

I would love to see this go further into institutional data far beyond what is collected on the state and federal level.  I periodically find archives being loaded on the web by libraries using some variant of the Creative Commons licences, also a good thing.

When we share, everyone wins

And finally, a few libraries are embracing open source. The licences vary by project, but the heart of all of them is allowing new tools to be built on existing ones. And there are many small projects, like libraries to handle data types or protocols, but those are building blocks. Musk realized that he had to start sharing how you stack the building blocks – how you make the big stuff. Are we doing that? Frankly, Koha and Evergreen are pretty big things, so companies involved in improving those are already building, and changing, the future. Opening ILSes, making them freely available, giving powerful tools to everyone regardless of income, valuing knowledge over money – these things change the world if they gain enough adoption. Don't believe me? Look at Linux, Apache, PHP, MySQL, Postgres, Perl and so on. If you think widely adopted open source products haven't changed the world, you live in a state of denial.

Musk realizes that companies need to change the world, as there is a role there no one else will fill. His businesses are means of supporting positive change as well as generating profit. Being open is not anti-capitalist; it's an adaptive strategy for a changing world. He's invoking the ideology of the FLOSS movement in his blog entry even if he is not participating in it. The simple act of adding to the base of freely used, functional technology gives the future a deeper toolbox with each contribution. That is a morally virtuous act that he wants to align with even if he can't fully adopt it due to the nature of the patent system. But it's not the moral element that fascinates me about Musk's entry; it is the implication that by opening access to technologies protected by his patents, essentially vowing not to pursue his intellectual property rights, he is declaring that it is the ethical mandate of his company to share. To rephrase and repeat, like any good reference librarian: Elon Musk is saying that profit is not the sole ethical mandate of his company.

I would call on library corporations to look at what they can open source, or at least share by some means. I would say that library vendors who don't share where they reasonably can are acting immorally. Note, I don't say unethically; that is an entirely different matter, determined by their own corporate structure. In fact I worry that they have an ethical obligation in their roles to do immoral things.

What can they share? I don't know. Clearly it's not realistic for an ILS vendor to GPL their entire codebase and dump it on Github. But I find it hard to believe there isn't anything they can share. Maybe a library of code for RDA checking. Maybe a Z39.50 server. Maybe a network diagnostic tool. Maybe data about usage needs.

Elon Musk's blog post raised eyebrows because he rejected the idea that hoarding information is a business's ethical obligation. We already have vendors who support open source and believe that sharing is an ethical requirement of being in the library community. I think libraries should hold vendors to that standard. I want to support companies who act in a manner I think is both ethical and moral in regards to supporting not only my library this fiscal year but the libraries of tomorrow, and open source is a big part of that.

Hack-A-Way 2013 Day 2


I should have said this at the outset of yesterday's post – Hack-A-Way 2013 is hosted by Calvin College and sponsored by Equinox Software. I have no obligation to mention them in this forum, but they both deserve the recognition (and far more).

Priority one for day two was finding out how to hack Hangouts so that my typing didn't mute the microphone (which they couldn't hear anyway since I was using an external microphone). Some quick googling uncovered that this is a common complaint from people who use Hangouts for collaboration and that there is an undocumented tweak that only requires minimal terminal comfort. I'm still tempted to get a second laptop to make it easier to position the camera, though, and I'm definitely bringing the full tripod next time. But, AV geekery behind me …

We started with reports on the work the day before.  

Ben Shum reported on work on the mobile catalog. That group was the largest of the working groups and had laid the groundwork by deciding that the goal was full TPAC functionality. The team worked on separate pieces and on moving files into a collaborative branch in the working repository. A lot of the work is CSS copied from work done by Indiana, as well as de-tabling interfaces and using DIVs.

Our table worked on a proof of concept for a web based staff client. Bill Erickson had previously done a Dojo-based patron search interface and checkout of uncataloged items as a proof of concept. We worked on fleshing that out, discussing platforms for responsive design and what would be needed for baseline functionality (patron search, checkout, items out, renewals) and then, later, bills. This is less a demo at this point than a proof of concept, but one goal is to have something that might, in a very limited way and with some caveats, also help those suffering from staff client memory leaks by handling checkouts without the staff client. It is also bringing up a lot of conceptual questions about the architecture of such a project. A working directory and dev server are up. Most of the work on this is being done by Bill and Jeff Godin with input from the rest of us.

Lebbeous Fogle-Weekley reported for the serials group. They targeted some specific issues, including how to handle special one-off issues of an ongoing series, and discussed the future direction of serials work. In fact they already pushed some of their work to master. However, because of their narrower focus they are going to break up

Jason Stephenson worked on the new MARC export and has a working directory up. The new script is more configurable. At this point I missed some of the conversation, unfortunately, due to some issues back home I had to deal with, but apparently, in a nod to Dan Scott, MARC will now be MARQUE.

In evaluating the 2.5 release process we spent a lot of time discussing the mostly good process and the big challenge the release manager had with it. The community goal has been making more stable releases. During this release Dan Wells added more structure, which was good; the milestones and pointing out bugs were good, but he also wanted feedback, which was really hard for the developers, who were very happy with his work. But there are challenges, and finding solutions is right now elusive. Kathy Lussier addressed DIG concerns about documentation and noted that ESI does a lot of the documentation work for new features but work not done by them is often left undone. We had 380 commits since 2.4, with the biggest committers being Dan Wells, Ben Shum and Mike Rylander. Is that sustainable? A rough guess is that those are half bugs and half features, which is an improvement over the past. Do we need to loosen review requirements? Do we do leader boards as a psychological incentive? There was concern that some would lower standards to increase numbers. The decision about selecting a 2.6 release manager was put off as well, the group deciding to let folks think about these issues more after a discussion that lasted longer than we had planned.

Discussion also wandered into QA and automated testing. A lot of progress has been made here since the conference. In regards to unit testing there was a consensus that while it's a great idea, it won't have a significant impact for a while. Right now the tests are so minimal that they don't reflect the reality of what real data does in complex real world environments, and it will take time finding those issues and writing more tests before the work has its payoff.

Art.  Kinda looks like a grey alien to me.


I won't try to re-capture all of the conversation, but maintaining quality and moving releases forward were discussed in great depth. There was less interest in discussing 2.6 than in really trying to clean up and make sure 2.5 is solid. The decision about who would be the 2.6 release manager was put off, and the idea of a leader board to encourage bug squashing was proposed. A "whackin'" day targeting bugs, like Koha does, was also floated.

I spent a lot of the day looking at some great instructions Yamil Suarez put together for installing OpenSRF and Evergreen on Debian for potential new users, and chatting with Jeff and Lebbeous about the need for beefing up the Concerto data set with new serials and UPC records. Other projects included looking at the web site, and starting conversations about users, merchandising, IRC quotes, and so on.

By the evening we had a nice dinner, and a group of us headed out to Founders for a drink and to walk around downtown Grand Rapids to look at ArtPrize installations, which were quite nice.