Library 2.0 Debased

Kate Sheehan makes some interesting observations about the cultural awareness of librarians. She also touches on an unfortunate truth about Library 2.0:

It’s easy to become enamored of social networking sites and Web 2.0 toys to the point where they seem like a panacea for everything that’s wrong with your library or your job. Slap a wiki on it and call me in the morning. The most successful uses of the newest tech tools have recognized that they’re just that: tools.

I’ve been feeling, for a while now, that the term Library 2.0 has been co-opted by a growing group of libraries, librarians, and particularly vendors to push an agenda of “change” that deflects attention from some very real issues and concerns without really changing anything. It’s very evident in the profusion of L2-centric workshops and conferences that there is a significant snake-oil market in the bibliosphere. We’re blindly casting about for a panacea, and it’s making us look like fools.

Ignoring the information ecology

Perhaps the most significant area of neglect is our failure to recognize that Library 2.0 is a delicate ecology. Like Web 2.0, it represents technology that is inherently disruptive on many levels. Not only does Web 2.0 undermine notions of authority and control, but its economic and human costs are very real. There is, indeed, something very exciting about the fact that Google bought YouTube for $1.65 billion, especially since it was a company of only sixty-odd employees. But at the same time, I’m a little alarmed that sixty-odd people could dominate such a large piece of that market. Not for the same reason that we have (soon to have had) the FCC’s media ownership rule, but because the force of that type of change has to be felt somewhere. Think of it in terms of a bag of nitrate dumped in a stream: the algae does really well, but the fish suffocate.

Luckily, Web 2.0 as a whole exists in a large, rather well-insulated economy that will adjust over time. Libraries, on the other hand, are significantly more delicate ecosystems that require more care and discretion. Specifically, we need to understand how our internal information ecology works and how to tend to it. How and where we interface with our users is where the rubber meets the road, and it merits a little more thought than simply thrusting a MySpace page in their faces or building a new library in Second Life–a service our users overwhelmingly do not use and one that seems, to me, like a creepy post-apocalyptic wasteland. I’ll even turn the tables on myself and admit that I was wrong about local tagging in the OPAC. SOPAC was by and large a success, but its use of user-contributed tags is a failure. For the past nine months, the top ten tags have included “fantasy”, “manga”, “anime”, “time travel”, “shonen”, “shonen jump”, and “shape-changing”. As a one-time resident of Ann Arbor, I can assure you that these are not topics that dominated the collective hive mind. Well, maybe time travel, if Hash Bash was going on.

So we need to understand that, while it’s alright to tip the balance and fail occasionally, we’re more likely to do so if we’re arbitrarily introducing technology that isn’t properly integrated into our overarching information framework. Of course, that means we have to have a working framework to begin with, one that complements and adheres to our tradition of solid, proven librarianship. In other words, when we use technology, it should be transparent, intuitive, and a natural extension of the patron experience. If it can’t be transparent, then it should be so overwhelmingly beneficial to the user that it is canonized not by the techies, but by the users themselves.

You can’t buy Library 2.0

…And vendors, you can’t sell it. But that doesn’t mean it won’t be attempted. I think perhaps there is an expectation that real life should somehow mimic the success of the software plug-in model. There may be something to be said for the “object-oriented” library, but that is a far cry from stuffing a new product into an already-awkward, malformed, and ill-suited portfolio. For example, third-party OPACs, as they are currently being sold to us, are likely to fail. Not because they are inherently bad products (some are, some aren’t), but because the companies producing them are only mimicking the Web 2.0 widget, the deliverable. What they are not doing is reevaluating their business and development processes with the goal of realigning them with the interests of libraries. I discussed the pressing need for significant development partnerships back in the July 2007 issue of LJ’s NetConnect, and I still believe that that particular model for collaboration is the only way to significantly improve our ability to embed technology in the library. It’s not a viable long-term solution to sell the concept of a development partnership when all it really amounts to is the opportunity to report bugs on software that is not quite ready for prime time.

As libraries, we need to realize that the answers to our larger questions cannot be found out on the exhibitor’s floor. That’s where we find solutions to specific needs that have been identified by a thorough self-examination.

Meeting technology half-way

Don’t hold your breath waiting for technology to adapt to the library environment. Web 2.0 did not evolve with libraries in mind, and there’s no reason to think that it ever will. I realize that, at first glance, that statement seems to run counter to what I’ve been saying with regard to not forcing a square peg into a round hole. What I mean is that we cannot expect to retrofit our libraries with tomorrow’s technology. The true pursuit of Library 2.0 involves a thorough recalibration of process, policy, physical spaces, staffing, and technology so that any hand-offs in the patron’s library experience are truly seamless. We can learn a lot about collaboration and individual empowerment from Web 2.0, but we cannot be subsumed by it, because we have a mission that eclipses “don’t be evil,” which is the closest thing to a conscience the Web will ever have.

