On April 18th, I’ll be participating in a panel discussion at the 2013 Annual Meeting of the National Council on Public History, entitled “WordPress as a Public History Platform.” The panel is described in the meeting program [PDF] and on the conference blog if you care to read about it. If you’re attending the conference, I hope you’ll participate. I’m always excited to talk about WordPress. I use it for projects all the time and am really looking forward to the panel, where I think we’ll be able to draw out some of what makes WordPress a good choice for Public History projects.

But as I was preparing some notes, something occurred to me: the key question I hope public historians and other scholars will be asking themselves in 2013 is not actually “How can WordPress help me make successful projects?” That’s a good question and worth discussing, but it’s not a first-order question. The more fundamental question is “How can I improve the value of my project by understanding Public History as a form of web publishing?”
Conferences are a great place to sell people on tools℗ and solutions©. This is not necessarily a bad thing. Sometimes real problems can be solved with a simple change of perspective, a little money, or the perfect new tool. Being told which tool to use can be a huge relief for people whose interests and expertise lie elsewhere. But sometimes this focus on the solution prevents a more fruitful discussion about the problem itself, because it assumes the problem is already well understood.
In the case of Public History and public scholarship in general, the problem for many scholars is that they don’t yet understand how to “do digital.” Some scholars view the digital/online aspects of publishing as something wholly separate from scholarship, leaving it to someone else to eventually put some portion, version, or representation of their work on the web. Until relatively recently, especially for scholarly content, there were no well-known precedents or best practices for putting content on the web, and the web was not where most people started producing or consuming content. One could still afford to think of it as putting content on the web. But it’s 2013. The web cannot be where content ends up; (most of the time) it needs to be where it starts. This means thinking differently from the outset. It means taking a more comprehensive view that incorporates not just intellectual content but also the apparatuses of distribution and consumption, present and future. In short, it means rethinking public scholarship as a subset of web publishing.
Some may feel this is the job of actual publishers and that individual scholars need not concern themselves with such matters. This reaction may well be correct – if not for all fields or scholars, then for many. But for a great many Public History projects, there are no actual publishers, only de facto ones. Public History frequently means addressing the audience directly. It means creating our own digital and print material. It means editing our own copy, doing our own promotions, negotiating our own budgets, acting as video and audio producers, hiring (or acting as) our own developers and designers, iterating on ideas and implementations, and evaluating risk and success. Sounds a lot like publishing to me.
The problem is, while public scholarship often involves activities that are clearly in the realm of web (and other forms of) publishing, not all scholars understand or embrace the relevant standards and best practices. And again, not every scholar needs to understand this stuff. But for those of us working mostly or entirely online, thinking about process and output in terms of publishing can be hugely beneficial.
The following is by no means an authoritative or comprehensive list of issues in web publishing; I’m not the one to provide such a list. These are just a few thoughts meant to begin a conversation.
Project Design
Project design can be a nebulous and ongoing process. Good project design takes into account not just the desired outcome of the project, but also the steps it will take to reach completion and the steps that come after completion (revision, iteration, preservation, etc.). During initial project design, one should define explicit goals and boundaries, set timelines and periodic milestones, research available platforms, and understand management challenges (collaborative content creation, central file storage, documentation).
Answer as many questions up front as possible. Is your project output going to be evidence-based or interpretive? The CMS you choose might be different in each case (Chad Black has a thoughtful blog post exploring this issue). Do you have – or need – a long term preservation plan? Do you need a separate site or database for project management and documentation? Could your website evolve into a mobile app? Do you need to purchase or develop custom software? Could your content be transformed into an ebook? Should the project have an API so that others can extend and re-use your data? What about microformats and linked data? Etc. The more questions you answer up front, the better.
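To make the API question a little more concrete: even a simple read-only JSON feed can make your data reusable by others. The sketch below is purely illustrative, a plain PHP script with two hypothetical records hard-coded; in a real project the records would come from your database or CMS.

```php
<?php
// Hypothetical read-only API endpoint: serve project records as JSON.
// In a real project, $records would be pulled from a database or CMS,
// not hard-coded like this.
$records = array(
    array('id' => 1, 'title' => 'Mill Workers Oral History', 'year' => 1978),
    array('id' => 2, 'title' => 'Harbor District Photographs', 'year' => 1952),
);

header('Content-Type: application/json');
echo json_encode($records);
```

A feed like this costs almost nothing to provide, and it means someone else can build a map, a timeline, or a visualization on top of your work without scraping your pages.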
Content Strategy
Content strategy basically means thinking about the “planning, development, and management” of your intellectual content. Like project design, content strategy is an ongoing process, the details of which will vary from project to project. The fastest route to comprehending content strategy is to pick up copies of both The Elements of Content Strategy by Erin Kissane and Content Strategy for Mobile by Karen McGrane. Both books are from the excellent A Book Apart series. They’re cheap, well-designed, fast reads, and available in many formats. Reading through one or both of these books will give you a sense of how you can improve both your content and your project design. See also: Adaptive Content.
Future-friendliness: Open Source, Metadata, Standards and Formats
Future-friendly thinking refers to the idea that, because technology is in a constant state of flux, we should create our content (and evaluate our projects and workflows) with inevitable change in mind. Likewise, because technologies, markets, and user preferences and behaviors will inevitably shift, we cannot truly know how our content will be consumed.
Let’s put this in more concrete terms. Say you built a project that relied heavily on Flash some time before Steve Jobs effectively murdered that technology. Certainly by 2010, you knew that iPad and iPhone users (as well as a growing number of other mobile device users) could not access that Flash content. But what you may not have known – or cared about – before then is that Flash was never a very good bet for the future.

For one, it wasn’t part of the browser; it required users to install a separate application on their computer, which is never a good thing. Not only was Flash content divorced from the browser, it was divorced from the web in that (on most sites built entirely in Flash) it was impossible to link to internal pages. In turn, this meant that Flash content would also be divorced from the other big innovation of the aughties web – social media. You can’t share what you can’t link. That usually meant internal pages were inaccessible to Google and other search engines as well. So much for discovery. Finally, Flash content (to my knowledge) was and is completely inaccessible to screen readers for the visually impaired. To the extent that Flash works on some mobile devices today, it is still barely usable in many instances because it was essentially a mouse-and-keyboard technology. The relatively sudden death of Flash understandably caught some people off guard, but with hindsight we can see why it needed to die.
A good web infrastructure is as agnostic as possible. It has baseline functionality that improves (or can easily be improved upon) as standards evolve and enabling technologies improve. We can’t know what the next big thing will be, but if we follow web standards, we can have a pretty reliable sense of what user devices are capable of rendering. If we structure our data, we’ll be prepared to send our content to mobile phones, tablets, smart watches, and heads-up displays. If we use semantic markup and microformats and adhere to best practices in accessibility, we know our content can be re-purposed and consumed in a variety of contexts. When we do our best to avoid proprietary technologies, we safeguard against single points of failure and against stagnation. When we carefully choose standard file formats and open source platforms, we give ourselves some level of control over our own future (even if it’s just the control needed to export our content to some other environment later on).
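To give one small example of what semantic markup and microformats look like in practice, here is a sketch of a WordPress post template fragment marked up with the h-entry microformat. It assumes it runs inside the WordPress loop; the template tags (the_title(), the_permalink(), and so on) are standard WordPress functions, and the class names follow the microformats2 convention, which lets parsers and other services extract the title, date, and content of a post without guessing.

```php
<article class="h-entry">
    <h1 class="p-name"><?php the_title(); ?></h1>
    <a class="u-url" href="<?php the_permalink(); ?>">
        <time class="dt-published" datetime="<?php echo get_the_date('c'); ?>">
            <?php echo get_the_date(); ?>
        </time>
    </a>
    <div class="e-content"><?php the_content(); ?></div>
</article>
```

Nothing here is exotic; it is ordinary HTML with agreed-upon class names, which is exactly what makes it future-friendly.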
Visual and Functional Design
Visual design is a central part of your project whether you like it or not. Aesthetic quality is not just a nice-to-have feature; it’s part of establishing authority. It tells the user, among other things, how much (or how little) you care about quality and details. When a user visits your site, opens your app, uses your interface, or loads your file, the first thing they will do is (consciously or not) evaluate the presentation and begin trying to make sense of the organization. You don’t need to win design awards to get this right. At the very least, focus on the basics: links should be identifiable, text should be legible and enjoyable to read, the layout should feel solid and intentional, pages should load quickly, and content should be universally accessible and organized in a manner that makes intuitive and immediate sense.
Unless you are a Fortune 500 company, you probably don’t need hundreds of internal pages and dozens of top level navigation options. (I usually try to pare the top-level navigation down to 5 or fewer items. If you ever hire me for anything, I will brutally condense your content into logical chunks. Fair warning.) Keep things simple, focusing on consistency and clarity, removing and de-emphasizing inessential features. If you are in any way responsible for creating or evaluating interface design, consider reading up on Usability and Information Architecture.
You wouldn’t publish a book with no margins, an inscrutable index, and a crummy cover. You wouldn’t submit an article to a journal that looked like it had been mimeographed. You wouldn’t wear sweatpants to give a lecture. It’s like that. The way it looks is part of how it works. Branding is not a dirty word; it’s just an expression of professionalism.
Maintenance & Preservation
If you are using a future-friendly, content-driven approach and you’ve done your due diligence during project design, you’ll be that much more prepared to think about long-term maintenance and preservation – issues of obvious import for scholars, especially historians. Using open source software and open standards, protocols, and formats wherever possible helps ensure that your content can be restored, recreated, migrated, or otherwise accessed in the future. Likewise, using standard, open, and well-established technologies provides a way forward and a way out should technologies change drastically.
If your content is well-structured and is available via standard output formats (like CSV, XML, JSON, or even static HTML), you can move it from one context or platform to another with relative ease. If your site is hosted on a Linux server and managed via a PHP/MySQL application (like, say, WordPress), then you are like almost everyone else on the web today. That’s good, because when The Singularity comes and it’s time to migrate our content into quantum-powered meat-based neural networks or something, you won’t be the only one looking for a migration path. Or in less idiotic terms, if you choose to build your content using web standards and an open source infrastructure, you are much more likely to have access to critical software updates, useful extensions, management tools, and a professional community that understands how it all works.
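As a minimal sketch of what that migration path can look like, the following script uses the standard WordPress get_posts() function to dump published posts to a JSON file. The field selection and the output filename are just illustrative; a real migration would also account for metadata, media, and comments.

```php
<?php
// Minimal sketch: export published posts to JSON for migration.
// Assumes it runs inside a loaded WordPress environment.
$posts = get_posts(array('numberposts' => -1, 'post_status' => 'publish'));

$export = array();
foreach ($posts as $post) {
    $export[] = array(
        'title'   => $post->post_title,
        'date'    => $post->post_date,
        'content' => $post->post_content,
    );
}

file_put_contents('export.json', json_encode($export));
```

The point is not this particular script; it’s that well-structured content in an open system is always a dozen lines away from living somewhere else.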
Obviously, there’s much more to say about web publishing and public history, but this is where I’ll stop. Remember, it’s not every scholar’s job to think about these issues, but it might be yours.