After a year with Ubuntu

Official Ubuntu circle with wordmark (image via Wikipedia)

For about a year now, I’ve been using Ubuntu as my primary operating system for all of my work-related computing. For those who don’t know already, Ubuntu is one of many flavors of Linux — a freely available Open Source operating system. By freely available, I mean just that: you can download it, burn it to a disc, and install it for free on just about any computer. Open Source software is a very cool idea in and of itself — companies and independent software developers essentially volunteer their time to develop a software product. And as the Open Source Initiative tells us, “The promise of open source is better quality, higher reliability, more flexibility, lower cost, and an end to predatory vendor lock-in.” Ubuntu takes this a step further; it differs slightly from other versions of Linux in that it is designed to be easy to use. Its target audience seems to be the everyday user, as opposed to those who want to run a server in a data center or who enjoy tinkering with the innards of their desktop operating systems. In addition, the developers are committed to totally free software. In some ways, Ubuntu feels like an ideology; their website reads almost like a manifesto:

We believe in fast, effective computing for everyone. Created by the open-source community and Canonical, Ubuntu is free to use and share, at home and in business.

The Ubuntu Promise

Ubuntu is free. Always has been and always will be. From the operating system to security updates, storage to software.

Ubuntu is fast to load, easy to use, available in most languages and accessible to all.

Ubuntu applications are all free and open source – so you can share them with anyone you like, as often as you like.

Ubuntu comes with full support and all kinds of services available worldwide.

With a promise like this, I couldn’t help but be intrigued. Their philosophy (and the Open Source philosophy in general) appeals to my left-leaning political views and the idealist who is still somewhere inside of me. I was also curious (especially given the recent downturn in the economy) about Ubuntu as an alternative to the two somewhat expensive OSes that we purchase at Wheaton. So, I turned in my Mac, got a Dell laptop from our Technical Support group here at Wheaton, and downloaded and installed Ubuntu alongside Windows XP (which came with the computer). When I boot up the computer, I can choose either Ubuntu (where all of my data and most of my applications live) or Windows.

The Ubuntu Software Center Application

For the most part, I have been happy with Ubuntu for the past year. Occasionally I have had to run Windows on the computer — a similar situation to working with a Mac — but for the most part I’ve been able to work with the free software that either came with Ubuntu or that’s available for download through the Ubuntu Software Center application. I have been able to use Firefox and Chrome as browsers (where much of my work seems to occur these days), TweetDeck for Twitter (yes, Adobe AIR runs on Linux), OpenOffice for productivity (word processing, spreadsheets, presentations), Empathy for instant messaging, GIMP for photo editing, even some minor video editing with Kino. There are even some nice helper applications like GNOME Do, which provides users with a nice search interface and a Mac-like dock.

Alas, all has not been rosy with this free, “easy to use” operating system. In the end, the downsides are still a little too significant for me, and I have recently made the decision to return to a standard OS — a dual boot Mac/Windows machine. The problems?

  1. Browser-based Applications: Ubuntu does not work with several of the browser-based applications out there. Elluminate (an application that NITLE and several other professional organizations have been using for a lot of their online workshops recently) was the most significant for me, but the reporting tool (called WebFocus) for Wheaton’s financial system (Banner) did not work either. Nor do Netflix streaming and other sites that rely on Microsoft’s Silverlight video technology.

    What you get when you visit Netflix instant streaming.

  2. Compatibility: The free software I listed above is mostly compatible with the software everyone else on campus is using. That doesn’t sound like a downside at first; “mostly compatible” sounds great. But it does wear on you after a while. Sure, OpenOffice can open Excel, Word, and PowerPoint files, but the formatting was often slightly off, sometimes making documents or PowerPoint slides hard to read. And certain features, like pivot tables in Excel, don’t quite work. The same would happen with files that I sent to others when they opened them on their computers.
  3. Drivers: I could get by with those inconveniences, because I did have Windows available to me (after rebooting the computer), but the most troublesome part of Ubuntu for me has been the drivers — the little pieces of software that run in the background and make things like your screen, your keyboard, your touchpad, or anything else attached to your computer work. Don’t get me wrong, Ubuntu has come a long way — most of the drivers work just fine with little to no effort. However, this is another instance where things being “mostly compatible” can become irritating after a while. It took me a long time, for example, to get my computer to wake up properly after the lid was shut on it. And on a recent upgrade (to version 10.04), my video driver stopped working, so the screen only displayed in low resolution and I lost my ability to print.
  4. No tech support: This is not a problem with Ubuntu itself. Their site and forums are chock-full of information and helpful people. But it’s still an external site, and I am still ultimately responsible for solving all of my problems on Ubuntu. Our small Tech Support group has its hands full trying to support Windows and Mac; understandably, they just aren’t at a point where they can take on Ubuntu too.

So, I am leaving Ubuntu behind… at least for now. For the most part, it works as an operating system. I lived on it for a year with few problems. The developers have come very close to their goal of “fast, effective computing for everyone.” I think I just need to give it a little longer to bake before I adopt it as my full time OS again.

It also bodes well for the Open Source movement in general that a desktop/laptop operating system like Ubuntu is so close to being ready for prime time. At Wheaton, we can see how well the movement is doing in other venues as well: we are adopting more and more Open Source tools. Examples include Moodle (AKA onCourse for Learning Management), WordPress (for blogs, but also our new web editor/content management system), MDID (our image database),  and SubjectsPlus (a library subject research guide). With that in mind and especially after quoting Ubuntu’s manifesto-like promise, I almost want to shout out loud: Viva La Open Source!


What are the top IT issues for small Liberal Arts colleges?

As one fiscal and academic year comes to an end and another is about to begin at Wheaton, I’m starting to prepare materials for our department’s year-end annual report. This not only provides me with a chance to reflect on the last year, but also to look forward to the challenges and opportunities of the coming year. So, for that reason alone, the title of this post has been on my mind these days. But I’m interested in this topic for several other reasons as well.

First off, Wheaton is about to get a new Associate Vice President for Library and Information Services (and I’m about to get a new boss)! This is great news, because this position has been vacant for about a year now and while we have certainly been able to keep the trains running and even been able to make some good progress in certain areas, it will be nice to have some leadership within LIS again that will help us focus on addressing the top issues for both IT and Libraries.

This is also on my mind because it’s been covered in a number of venues recently. The current issue of the Educause Review, for example, gives a “Top 10” (11 actually, because two of them tied for 6th place). Here they are:

    1. Funding IT
    2. Administrative/ERP/Information Systems
    3. Security
    4. Teaching and Learning with Technology
    5. Identity/Access Management
    6. (tie). Disaster Recovery / Business Continuity
    6. (tie). Governance, Organization, and Leadership
    7. Agility, Adaptability, and Responsiveness
    8. Learning Management Systems
    9. Strategic Planning
    10. Infrastructure/Cyberinfrastructure

This list is pretty good, though not entirely surprising, and I must admit I find some of the categories a little too broad. The article does a decent job of outlining the major questions under each topic, though. I for one am happy to see “Teaching and Learning with Technology” and “Learning Management Systems” featured so prominently in this list, as they are near and dear to my professional heart. And I would agree with what I think Issue #6b and Issue #7 imply: IT organizations (and merged organizations like those at Wheaton) need to develop better ways of prioritizing their projects and services in a time of fiscal austerity and in an environment of technological advances which require our organizations to remain “agile, adaptable, and responsive.” As the article puts it:

Keeping one foot in the present and the other in the future is the charge to which IT organizations and leadership must answer. Cloud-based applications and services, such as Gmail, as well as sophisticated consumer technologies, such as smartphones that rival the features of laptop computers, are entering campus technological environments at unprecedented rates. As more stakeholders seek the flexibility, functionality, and convenience of these new devices and systems, IT organizations must strive to meet their evolving needs and expectations. Such changes in behavior not only impact traditional IT support models but also challenge deeply rooted institutional policies, business processes, and operational practices.

Finally, I started this blog post a while ago (and let it sit in draft form for far too long) as a way to digest a recent conference I attended in June: the annual conference of the Consortium for Liberal Arts Colleges. Bryan Alexander addressed this question by looking to the future. In his keynote entitled Liberal Arts Campuses in 2015: five visions, he lays out five potential scenarios for IT in the liberal arts in 2015 based on current trends:

  • Digital Balkanization.  Silos are the norm, as an increasing amount of content and software  are located in separate platforms.  Academic life reflects this in many ways, directly and otherwise. (opposite of Open World, below)
  • The Long Great Recession.  The American economy remains flat, never recovering fully from the crash of 2008.  Campus budgets have flattened in response, and academic life has changed in other ways.
  • The Open World. Open content, open access, and open source are the norm.  (opposite of Digital Balkanization, above)
  • A World of Points.  Gaming is the world’s leading culture industry.  At the same time, our normative behaviors and interactions are shaped by gaming practices and role-playing.  Academia has started changing in response.
  • Imbrication Nation.  In a world where networked mobile devices are the norm, augmented reality is now mainstream.

Take a look at his blog post about it on the NITLE site, which also links to his Prezi presentation, for more. It’s a great way to think about how these trends, and perhaps even our own actions as IT leaders, could affect how we interact with information at our institutions in the future.

In addition to this and other great presentations and conversations that occurred at the conference, there was a really interesting thread on the CLAC listserv right before the conference started, subject line “What’s on my mind.” This conversation gave me a good view into what CIOs at small liberal arts colleges from around the country are thinking about. I probably should not directly quote the content from that listserv, because I’m not sure it is for public consumption. But I think I can at least summarize the major topics that arose from both that conversation and the conversations at the conference — at least from my point of view. In no particular order, they were:

  1. Exploring alternatives to traditional technologies: There were several questions about use of Voice over IP (VOIP) instead of analog phones, whether cable TV was necessary in the dorms given content now available over the web, and whether wired connections to the internet in the dorms were necessary anymore.
  2. Data security: In this category, there were concerns about keeping certain kinds of data private and controlling access to information through effective forms of identity and access management.
  3. Mobile Devices: The iPad was in many people’s hands at this conference, and I think it and other mobile devices were on people’s minds. Do we provide them? Do we provide content for them? Will they replace the laptop?
  4. The Future of the Learning Management System: Many small colleges have now moved to open source LMSes. Some are still thinking about it. Some are wondering if other technologies (e.g. blogs and wikis) will one day supplant the LMS.
  5. Handling high demand for services with a small(er) staff: Some institutions have had staff reductions; others, because of their size, had a small staff to start with. But we are living in an increasingly rich technological environment, and our user community has increasing expectations for our services. How do we meet those needs and manage expectations?
  6. Getting value out of the ERP: ERP (Enterprise Resource Planning) software (e.g. Banner or PeopleSoft) is integral to a college’s business operations. But this software is also hard to use and expensive to maintain. How do we get the best value out of these systems? And what do projects like Kuali mean for small colleges?
  7. Funding replacements: How do we continue to fund replacing desktops, infrastructure, and classroom technology when our budgets are shrinking?
  8. The Cloud/Outsourcing: Do we look to cloud services like those that Google provides to save money and improve services? Do they really do both of those things?
  9. Managing Projects: How do we set priorities for our projects in IT? And how can we both plan well and be agile?
  10. Managing relations with other departments: More and more, IT organizations need to collaborate with other departments on campus. Contact with the Communications department, which now manages web sites, is one obvious example of where this is happening… but it’s occurring in other places as well.

As I finish up this blog post, I am realizing that it is probably an overly ambitious one. I’m sure I haven’t covered everything. Perhaps I’ve done too much! But blogs don’t need to be the final word, right? I wanted to at least make sure that I captured what I’ve been hearing and thinking as I move forward into a new year and under new leadership within our merged organization. I’m sure this is a question I’ll be returning to again and again, and that we’ll be returning to as an organization.

Digital Humanities on the Rise at Small Liberal Arts Colleges

I don’t have any hard figures or statistics, but my sense is that the title of this post is true. There does seem to be a growing interest — perhaps a groundswell? —  in incorporating Digital Humanities into scholarship, pedagogy, and the curriculum at Small Liberal Arts colleges.

What do I mean by Digital Humanities? I don’t mean just doing Humanities scholarship using a computer. Typing up a traditional paper in Word doesn’t count. Posting a PDF of a paper to the web doesn’t really count either (though I will admit that that act does bring up interesting questions about how you define the word “publish” nowadays — a related but separate topic). If we think of the Humanities as the study of human culture and the human condition through the analysis of artifacts (texts, images, video, architecture, sculpture, and so on) in the human record, Digital Humanities simply extends that definition to include using digital technology to assist in that analysis. Usually this involves digitizing these cultural artifacts — often in the name of preservation — but also, as Susan Schreibman, Ray Siemens, and John Unsworth point out, to facilitate and enhance scholarly analysis of those artifacts:

Yet many disciplines have gone beyond simply wishing to preserve these artifacts, what we might now call early forms of data management, to re-represent and manipulate them to reveal properties and traits not evident when the artifact was in its native form. Moreover, digital humanities now also concerns itself with the creation of new artifacts which are born digital and require rigorous study and understanding in their own right.

Susan Hockey says something similar in her essay that chronicles the history of Digital Humanities in the same book and notes that it brings together the methodologies of the sciences and the humanities:

by its very nature, humanities computing has had to embrace “the two cultures”, to bring the rigor and systematic unambiguous procedural methodologies characteristic of the sciences to address problems within the humanities that had hitherto been most often treated in a serendipitous fashion.

So, getting back to the title of this blog post… why do I think it’s on the rise at Small Liberal Arts colleges and universities? When I look around, I see Digital Scholarship centers appearing like the ones at Occidental College and the University of Richmond. Hamilton College has started a Digital Humanities Initiative. At the University of Puget Sound, the Director of the Humanities program, Kent Hooper, teaches a class to undergraduates entitled Digital Humanities. Here at Wheaton, faculty have started a Digital Humanities working group. We also have several digital humanities projects underway — a few that use methods from the Text Encoding Initiative to digitize and analyze texts, others that involve contributing to projects like the Diderot Encyclopedia translation project and the History Engine, and one using Lexomics — an approach which uses Computer Science and statistics to look for patterns in Old English texts. And as I write this, our college Archivist (Zeph Stickney) and an Associate Professor of History (Kathryn Tomasek) are attending and presenting a poster at the Digital Humanities conference in London.

The National Institute for Technology in Liberal Education (NITLE) has taken notice and has gathered a group of faculty members, librarians, and technologists together from Occidental College, Willamette University, Hamilton College, and Wheaton to plan a series of online seminars, which (according to Bob Kieft, College Librarian, Occidental College):

will showcase a variety of projects and issues in digital scholarship, [will address] these important themes:

  • connection to the undergraduate curriculum,
  • collaboration between faculty, technologists and librarians, and
  • strategies to cope with limited resources on liberal arts campuses.

And I’m sure there’s a lot more going on! This seems like a significant shift to me, because Digital Humanities used to be primarily the purview of larger research institutions; as the last bullet above implies, these projects have been too expensive and too large an undertaking for small colleges.

So, as a leader of a group at Wheaton focused on technology for research and instruction, part of my job is to figure out how to encourage and facilitate collaborations between faculty, technologists, archivists, and librarians and to work with others to find ways to make Digital Humanities projects manageable and sustainable. The IMLS-funded project that I am working on with colleagues from multiple small colleges aims to address this goal for one kind of Digital Humanities scholarship. “Publishing TEI Documents for Small Colleges” (a project still in need of a better name — any suggestions?) is attempting to help scholars, archivists, librarians, technologists, and students working on encoding scholarly texts with TEI-compliant XML find good, sustainable ways to store, represent, analyze and provide access to those materials online. We can all create the XML documents, but individually we don’t have the resources to build effective tools to store and use them. The way forward in our view is through collaboration between institutions — in this case through a shared service. And I think that will need to be the model for other kinds of Digital Humanities projects (video, GIS, image archives, and so on) occurring at small institutions (and even large institutions!) as well. If we want to ride this groundswell and succeed with Digital Humanities at our individual campuses, we need to look more toward sharing our resources.
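To make the TEI idea a bit more concrete: the encoding wraps scholarly markup around the text itself, and the payoff comes when software can query that markup across a whole collection. Here is a minimal sketch in Python — the TEI fragment and the name in it are invented for illustration, and a real project would of course work with full TEI documents and richer tooling:

```python
import xml.etree.ElementTree as ET

# A tiny, invented TEI fragment; real encoded documents are far richer.
tei = """<TEI xmlns="http://www.tei-c.org/ns/1.0">
  <text><body>
    <p>A sample letter from <persName>Eliza Wheaton</persName>.</p>
  </body></text>
</TEI>"""

# TEI elements live in the TEI namespace, so queries must use it.
ns = {"tei": "http://www.tei-c.org/ns/1.0"}
root = ET.fromstring(tei)

# Collect every personal name the encoder has tagged.
names = [el.text for el in root.findall(".//tei:persName", ns)]
print(names)  # ['Eliza Wheaton']
```

Multiply this by thousands of documents across many campuses, and the case for a shared service — rather than every college building and hosting its own tools — becomes clear.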

Screencasts in the Liberal Arts?

Screencasting — a video recording of a computer screen — has been in use for a while now, often as a way to demonstrate how to use software. We have not used it much at Wheaton, in part because, like most small liberal arts colleges, we place a lot of value on personalized, face-to-face time. We have a 12:1 student-to-faculty ratio in classes, after all, and students, faculty, and staff at Wheaton have all come to expect individualized attention. This is a strength of a small college. So, the idea of putting a recorded tutorial up online seems antithetical to this environment. Why would a faculty member want to watch a recording about how to use software, when s/he can pick up the phone or in many cases just walk down the hall and ask for help from a faculty technology liaison? Why would a student want to watch an online tutorial when they can get face-to-face help from a faculty member, a librarian, or a technologist?

For the most part, this thought process still holds true… but I think it’s also changing, or has changed. Sure, face-to-face time is great, maybe even preferred, but so is getting the information when I want it, on my own terms, 24/7.  DVRs, online services like Netflix, radio podcasts, YouTube, 24/7 shopping on Amazon, the web itself — these have all taught us to expect access to information on-demand, when we want it, and where we want it. I’m sure face-to-face will always be the preference, but 24/7 information sure is a nice substitute.

So, when two faculty technology liaisons (Jeanne Farrell and Diane Demelo) said they wanted to do some screencasts for faculty about Moodle, my first reaction was… sure, I guess, but who will watch it? Faculty want to learn directly from you, right? But then I remembered the many faculty members who told me they had tried Moodle on their own, and the several others who asked me over the past year whether we were recording our Tech N Talks so they could watch them later. Many of our faculty members want to learn things on their own schedule, and — influenced by their experiences with the web — they want access to this kind of information 24/7. So, if you take their environment into consideration, screencasts make perfect sense. These screencasts aren’t quite a substitute for the personalized attention that we will continue to give, but I think they do address many of our faculty’s needs.

We’ll see. I’m curious to see how well they are received.

Here are the three that we just started with, which address new features available in our instance of Moodle after an upgrade. For those who are interested, these were recorded with Jing, which you can download and use for free. We bought the $14-per-year license because we wanted to post to YouTube and edit the video. But I have used the free version of the software and have been quite satisfied with it. Enjoy!

Unbloggable Things?

Of course, as soon as I plan to start my professional blogging up again (my thought was I could write something once a week), something happened at work that is rather difficult (nearly impossible?) to write about.  Originally, I wrote something longer asking questions about why this topic is so “unbloggable,” but I’ve decided instead that it really is… and just to link to this public article from the Providence Business News: Wheaton lays off 17; economy blamed

Back to Blogging? Thoughts about presentations…

I’m back!


After nearly two years of letting this blog languish (Twitter is so much easier!), I’m posting something again.

Me presenting with Prezi at a NITLE conference

I just finished presenting with Bryan Alexander, NITLE’s Director of Research; Trina Marmarelli, Instructional Technologist, Reed College; and Bill Junkin, Director of Instructional Technology, Eckerd College, about alternatives to the standard PowerPoint presentation. Bryan gave a nice introduction to how people are starting to break away from bulleted slides. Trina talked about a new form that people are imposing on PowerPoint called Pecha-kucha. Bill gave a demonstration of Ubiquitous Presenter — a tool that allows a professor to annotate PowerPoint slides on a tablet PC while students “tag” them. And I presented on a web-based presentation tool called Prezi, for which I created this screencast. (The picture with this post is me using Prezi for the first time at the NITLE Instructional Technology Leaders’ Conference.)

I think we just scratched the surface of this topic in the hour that we had, but it was a great start. The way that people are doing presentations is changing… or has changed… and in a good way! New forms like Pecha-kucha and the Lessig Method, and new technologies like Prezi and Ubiquitous Presenter, which themselves encourage different approaches to presentations, are shaking us out of the deadly bulleted slide and encouraging us to communicate more clearly with our audiences.

When we are thinking about Information/Technology Literacy/Fluency for our students, this should be part of the conversation. We are well past the time when undergraduate students have to learn the mechanics of PowerPoint. Most of our students have been using PowerPoint since middle school, sometimes even elementary school. What they still need to learn (and what we should be teaching them) is how to use this and similar tools to communicate effectively. And the stuff we covered today, it seems to me, could play a big part in that.