I Hate Books

I want books dead. For social reasons.

In a way, this is a follow-up to a discussion happening on Facebook after something I posted (available publicly on Twitter): “(Alexandre) wishes physical books a quick and painfree death. / aime la connaissance.” (The second part: “loves knowledge.”)

As I expected, the reactions I received were from friends who were aghast: how dare I dismiss physical books? Have I no shame?

Apparently, no, not in this case.

And while I posted it as a quip, it’s the result of a rather long reflection. It’s not that I’m suddenly anti-books. It’s that I stopped buying several of the “pro-book” arguments a while ago.

Sure, sure. Books are the textbook case of a technology which needs no improvement. eBooks can’t replace the experience of doing this or that with a book. But that’s what folkloristics defines as a functional shift. Like woven baskets which became objects of nostalgia, books are being maintained as the model for a very specific attitude toward knowledge construction based on monolithic authored texts vetted by gatekeepers and sold as access to information.

An important point, here, is that I’m not really thinking about fiction. I used to read two novel-length works a week (counting collections of short stories, plays…), for a period of about 10 years (ages 13 to 23). So, during that period, I probably read about 1,000 novels, ranging from Proust’s Recherche to Baricco’s Novecento and the five books of Rabelais’s Pantagruel series. This was after having read a fair deal of adolescent and young adult fiction. By today’s standards, I might be considered fairly well-read.

My life has changed a lot, since that time. I didn’t exactly stop reading fiction but my move through graduate school eventually shifted my reading time from fiction to academic texts. And I started writing more and more, online and offline.
At the same time, the Web had also been making me shift from pointed longform texts to copious amounts of shortform text. Much more polyvocal than what Bakhtin himself would have imagined.

(I’ve also been shifting from French to English, during that time. But that’s almost another story. Or it’s another part of the story which can remain in the backdrop without being addressed directly at this point. Ask, if you’re curious.)
The increase in my writing activity is, itself, a shift in the way I think, act, talk… and get feedback. See, the fact that I talk and write a lot, in a variety of circumstances, also means that I get a lot of people to play along. There’s still a risk of groupthink, in specific contexts, but one couldn’t say I keep getting things from the same perspective. In fact, the very Facebook conversation which sparked this blogpost is an example, as the people responding there come from relatively distant backgrounds (though there are similarities) and were not specifically queried about this. Their reactions have a very specific value, to me. Sure, it comes in the form of writing. But it’s giving me even more of something I used to find in writing: insight. The stuff you can’t get through Google.

So, back to books.

I dislike physical books. I wish I didn’t have to use them to read what I want to read. I have a much easier time with short reading sessions on a computer screen than with what would turn into rather long periods of time holding a book in my hands.

Physical books just don’t do it for me, anymore. The printing press is, like, soooo 1454!

Yes, books had “a good run.” No, nothing replaces them. That’s not the way it works. Movies didn’t replace theater, television didn’t replace radio, automobiles didn’t replace horses, photographs didn’t replace paintings, books didn’t replace orality. In fact, a technology doesn’t do much by itself. But social contexts recontextualize tools. If we take technology to be the set of both tools and the knowledge surrounding them, technological change is mostly a social process: tool repertoires and the corresponding knowledge shift through social contexts, not through their mere existence. Gutenberg’s Bible was a “game-changer” for social, as well as technical reasons.

And I do insist on orality. Journalists and other “communication is transmission of information” followers of Shannon and Weaver tend to portray writing as the annihilation of orality. How long after the invention of writing was the Homeric oral tradition transferred to the written medium? Didn’t Albert Lord show the vitality of the epic well into the 20th Century? Isn’t a lot of our knowledge constructed through oral means? Is Internet writing that far, conceptually, from orality? Is literacy a simple on/off switch?

Not only did I maintain an interest in orality through the most book-focused moments of my life but I probably care more about orality now than I ever did. So I simply cannot accept the idea that books have simply replaced the human voice. It doesn’t add up.

My guess is that books won’t simply disappear either. There should still be a use for “coffee table books” and books as gifts or collectables. Records haven’t disappeared completely and CDs still have a few more days in dedicated stores. But, in general, we’re moving away from the “support medium” for “content” and more toward actual knowledge management in socially significant contexts.

In these contexts, books often make little sense. Reading books is passive, while these contexts call for (hyper- and inter-)activity.

Case in point (and the reason I felt compelled to post that Facebook/Twitter quip)…
I hear about a “just released” French book during a Swiss podcast. Of course, it’s taken a while to write and publish. So, by the time I heard about it, there was no way to participate in the construction of knowledge which led to it. It was already “set in stone” as an “opus.”

I looked for it at several bookstores. One bookstore could eventually order it. It’d take weeks and be quite costly (for something I’m mostly curious about, not something I depend on for anything really important).

I eventually found it in the catalogue at BANQ and reserved it. It wasn’t on the shelves yet, so I had to wait until it was. That took from November to February. I eventually got a message saying I had a couple of days to pick up my reservation, but I wasn’t able to go. So it went back on the “just released” shelves. I had the full call number, but books in that section aren’t shelved in call-number sequence. I spent several minutes looking back and forth between eight shelves, only to find out that there were four more shelves in the “humanities and social sciences” section. The book I was looking for was on one of those shelves.

So, I was able to borrow it.

Phew!

In the metro, I browse through it. Given my academic reflex, I look for the back matter first. No bibliography, no index, a ToC with rather obscure titles (at random: «Taylor toujours à l’œuvre»/”Taylor still at work,” which I’m assuming to be a reference to continuing taylorism). The book is written by two separate dudes but there’s no clear indication of who wrote what. There’s a preface (by somebody else) but no “acknowledgments” section, so it’s hard to see who’s in their network. Footnotes include full URLs to rather broad sites as well as “discussion with <an author’s name>.” The back cover starts off with references to French popular culture (including something about “RER D,” which would be difficult to search). Information about both authors fits in less than 40 words (including a list of publication titles).

The book itself is in fairly large print and weighs almost a pound (422 g, to be exact) for 327 pages (including front and back matter). Each page seems to hold about 50 characters per line and about 30 lines. So, about half a million characters, or 3,500 tweets (including spaces). At 5+1 characters per word, about 80,000 words (for comparison, I have a 7,500-word blogpost written in an afternoon). At about 250 words per minute, about five hours of reading. This book is listed at 19€ (about 27 CAD).
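Spelled out, the back-of-the-envelope arithmetic goes roughly like this (all figures rounded):

50 characters/line × 30 lines/page × 327 pages ≈ 490,000 characters
490,000 characters ÷ 140 characters/tweet ≈ 3,500 tweets
490,000 characters ÷ 6 characters/word ≈ 80,000 words
80,000 words ÷ 250 words/minute ≈ 320 minutes, or a bit over five hours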
There’s no direct way to do any “postprocessing” with the text: no speech synthesis for the visually impaired, no concordance analysis, no machine translation; even a simple search for occurrences of “Sarkozy” is impossible. Not to mention sharing quotes with students or annotating in an easy-to-retrieve fashion (à la Diigo).

Like any book, it’s impossible to read in the dark, and I actually have a hard time finding a spot where I can read with appropriate lighting.

Flipping through the book, I get the impression that there are some valuable things in it to spark discussions, but there’s also a whole lot of redundancy with frequent discussions on the topic (the Future of Journalism, or #FoJ, as a matter of fact). My guesstimate is that, out of five hours of reading, I’d get at most 20 pieces of insight that I’d have exactly no way to find elsewhere. Comparable books I recently listened to as audiobooks had much less. In other words, I’d have at most 20 tweets’ worth of things to say from the book. Almost a 200:1 compression.
Direct discussion with the authors could produce much more insight. The radio interviews with these authors already contained a few insight hints, which predisposed me to look for more. But, so many months later, without the streams of thought which animated me at the time, I end up with something much less valuable than what I wanted to get, back in November.

Bottomline: Books aren’t necessarily “broken” as a tool. They just don’t fit my life, anymore.

Why I Need an iPad

I’m one of those who feel the iPad is the right tool for the job.

This is mostly meant as a reply to this blogthread. But it’s also more generally about my personal reaction to Apple’s iPad announcement.

Some background.

I’m an ethnographer and a teacher. I read a fair deal, write a lot of notes, and work in a variety of contexts. These days, I tend to spend a good amount of time in cafés and other public places where I like to work without being too isolated. I also commute using public transit, listen to lots of podcasts, and create my own. I’m also very aural.

I’ve used a number of PDAs, over the years, from a Newton MessagePad 130 (1997) to a variety of PalmOS devices (until 2008). In fact, some people readily associated me with PDA use.

As soon as I learnt about the iPod touch, I needed one. As soon as I heard about the SafariPad, I wanted one. I’ve been an intense ‘touch user since the iPhone OS 2.0 release and I’m a happy camper.

(A major reason I never bought an iPhone, apart from price, is that it requires a contract.)

In my experience, the ‘touch is the most appropriate device for all sorts of activities which are either part of another activity (reading during a commute) or are simply too short in duration to constitute an actual “computer session.” You don’t “sit down to work at your ‘touch” the way you might sit in front of a laptop or desktop screen. This works great for “looking up stuff” or “checking email.” It also makes a lot of sense during commutes in crowded buses or metros.

In those cases, the iPod touch is almost ideal. Ubiquitous access to Internet would be nice, but that’s not a deal-breaker. Alternative text-input methods would help in some cases, but I do end up being about as fast on my ‘touch as I was with Graffiti on PalmOS.

For other tasks, I have a Mac mini. Sure, it’s limited. But it does the job. In fact, I have no intention of switching to another desktop and I even have an eMachines box collecting dust (it’s too noisy to make a good server).

What I miss, though, is a laptop. I used an iBook G3 for several years and loved it. For a little while after that, I was able to share a MacBook with somebody else and it was a wonderful experience. I even got to play with the OLPC XO for a few weeks. That one was not so pleasant an experience, but it did give me a taste for netbooks. And it made me think about other types of iPhone-like devices, especially in educational contexts. (As I mentioned, I’m a teacher.)

I’ve been laptop-less for a while, now. And though my ‘touch replaces it in many contexts, there are still times when I’d really need a laptop. And these have to do with what I might call “mobile sessions.”

For instance: liveblogging a conference or meeting. I’ve used my ‘touch for this very purpose on a good number of occasions. But it gets rather uncomfortable, after a while, and it’s not very fast. A laptop is better for this, with a keyboard and a larger form factor. But the iPad will be even better because of lower risks of RSI. A related example: just imagine TweetDeck on iPad.

Possibly my favourite example of a context in which the iPad will be ideal: presentations. Even before learning about the prospect of getting iWork on a tablet, presentations were a context in which I really missed a laptop.

Sure, in most cases, these days, there’s a computer (usually a desktop running XP) hooked to a projector. You just need to download your presentation file from Slideshare, show it from Prezi, or transfer it through USB. No biggie.

But it’s not the extra steps which change everything. It’s the uncertainty. Even if it’s often unfounded, I usually get worried that something might just not work along the way. The slides might not show up the same way you see them because something is missing on that computer, or that computer is simply using a different version of the presentation software. In fact, that software is typically Microsoft PowerPoint which, while convenient, fits much less in my workflow than Apple Keynote does.

The other big thing about presentations is the “presenter mode,” allowing you to get more content than (or different content from) what the audience sees. In most contexts where I’ve used someone else’s computer to do a presentation, the projector was mirroring the computer’s screen, not using it as a different space. PowerPoint has this convenient “presenter view” but very rarely did I see it as an available option on “the computer in the room.” I wish I could use my ‘touch to drive presentations, which I could do if I installed software on that “computer in the room.” But it’s not something that is likely to happen, in most cases.

A MacBook solves all of these problems, and it’s an obvious use for laptops. But how, then, is the iPad better? Basically because of the interface. Switching slides on a laptop isn’t hard, but it’s more awkward than we realize. Even before watching the demo of Keynote on the iPad, I could simply imagine the actual pleasure of flipping through slides using a touch interface. The fit is “natural.”

I sincerely think that Keynote on the iPad will change a number of things, for me. Including the way I teach.

Then, there’s reading.

Now, I’m not one of those people who just can’t read on a computer screen. In fact, I even grade assignments directly from the screen. But I must admit that online reading hasn’t been ideal, for me. I’ve read full books as PDF files or dedicated formats on PalmOS, but it wasn’t so much fun, in terms of the reading process. And I’ve used my ‘touch to read things through Stanza or ReadItLater. But it doesn’t work so well for longer reading sessions. Even in terms of holding the ‘touch, it’s not so obvious. And, what’s funny, even a laptop isn’t that ideal, for me, as a reading device. In a sense, this is when the keyboard “gets in the way.”

Sure, I could get a Kindle. I’m not a big fan of dedicated devices and, at least on paper, I find the Kindle a bit limited for my needs. Especially in terms of sources. I’d like to be able to use documents in a variety of formats and put them in a reading list, for extended reading sessions. No, not “curled up in bed.” But maybe lying down in a sofa without external lighting. Given my experience with the ‘touch, the iPad is very likely the ideal device for this.

Then, there’s the overall “multi-touch device” thing. People have already been quite creative with the small touchscreen on iPhones and ‘touches, and I can just imagine what may be done with a larger screen. Lots has been said about differences in “screen real estate” between laptop or desktop screens. We all know it can make a big difference in terms of what you can display at the same time. In some cases, a second screen isn’t even a luxury, for instance when you code and display a page at the same time (LaTeX, CSS…). Certainly, the same qualitative difference applies to multitouch devices. Probably even more so, since the display is also used for input. What Han found missing in the iPhone’s multitouch was the ability to use both hands. With the iPad, Han’s vision is finding its space.

Oh, sure, the iPad is very restricted. For instance, it’s easy to imagine how much more useful it’d be if it did support multitasking with third-party apps. And a front-facing camera is something I was expecting in the first iPhone. It would just make so much sense that a friend seems very disappointed by this lack of videoconferencing potential. But we’re probably talking about predetermined expectations, here. We’re comparing the iPad with something we had in mind.

Then, there’s the issue of the competition. Tablets have been released and some multitouch tablets have recently been announced. What makes the iPad better than these? Well, we could all get into the same OS wars as have been happening with laptops and desktops. In my case, the investment in applications, files, and expertise that I have made in the Mac ecosystem made my XP years relatively uncomfortable and made me appreciate returning to the Mac. My iPod touch fits right in that context. Oh, sure, I could use it with a Windows machine, which is in fact what I did for the first several months. But the relationship between the iPhone OS and Mac OS X is such that using devices in those two systems is much more efficient, in terms of my own workflow, than what I could get while using XP and iPhone OS. There are some technical dimensions to this, such as the integration between iCal and the iPhone OS Calendar, or even the filesystem. But I’m actually thinking more about the cognitive dimensions of recognizing some of the same interface elements. “Look and feel” isn’t just about shiny and “purty.” It’s about interactions between a human brain, a complex sensorimotor apparatus, and a machine. Things go more quickly when you don’t have to think too much about where some tools are, as you’re working.

So my reasons for wanting an iPad aren’t about being dazzled by a revolutionary device. They are about the right tool for the job.

Installing BuddyPress on a Webhost

Installing BuddyPress on a FatCow-hosted site. With ramblings.

[Jump here for more technical details.]

A few months ago, I installed BuddyPress on my Mac to try it out. It was a bit of an involved process, so I documented it:

WordPress MU, BuddyPress, and bbPress on Local Machine « Disparate.

More recently, I decided to get a webhost. Both to run some tests and, eventually, to build something useful. BuddyPress seems like a good way to go at it, especially since it’s improved a lot, in the past several months.

In fact, the installation process is much simpler, now, and I ran into some difficulties because I was following my own instructions (though adapting the process to my webhost). So a new blogpost may be in order. My previous one was very (possibly too) detailed. This one is much simpler, technically.

One thing to make clear is that BuddyPress is a set of plugins meant for WordPress µ (“WordPress MU,” “WPMU,” “WPµ”), the multi-user version of the WordPress blogging platform. BP is meant as a way to make WPµ more “social,” with such useful features as flexible profiles, user-to-user relationships, and forums (through bbPress, yet another one of those independent projects based on WordPress).

While BuddyPress depends on WPµ and does follow a blogging logic, I’m thinking about it as a social platform. Once I build it into something practical, I’ll probably use the blogging features but, in a way, it’s more of a tool to engage people in online social activities. BuddyPress probably doesn’t work as a way to “build a community” from scratch. But I think it can be quite useful as a way to engage members of an existing community, even if this engagement follows a blogger’s version of a Pareto distribution (which, hopefully, is dissociated from elitist principles).

But I digress, of course. This blogpost is more about the practical issue of adding a BuddyPress installation to a webhost.

Webhosts have come a long way, recently. Especially in terms of shared webhosting focused on LAMP (or PHP/MySQL, more specifically) for blogs and content-management. I don’t have any data on this, but it seems to me that a lot of people these days are relying on third-party webhosts instead of their own servers when they want to build on their own blogging and content-management platforms. Of course, there are a lot more people who prefer to use preexisting blog and content-management systems. For instance, it seems that there are more bloggers on WordPress.com than on other WordPress installations. And WP.com blogs probably represent a small number of people in comparison to the number of people who visit those blogs. So, in a way, those who run their own WordPress installations are a minority within the group of active WordPress bloggers which, itself, is a minority among blog visitors. Again, let’s hope this “power distribution” is not a basis for elite theory!

Yes, another digression. I did tell you to skip, if you wanted the technical details!

I became part of the “self-hosted WordPress” community through a project on which I started work during the summer. It’s a website for an academic organization and I’m acting as the organization’s “Web Guru” (no, I didn’t choose the title). The site was already based on WordPress but I was rebuilding much of it in collaboration with the then-current “Digital Content Editor.” Through this project, I got to learn a lot about WordPress, themes, PHP, CSS, etc. And it was my first experience using a cPanel- (and Fantastico-)enabled webhost (BlueHost, at the time). It’s also how I decided to install WordPress on my local machine and did some amount of work from that machine.

But the local installation wasn’t an ideal solution for two reasons: a) I had to be in front of that local machine to work on this project; and b) it was much harder to show the results to the person with whom I was collaborating.

So, in the Fall, I decided to get my own staging server. After a few quick searches, I decided on HostGator, partly because it was available on a monthly basis. Since this staging server was meant as a temporary solution, HG was close to ideal. It was easy to set up as a PayPal “subscription,” wasn’t that expensive (9$/month), had adequate support, and included everything that I needed at that point to install a current version of WordPress and play with theme files (after importing content from the original site). I’m really glad I made that decision because it made a number of things easier, including working from different computers and sending links to get feedback.

While monthly HostGator fees were reasonable, it was still a more expensive proposition than what I had in mind for a longer-term solution. So, recently, a few weeks after releasing the new version of the organization’s website, I decided to cancel my HostGator subscription. A decision I made without any regret or bad feeling. HostGator was good to me. It’s just that I didn’t have any reason to keep that account or to do anything major with the domain name I was using on HG.

Though only a few weeks had elapsed since I canceled that account, I didn’t immediately set out to transition to a new webhost; I didn’t go straight from HostGator to another host.

But having my own webhost still remained at the back of my mind as something which might be useful. For instance, while not really making a staging server necessary, a new phase in the academic website project brought up a sandboxing idea. Also, I went to a “WordPress Montreal” meeting and got to think about further WordPress development/deployment, including using BuddyPress for my own needs (both as my own project and as a way to build my own knowledge of the platform) instead of it being part of an organization’s project. I was also thinking about other interesting platforms which necessitate a webhost.

(More on these other platforms at a later point in time. Bottom line is, I’m happy with the prospects.)

So I wanted a new webhost. I set out to do some comparison shopping, as I’m wont to do. In my (allegedly limited) experience, finding the ideal webhost is particularly difficult. For one thing, search results are cluttered with a variety of not-so-useful things such as rants, advertising, and limited comparisons. And it’s actually not that easy to give a new webhost a try. For another, these hosting companies don’t necessarily have the most liberal refund policies you could imagine. And switching a domain name between different hosts and registrars is a complicated process through which a name may remain “hostage.” Had I realized what was involved, I might have used a domain name to which I have no attachment, or actually eschewed the whole domain transition and just tried the webhost without a dedicated domain name.

Doh!
Live and learn. I sure do. Loving almost every minute of it.

At any rate, I had a relatively hard time finding my webhost.

I really didn’t need “bells and whistles.” For instance, all the AdSense, shopping cart, and other business-oriented features which seem to be publicized by most webhosting companies hold no interest for me.

I didn’t even care so much about the absolute degree of reliability or speed. What I plan to do with this host is fairly basic stuff. The core idea is to use my own host to bypass some limitations. For instance, WordPress.com doesn’t allow for plugins, yet most of the WordPress fun has to do with plugins.

I did want an “unlimited” host, as much as possible. Not because I expect to have huge resource needs, but because I just didn’t want to have to monitor bandwidth.

I thought that my needs would be basic enough that any cPanel-enabled webhost would fit. As much as I could see, I needed FTP access to something which had PHP 5 and MySQL 5. I expected to install things myself, without use of the webhost’s scripts but I also thought the host would have some useful scripts. Although I had already registered the domain I wanted to use (through Name.com), I thought it might be useful to have a free domain in the webhosting package. Not that domain names are expensive, it’s more of a matter of convenience in terms of payment or setup.

I ended up with FatCow. But, honestly, I’d probably go with a different host if I were to start over (which I may do with another project).

I paid 88$ for two years of “unlimited” hosting, which is quite reasonable. And, on paper, FatCow has everything I need (and a bunch of things I don’t need). The missing parts aren’t anything major but have to do with minor annoyances. In other words, no real deal-breaker, here. But there are a few things I wish I had realized before I committed to FatCow with a domain name I actually want to use.

Something which was almost a deal-breaker for me is the fact that FatCow requires payment for any additional subdomain. And these aren’t cheap: the minimum is 5$/month for five subdomains, up to 25$/month for unlimited subdomains! Even at a “regular” price of 88$/year for the basic webhosting plan, the “unlimited subdomains” feature (included in some webhosting plans elsewhere) is more than three times more expensive than the core plan.

As I don’t absolutely need extra subdomains, this is mostly a minor irritant. But it’s one reason I’ll probably be using another webhost for other projects.

Other issues with FatCow are probably not enough to motivate a switch.

For instance, the PHP version installed on FatCow (5.2.1) is a few minor releases behind the one needed by some interesting web applications. No biggie, especially if PHP is updated in a relatively reasonable timeframe. But it still makes for a slight frustration.

The MySQL version seems recent enough, but FatCow uses non-standard tools to manage it, which makes for some confusion. Attempting to create some MySQL databases with obvious names (say “wordpress”) fails because the database allegedly exists (even though it doesn’t show up in the MySQL administration). In the same vein, the URL of the MySQL server is <username>.fatcowmysql.com instead of localhost, which most installers seem to expect. Easy to handle once you realize it, but it makes for some confusion.

In terms of Fantastico-like simplified installation of webapps, FatCow uses InstallCentral, which looks like it might be its own Fantastico replacement. InstallCentral is decent enough as an installation tool and FatCow does provide for some of the most popular blog and CMS platforms. But, in some cases, the application version installed by FatCow is old enough (2005!) that it requires multiple upgrades to get to a current version. Compared to other installation tools, FatCow’s InstallCentral doesn’t seem really efficient at keeping track of installed and released versions.

Something which is partly a neat feature and partly a potential issue is the way FatCow handles Apache-related security. This isn’t something which is so clear to me, so I might be wrong.

Accounts on both BlueHost and HostGator include a public_html directory where all sorts of things go, especially if they’re related to publicly-accessible content. This directory serves as the website’s root, so one expects content to be available there. The “index.html” or “index.php” file in this directory serves as the website’s frontpage. It’s fairly obvious, but it does require that one would understand a few things about webservers. FatCow doesn’t seem to create a public_html directory in a user’s server space. Or, more accurately, it seems that the root directory (aka ‘/’) is in fact public_html. In this sense, a user doesn’t have to think about which directory to use to share things on the Web. But it also means that some higher-level directories aren’t available. I’ve already run into some issues with this and I’ll probably be looking for a workaround. I’m assuming there’s one. But it’s sometimes easier to use generally-applicable advice than to find a custom solution.

Further, in terms of access control… It seems that webapps typically make use of diverse directories and .htaccess files to manage some forms of access controls. Unix-style file permissions are also involved but the kind of access needed for a web app is somewhat different from the “User/Group/All” of Unix filesystems. AFAICT, FatCow does support those .htaccess files. But it has its own tools for building them. That can be a neat feature, as it makes it easier, for instance, to password-protect some directories. But it could also be the source of some confusion.

There are other issues I have with FatCow, but it’s probably enough for now.

So… On to the installation process… 😉

It only takes a few minutes and is rather straightforward. This is the most verbose version of that process you could imagine…

Surprised? 😎

Disclaimer: I’m mostly documenting how I did it and there are some things about which I’m unclear. So it may not work for you. If it doesn’t, I may be able to help but I provide no guarantee that I will. I’m an anthropologist, not a Web development expert.

As always, YMMV.

A few instructions here are specific to FatCow, but the general process is probably valid on other hosts.

I’m presenting things in a sequence which should make sense. I used a slightly different order myself, but I think this one should still work. (If it doesn’t, drop me a comment!)

In these instructions, straight quotes (“”) are used to isolate elements from the rest of the text. They shouldn’t be typed or pasted.

I use “example.com” to refer to the domain on which the installation is done. In my case, it’s the domain name I transferred to FatCow from another registrar, but it could probably be done without a dedicated domain (in which case it would be “<username>.fatcow.com,” where “<username>” is your FatCow username).

I started with creating a MySQL database for WordPress MU. FatCow does have phpMyAdmin but the default tool in the cPanel is labeled “Manage MySQL.” It’s slightly easier to use for creating new databases than phpMyAdmin because it creates the database and initial user (with confirmed password) in a single, easy-to-understand dialog box.

So I created that new database, user, and password, noting down this information. Since that password appears in clear text at some point and can easily be changed through the same interface, I used one which was easy to remember but wasn’t one I use elsewhere.
Then, I downloaded the following files to my local machine in order to upload them to my FatCow server space. The upload can be done through either FTP or FatCow’s FileManager. I tend to prefer FTP (via Cyberduck on the Mac or FileZilla on PC). But the FileManager does allow for easy uploads.
(Wish it could be more direct, using the HTTP links directly instead of downloading to upload. But I haven’t found a way to do it through either FTP or the FileManager.)
At any rate, here are the four files I transferred to my FatCow space, using .zip when there’s a choice (the .tar.gz “tarball” versions also work but require a couple of extra steps).
  1. WordPress MU (wordpress-mu-2.9.1.1.zip, in my case)
  2. Buddymatic (buddymatic.0.9.6.3.1.zip, in my case)
  3. EarlyMorning (only one version, it seems)
  4. EarlyMorning-BP (only one version, it seems)

Only the WordPress MU archive is needed to install BuddyPress. The last three files are needed for EarlyMorning, a BuddyPress theme that I found particularly neat. It’s perfectly possible to install BuddyPress without this specific theme. (Although, doing so, you need to install a BuddyPress-compatible theme, if only by moving some folders to make the default theme available, as I explained in point 15 in that previous tutorial.) Buddymatic itself is a theme framework which includes some child themes, so you don’t need to install EarlyMorning. But installing it is easy enough that I’m adding instructions related to that theme.

These files can be uploaded anywhere in my FatCow space. I uploaded them to a kind of test/upload directory, just to make it clear, for me.

A major FatCow idiosyncrasy is its FileManager (actually called “FileManager Beta” in the documentation but showing up as “FileManager” in the cPanel). From my experience with both BlueHost and HostGator (two well-known webhosting companies), I can say that FC’s FileManager is quite limited. One thing it doesn’t do is uncompress archives. So I have to resort to the “Archive Gateway,” which is surprisingly slow and cumbersome.

At any rate, I used that Archive Gateway to uncompress the four files: WordPress µ first (in the root directory, or “/”), then both Buddymatic and EarlyMorning in “/wordpress-mu/wp-content/themes” (you can choose the output directory for zip and tar files), and finally EarlyMorning-BP (anywhere; individual files are moved later). To uncompress each file, select it in the dropdown menu (it can be located in any subdirectory; Archive Gateway looks everywhere), add the output directory in the appropriate field in the case of Buddymatic or EarlyMorning, and press “Extract/Uncompress.” Wait for a message (in green) at the top of the window saying that the file has been uncompressed successfully.

Then, in the FileManager, the contents of the EarlyMorning-BP directory have to be moved to “/wordpress-mu/wp-content/themes/earlymorning”. (Thought they could be uncompressed there directly, but it created an extra folder.) To move those files in the FileManager, I browse to that earlymorning-bp directory, click on the checkbox to select all, click on the “Move” button (fourth from right, marked with a blue folder), and add the output path: /wordpress-mu/wp-content/themes/earlymorning

These files are tweaks to make the EarlyMorning theme work with BuddyPress.
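If everything landed in the right place, the relevant part of the WPµ directory tree should look roughly like this (only the folders involved in this install are shown):

/wordpress-mu/
    wp-content/
        themes/
            buddymatic/      (the parent theme framework)
            earlymorning/    (the child theme, now including the EarlyMorning-BP files)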

Then, I had to change two files, through the FileManager (it could also be done with an FTP client).

One change is to EarlyMorning’s style.css:

/wordpress-mu/wp-content/themes/earlymorning/style.css

There, “Template: thematic” has to be changed to “Template: buddymatic” (so, “the” should be changed to “buddy”).

That change is needed because the EarlyMorning theme is a child theme of the “Thematic” WordPress parent theme. Buddymatic is a BuddyPress-savvy version of Thematic and this changes the child-parent relation from Thematic to Buddymatic.
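For reference, that “Template:” line lives in the comment header at the top of style.css. The actual EarlyMorning header has more fields than shown here and they may be worded differently; this sketch only shows where the change goes:

/*
Theme Name: EarlyMorning
Template: buddymatic
(other header fields left untouched)
*/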

The other change is in the Buddymatic “extensions”:

/wordpress-mu/wp-content/themes/buddymatic/library/extensions/buddypress_extensions.php

There, on line 39, “$bp->root_domain” should be changed to “bp_root_domain()”.

This change is needed because of something I’d consider a bug but that a commenter on another blog was kind enough to troubleshoot. Without this modification, the login button in BuddyPress wasn’t working because it was going to the website’s root (example.com/wp-login.php) instead of the WPµ installation (example.com/wordpress-mu/wp-login.php). I was quite happy to find this workaround but I’m not completely clear on the reason it works.

Then, something I did which might not be needed is to rename the “wordpress-mu” directory. Without that change, the BuddyPress installation would sit at “example.com/wordpress-mu,” which seems a bit cryptic for users. In my mind, “example.com/<name>,” where “<name>” is something meaningful like “social” or “community” works well enough for my needs. Because FatCow charges for subdomains, the “<name>.example.com” option would be costly.

(Of course, WPµ and BuddyPress could be installed in the site’s root and the frontpage for “example.com” could be the BuddyPress frontpage. But since I think of BuddyPress as an add-on to a more complete site, it seems better to have it as a level lower in the site’s hierarchy.)

With all of this done, the actual WPµ installation process can begin.

The first thing is to browse to that directory in which WPµ resides, either “example.com/wordpress-mu” or “example.com/<name>” with the “<name>” you chose. You’re then presented with the WordPress µ Installation screen.

Since FatCow charges for subdomains, it’s important to choose the following option: “Sub-directories (like example.com/blog1).” It’s actually by selecting the other option that I realized that FatCow restricted subdomains.

The Database Name, username and password are the ones you created initially with Manage MySQL. If you forgot that password, you can actually change it with that same tool.

An important FatCow-specific point, here, is that “Database Host” should be “<username>.fatcowmysql.com” (where “<username>” is your FatCow username). In my experience, other webhosts use “localhost” and WPµ defaults to that.
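If I understand the process correctly, these values end up as the database constants in the wp-config.php file that the installer writes out. A rough sketch, with placeholder values (your actual database name, user, password, and FatCow username will differ):

define('DB_NAME', 'mydatabase');                  // the database created through “Manage MySQL”
define('DB_USER', 'mydbuser');                    // the user created along with it
define('DB_PASSWORD', 'that-throwaway-password'); // easy to change later through the same tool
define('DB_HOST', 'myusername.fatcowmysql.com');  // FatCow-specific; most hosts expect 'localhost' here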

You’re asked to give a name to your blog. In a way, though, if you think of BuddyPress as more of a platform than a blogging system, that name should be rather general. As you’re installing “WordPress Multi-User,” you’ll be able to create many blogs with more specific names, if you want. But the name you’re entering here is for BuddyPress as a whole. As with <name> in “example.com/<name>” (instead of “example.com/wordpress-mu”), it’s a matter of personal opinion.

Something I noticed with the EarlyMorning theme is that it’s a good idea to keep the main blog’s name relatively short. I used thirteen characters and it seemed to fit quite well.

Once you’re done filling in this page, WPµ is installed in a flash. You’re then presented with some information about your installation. It’s probably a good idea to note down some of that information, including the full paths to your installation and the administrator’s password.

But the first thing you should do, as soon as you log in with “admin” as username and the password provided, is probably to change that administrator password. (In fact, a frequent piece of advice in the WordPress community is to create a new administrator account with a username other than “admin” and to delete the “admin” account. Given some security issues with WordPress in the past, it seems like good advice. But I won’t describe it here. I did do it in my installation and it’s quite easy to do in WPµ.)

Then, you should probably enable plugins here:

example.com/<name>/wp-admin/wpmu-options.php#menu

(From what I understand, it might be possible to install BuddyPress without enabling plugins, since you’re logged in as the administrator, but it still makes sense to enable them and it happens to be what I did.)

You can also change a few other options, but these can be set at another point.

One option which is probably useful is this one:

Allow new registrations:
  Disabled
  Enabled. Blogs and user accounts can be created.
  Only user account can be created.

Obviously, it’s not necessary. But in the interest of opening up the BuddyPress installation to the wider world without worrying too much about a proliferation of blogs, allowing user accounts (but not new blogs) might make sense. You may end up with some fake user accounts, but that shouldn’t be a difficult problem to solve.

Now comes the installation of the BuddyPress plugin itself. You can do so by going here:

example.com/<name>/wp-admin/plugin-install.php

And do a search for “BuddyPress” as a term. The plugin you want was authored by “The BuddyPress Community.” (In my case, version 1.1.3.) Click the “Install” link to bring up the installation dialog, then click “Install Now” to actually install the plugin.

Once the install is done, click the “Activate” link to complete the basic BuddyPress installation.

You now have a working installation of BuddyPress but the BuddyPress-savvy EarlyMorning isn’t enabled. So you need to go to “example.com/<name>/wp-admin/wpmu-themes.php” to enable both Buddymatic and EarlyMorning. You should then go to “example.com/<name>/wp-admin/themes.php” to activate the EarlyMorning theme.

Something which tripped me up because it’s now much easier than before is that forums (provided through bbPress) are now, literally, a one-click install. If you go here:

example.com/<name>/wp-admin/admin.php?page=bb-forums-setup

You can set up a new bbPress install (“Set up a new bbPress installation”) and everything will work wonderfully in terms of having forums fully integrated in BuddyPress. It’s so seamless that I wasn’t completely sure it had worked.

Besides this, I’d advise that you set up a few widgets for the BuddyPress frontpage. You do so through an easy-to-use drag-and-drop interface here:

example.com/<name>/wp-admin/widgets.php

I especially advise you to add the Twitter RSS widget because it seems to me to fit right in. If I’m not mistaken, the EarlyMorning theme contains specific elements to make this widget look good.

After that, you can just have fun with your new BuddyPress installation. The first thing I did was to register a new user. To do so, I logged out of my admin account and clicked on the Sign Up button. Since I “allow new registrations,” it’s a very simple process. In fact, this is one place where I think that BuddyPress shines. Something I didn’t explain is that you can add a series of fields for that registration and the user profile which goes with it.

The whole process really shouldn’t take very long. In fact, the longest part probably has to do with waiting for the Archive Gateway.

The rest is “merely” to get people involved in your BuddyPress installation. It can happen relatively easily, if you already have a group of people trying to do things together online. But it can be much more complicated than any software installation process… 😉

Teaching Models: A Response on “Teaching Naked”

A Response to Pamthropologist at Teaching Anthropology

Teaching Anthropology: Teaching Naked: Another one of those nothing new here movements.

[Had to split my response into several comments because of Blogger’s character limit. Thought I might as well post it here.]

Thanks for the ping. No problem about the way you do it. In fact, feel free to use my name. I use “Informal Ethnographer” accounts for social media stuff having to do with ethnographic disciplines, but this is more about pedagogy.

The main thing I noticed about this piece is that the author transforms an interesting and potentially insightful story about problems facing a large number of academic institutions “going forward” into one of those sterile debates about the causal relationships between technology and learning.

Apart from all those things we’ve discussed about teaching method (including the fact that I still use the boring PPT-lecture on occasion), there’s a lot of room for discussion about the “educational industry” not getting a hint from the recording and journalism industries. Because we’re academics, it’s great to deconstruct the technological determinism embedded in many of these discussions. But there’s also something rather pressing in terms of social change: the World in which we live is significantly different from the one in which we were born, when it comes to information. It relates to “information technology” but it goes way beyond tools.

And this is where I talk about surfing the wave instead of fighting it (or building windmills instead of shelters).

As I said elsewhere, I’ve only been teaching for ten years. When I started, in Fall 1999 at Indiana University Bloomington, it was both a baptism by fire and a culture shock. Many teachers complain about a “sense of entitlement” they get from their students, or about the consumer-based approach in academic institutions. There are a number of discussions about average class size or student-to-teacher ratios. Some talk about a so-called “me generation.” Others moan about the fact that students bring laptops to class or that teachers are forced to use tools that they don’t want to use.

These are really not recent problems. However, they are different problems from the ones for which I was prepared.

I’d still say that they affect some institutions more than others (typically: prestigious universities in the United States). But they’re spreading throughout higher education.

Bowen perceives a specific problem: campus-based universities face competition from inexpensive and even free material online. As a dean, he wants to focus on the added value of campus experience, with a focus on the classroom as a context for discussion. It seems that he was hired precisely as an agent of change, just like some “mercurial CEOs” are hired when a corporation is in trouble.

The plan is relatively creative. Not so much in the restrictions on PPT use, but on the overall approach to differentiate his institution. It’s a marketing ploy, not a PR one.

As for the specifics of people’s concepts of “lecturing”… It seems that the mainstream notion about lecture is for a linear presentation with little or no interaction possible. Other teaching methods may involve some “lecturing,” but it seems that the core notion people are discussing is really this soliloquy mode of the teacher exposing ideas without input from the audience. One way to put it is that it’s a genre of performance, like a “stand-up” or an opera.

As a subgenre, “PowerPoint lectures” may deserve special consideration. As we all know, it’s quite possible to use PPT in ways which are creative, engaging, fun, deep, etc. But there are many keys (http://www.jstor.org/pss/674535) to the “PowerPoint lecture” frame. One is the use of some kind of “visual aid.” Another is the use of different slides as key timeposts in the performance. Or we could think about the fact that control over the actual PPT file strengthens the role differentiation between “lecturer” and “audience.” Not to mention the fact that it’s quite difficult to use PPT slides when everyone is in a circle.

So, yes, I’m giving some credence to the notion that PPT is a significant part of the lecturing model people are discussing ad nauseam.

Many of these discussions may relate to the perception that this performance genre (what I would call “straight lecture” or «cours magistral») is dominant at institutions of higher education. The preponderance of a given teaching style across a wide array of institutions, disciplines, and “levels” would merit careful assessment, but the perception is there. “People” (the general population of the United States, the Chronicle’s readership, English-speakers…?) get the impression that what teachers do is mostly stand in front of a class and talk by themselves for significant amounts of time with, maybe, a few questions thrown in at the end. Some people say that such “lectures” may not be incredibly effective. But the notion is still there. You may call this a “straw man,” but it was built a while ago.

Now… There are many ways to go from this whole concept of “straight lecturing.” One is the so-called “switcharound”: you go from lecturing (as a mode) to discussion or to group activities (as distinct modes). The notion, there, is apparently about the fact that “studies have shown” that, at this point in time, English-speaking students in the United States can’t concentrate for more than 20 minutes at a time. Or some such.

I reacted quite strongly when I heard this. For several reasons, including my personal experience of paying attention during class meetings lasting seven hours or more, some of which involved very limited interaction. I also reacted because I found the 50-minute period very constraining. And I always react to the “studies have shown” stance, which I find deeply problematic at an epistemological level. Is this really how we gain knowledge?

But I digress…

Another way to avoid “straight lectures” is to make lecturing itself more interactive. Many people have been doing this for a while. Chances are, it was done by a number of people during the 19th Century, as the “modern classroom” was invented. It can be remarkably effective and it seems to be quite underrated. An important thing to note: it’s significantly different from what people have in mind when they talk about “lecturing.” In fact, in a workshop I attended, the simple fact that a teacher was moving around the classroom as he was teaching was used as an example of an alternative to lecturing. Seems to me that most teachers do something like this. But it’s useful to think about the implications of using such “alternative methods.” Personally, though I frequently think about those methods and I certainly respect those who use them, I don’t tend to focus so much on this. I do use “alternative lecturing methods” like these on occasion but, when I lecture, I tend to adopt the classical approach.

Common alternatives to lecturing, mentioned in the CHE piece, include “seminars, practical sessions, and group discussions.” These all tend to be quite difficult to do in the… “lecture” hall. Even with smaller classes, a large room may be an obstacle. Though it’s not impossible to have, say, group discussions in an auditorium, few of us really end up doing it on a regular basis. I’m “guilty” of that: I have far fewer small-group discussions in rooms in which desks can’t be moved.

As for seminars, it’s clearly my favourite teaching mode/method and I tend to extend the concept too much. Though I tend to be critical of those rigid “factors” like class size, I keep bumping into a limit to seminar size and I run into major hurdles when I try to get more than 25 students working in a seminar mode.

We could also talk about distance education as an alternative to lecturing, though much of it has tended to be lecture-based. Distance education is interesting in many respects. While it’s really not new, it seems like it has been expanding a lot in the fairly recent past. Regardless of the number of people getting degrees through distance learning, it mostly seems that the concept has become much more accepted by the general population (in English-speaking contexts, at least) and some programmes in distance learning seem to be getting more “cred” than ever before. I don’t want to overstate this expansion but it’s interesting to think about the possible connections with social change. Telecommuting, students working full-time, combining studying with childcare, homestudy, rising tuition costs, customer-based approaches to education, the “me generation,” the ease of transmitting complex data online, etc.

Even when distance learners have to watch lectures, distance education can be conceived as an alternative to the “straight lecture.” Practical details such as scheduling aren’t insignificant, but there are more profound implications to the fact that lectures aren’t “delivered in a lecture hall.” To go back to the performance genre, there’s a difference between a drama piece and a movie. Both can be good, but they have very different implications.

My involvement with distance learning has to do with online learning. Last summer, I began teaching sociology to nursing students in Texas. From Montreal. I had been thinking about online teaching for a while and I’ve always had an online component to my courses. But last year was the first time I was able to teach a course without ever meeting those students.

My impression is that the rise of online education was the main thing Bowen had in mind. He clearly seems to think that this rise will only continue and that it may threaten campus-based institutions if they don’t do anything about it. The part which is surprising about his approach is that he actually advocates blended learning. Though we may disagree with Bowen on several points, it’d be difficult to compare him to an ostrich.

All of these approaches and methods have been known for a while. They all have their own advantages and they all help raise different issues. And they’ve been tested rather extensively by generation upon generation of teachers.

The focus, today, seems to be on a new set of approaches. Most of them have direct ties to well-established teaching models like seminars and distance education. So, they’re not really “new.” Yet they combine different things in such a way that they clearly require experimentation. We can hail them as “the future” or dismiss them as “trendy,” but they still afford some consideration as avenues for experimentation.

Many of them can be subsumed under the umbrella term “blended learning.” That term can mean different things to different people and some use it as a kind of buzzword. Analytically, it’s still a useful term.

Nellie Muller Deutsch is among those people who are currently doing PhD research on blended learning. We’ve had a number of discussions through diverse online groups devoted to learning and teaching. It’s possible that my thinking has been influenced by Nellie, but I was already interested in those topics long before interacting with her.

“Blended learning” implies some combination of classroom and online interactions between learners and teachers. The specific degree of “blending” varies a lot between contexts, but the basic concept remains. One might even argue that any educational context is blended, nowadays, since most teachers end up responding to at least “a few emails” (!) every semester. But the extensible concept of the “blended campus” easily goes beyond those direct exchanges.

What does this have to do with lectures? A lot, actually. Especially for those who have in mind a “monolithic” model for lecture-based courses, often forgetting (as many students do!) the role of office hours and other activities outside of the classroom.

Just as it’s possible but difficult to do a seminar in a lecture hall, it’s possible but difficult to do a “straight lecture” in blended learning. Those professors and adjuncts who want to have as little interaction with students as possible may end up complaining about the amount of email they receive. In a sense, they’re “victims” of the move to a blended environment. One of the most convincing ideas I’ve heard in a teaching workshop was about moving email exchanges with individual students to forums, so that everyone can more effectively manage the channels of communication. Remarkably simple and compatible with many teaching styles. And a very reasonable use of online tools.

Bowen was advocating a very specific model for blended learning: students work with required readings on their own (presumably, using coursepacks and textbooks), read/watch/listen to lecture material online, and convene in the classroom to work with the material. His technique for making sure that students don’t “skip class” (which seems important in the United States, for some reason) is to give multiple-choice quizzes. Apart from justifying presence on campus (in the competition with distance learning), Bowen’s main point is about spending as much face-to-face time as possible in discussions. It’s not really an alternative to lectures if there are lectures online, but it’s a clear shift in focus from the “straight lecture” model. Fairly creative and it’s certainly worth some experimentation. But it’s only one among many possible approaches.

At least for the past few years, I’ve been posting material online both after and ahead of class meetings. I did notice a slight decrease in attendance, but that tends to matter very little for me. I also notice that many students tend to be more reluctant to go online to do things for my courses than one would expect from most of the discussions at an abstract level. But it’s still giving me a lot, including in terms of not having to rehash the same material over and over again (and again, ad nauseam).

I wouldn’t really call my approach “blended learning” because, in most of my upper-level courses at least, there’s still fairly little interaction happening online. But I do my part to experiment with diverse methods and approaches.

So…

None of this is meant to be about evaluating different approaches to teaching. I’m really not saying that my approach is better than anybody else’s. But I will say that it’s an appropriate fit with my perspective on learning as well as with my activities outside of the classroom. In other words, it’s not because I’m a geek that I expect anybody else to become a geek. I do, however, ask others to accept me as a geek.

And, Pamthropologist, you provided on my blog some context for several of the comments you’ve been making about lecturing. I certainly respect you and I think I understand what’s going on. In fact, I get the impression that you’re very effective at teaching anthropology and I wish your award-winning blog entry also carried an award for teaching. The one thing I find most useful, in all of this, is that you do discuss those issues. IMHO, the most important thing isn’t to find what the best model is but to discuss learning and teaching in a thoughtful manner so that everyone gets a voice. The fact that one of the most recent comments on your blog comes from a student in the Philippines speaks volumes about your openness.

Social Networks and Microblogging

Event-based microblogging and the social dimensions of online social networks.

Microblogging (Laconica, Twitter, etc.) is still a hot topic. For instance, during the past few episodes of This Week in Tech, comments were made about the preponderance of Twitter as a discussion theme: microblogging is so prominent on that show that some people complain that there’s too much talk about Twitter. Given the centrality of Leo Laporte’s podcast in geek culture (among Anglos, at least), such comments are significant.

The context for the latest comments about TWiT coverage of Twitter had to do with Twitter’s financials: during this financial crisis, Twitter is given funding without even asking for it. While it may seem surprising at first, given the fact that Twitter hasn’t publicized a business plan and doesn’t appear to be profitable at this time, the willingness of investors to fund the service anyway says quite a bit about the importance currently given to microblogging.

Along with social networking, microblogging is even discussed in mainstream media. For instance, Médialogues (a media-criticism programme on Swiss national radio) recently had a segment about both Facebook and Twitter. Just yesterday, Comedy Central’s The Daily Show with Jon Stewart made fun of compulsive twittering and mainstream media coverage of Twitter (original, Canadian access).

Clearly, microblogging is getting some mindshare.

What the future holds for microblogging is clearly uncertain. Anything can happen. My guess is that microblogging will remain important for a while (at least a few years) but that it will transform itself rather radically. Chances are that other platforms will have microblogging features (something Facebook can do with status updates and something Automattic has been trying to do with some WordPress themes). In these troubled times, Montreal startup Identi.ca received some funding to continue developing its open microblogging platform.  Jaiku, bought by Google last year, is going open source, which may be good news for microblogging in general. Twitter itself might maintain its “marketshare” or other players may take over. There’s already a large number of third-party tools and services making use of Twitter, from Mahalo Answers to Remember the Milk, Twistory to TweetDeck.

Together, these all point to the current importance of microblogging and the potential for further development in that sphere. None of this means that microblogging is “The Next Big Thing.” But it’s reasonable to expect that microblogging will continue to grow in use.

(For those who are trying to grok microblogging, Common Craft’s Twitter in Plain English video is among the best-known descriptions of Twitter and it seems like an efficient way to “get the idea.”)

One thing which is rarely mentioned about microblogging is the prominent social structure supporting it. Like “Social Networking Systems” (LinkedIn, Facebook, Ning, MySpace…), microblogging makes it possible for people to “connect” to one another (as contacts/acquaintances/friends). Like blogs, microblogging platforms make it possible to link to somebody else’s material and get notifications for some of these links (a bit like pings and trackbacks). Like blogrolls, microblogging systems allow for lists of “favourite authors.” Unlike Social Networking Systems but similar to blogrolls, microblogging systems allow for asymmetrical relations, unreciprocated links: if I like somebody’s microblogging updates, I can subscribe to those (by “following” that person) and publicly show my appreciation of that person’s work, regardless of whether or not this microblogger likes my own updates.

There’s something strangely powerful there because it taps the power of social networks while avoiding tricky issues of reciprocity, “confidentiality,” and “intimacy.”

From the end user’s perspective, microblogging contacts may be easier to establish than contacts through Facebook or Orkut. From a social science perspective, microblogging links seem to approximate some of the fluidity found in social networks, without adding much complexity in the description of the relationships. Subscribing to someone’s updates gives me the role of “follower” with regards to that person. Conversely, those I follow receive the role of “following” (“followee” would seem logical, given the common “-er”/”-ee” pattern). The following and follower roles are complementary but each is sufficient by itself as a useful social link.

Typically, a microblogging system like Twitter or Identi.ca qualifies two-way connections as “friendship” while one-way connections could be labelled as “fandom” (if Andrew follows Betty’s updates but Betty doesn’t follow Andrew’s, Andrew is perceived as one of Betty’s “fans”). Profiles on microblogging systems are relatively simple and public, allowing for low-involvement online “presence.” As long as updates are kept public, anybody can connect to anybody else without even needing an introduction. In fact, because microblogging systems send notifications to users when they get new followers (through email and/or SMS), subscribing to someone’s updates is often akin to introducing yourself to that person.

Reciprocating is the object of relatively intense social pressure. A microblogger whose follower:following ratio is far from 1:1 may be regarded as either a snob (follower:following much higher than 1:1) or as something of a microblogging failure (follower:following much lower than 1:1). As in any social context, perceived snobbery may be associated with sophistication but it also carries opprobrium. Perry Belcher  made a video about what he calls “Twitter Snobs” and some French bloggers have elaborated on that concept. (Some are now claiming their right to be Twitter Snobs.) Low follower:following ratios can result from breach of etiquette (for instance, ostentatious self-promotion carried beyond the accepted limit) or even non-human status (many microblogging accounts are associated to “bots” producing automated content).
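
To make that structure a bit more concrete, here is a minimal sketch, in Python, of the asymmetric “follow” model and of the follower:following ratio discussed above. The class, the method names, and the demo users are my own illustrative assumptions, not any platform’s actual API.

    # Illustrative sketch only: these names are assumptions, not a platform's API.
    from collections import defaultdict

    class MicroblogGraph:
        def __init__(self):
            self.following = defaultdict(set)   # user -> set of users they follow

        def follow(self, follower, followee):
            self.following[follower].add(followee)

        def relationship(self, a, b):
            """Label the link from a's point of view: 'friend' if reciprocated,
            'fan' if a follows b without being followed back, 'none' otherwise."""
            a_follows_b = b in self.following[a]
            b_follows_a = a in self.following[b]
            if a_follows_b and b_follows_a:
                return "friend"
            return "fan" if a_follows_b else "none"

        def ratio(self, user):
            """The follower:following ratio people informally read as 'snob' or 'failure'."""
            followers = sum(1 for follows in self.following.values() if user in follows)
            following = len(self.following[user])
            return followers / following if following else float("inf")

    g = MicroblogGraph()
    g.follow("Andrew", "Betty")               # one-way: Andrew is one of Betty's "fans"
    print(g.relationship("Andrew", "Betty"))  # -> fan
    g.follow("Betty", "Andrew")               # reciprocated: now "friends"
    print(g.relationship("Andrew", "Betty"))  # -> friend
    print(g.ratio("Betty"))                   # -> 1.0

The point of the sketch is simply that a single directed link is enough to constitute a meaningful social connection; reciprocation is a separate, optional step.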

The result of the pressure for reciprocation is that contacts are reciprocated regardless of personal relations. Some users even set up ways to automatically follow everyone who follows them. Tricky as these methods may be, they sidestep the issue of personal connection. Contrary to Social Networking Systems (and despite the term “friend” used for reciprocated contacts), following someone on a microblogging service implies little in terms of friendship.

One reason I personally find this fascinating is that specifying personal connections has been an important part of the development of social networks online. For instance, long-defunct SixDegrees.com (one of the earliest Social Networking Systems to appear online) required users to specify the precise nature of their relationship to users with whom they were connected. Details escape me but I distinctly remember that acquaintances, colleagues, and friends were distinguished. If I remember correctly, only one such personal connection was allowed for any pair of users and this connection had to be confirmed before the two users were linked through the system. Facebook’s method to account for personal connections is somewhat more sophisticated despite the fact that all contacts are labelled as “friends” regardless of the nature of the connection. The uniform use of the term “friend” has been decried by many public commentators of Facebook (including in the United States where “friend” is often applied to any person with whom one is simply on friendly terms).

In this context, the flexibility with which microblogging contacts are made merits consideration: by allowing unidirectional contacts, microblogging platforms may have solved a tricky social network problem. And while the strength of the connection between two microbloggers is left unacknowledged, there are several methods to assess it (for instance through replies and republished updates).
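
As a rough illustration of that last point, tie strength could be scored from observable interactions such as replies and republished updates. This is only a sketch under my own assumptions about how updates might be represented; the weights are arbitrary.

    # Sketch only: the update fields and the weights are assumptions for illustration.
    def tie_strength(updates, a, b):
        """Estimate the strength of the a<->b connection from replies and
        republished updates exchanged between the two users."""
        score = 0
        for update in updates:
            if update["author"] not in (a, b):
                continue
            other = b if update["author"] == a else a
            if update.get("reply_to") == other:
                score += 2   # a direct reply weighs more
            if update.get("republished_from") == other:
                score += 1   # republishing someone's update signals attention
        return score

    sample = [
        {"author": "a", "reply_to": "b"},
        {"author": "b", "republished_from": "a"},
        {"author": "a"},                          # unrelated update
    ]
    print(tie_strength(sample, "a", "b"))         # -> 3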

Social contacts are the very basis of social media. In this case, microblogging represents a step towards both simplified and complexified social contacts.

Which leads me to the theme which prompted me to start this blogpost: event-based microblogging.

I posted the following blog entry (in French) about event-based microblogging, back in November.

Microblogue d’événement

I haven’t received any direct feedback on it and the topic seems to have had little echo in the social media sphere.

During the last PodMtl meeting on February 18, I tried to throw my event-based microblogging idea in the ring. This generated a rather lengthy discussion between a friend and myself. (Because this friend happens to be relatively high-profile and I don’t want to put words in their mouth, I won’t mention any name.) This friend voiced several objections to my main idea and I got to think about this basic notion a bit further. At the risk of sounding exceedingly opinionated, I must say that my friend’s objections actually strengthened my sense that my “event microblog” idea makes a lot of sense.

The basic idea is quite simple: microblogging instances tied to specific events. There are technical issues in terms of hosting and such but I’m mostly thinking about associating microblogs and events.

What I had in mind during the PodMtl discussion has to do with grouping features, which are often requested by Twitter users (including by Perry Belcher who called out Twitter Snobs). And while I do insist on events as a basis for those instances (like groups), some of the same logic applies to specific interests. However, given the time-sensitivity of microblogging, I still think that events are more significant in this context than interests, however defined.

In the PodMtl discussion, I frequently referred to BarCamp-like events (in part because my friend and interlocutor had participated in a number of such events). The same concept applies to any event, including one which is just unfolding (say, the assassination of Guinea-Bissau’s president or the bombings in Mumbai).

Microblogging users are expected to think about “hashtags,” those textual labels preceded with the ‘#’ symbol which are meant to categorize microblogging updates. But hashtags are problematic on several levels.

  • They require preliminary agreement among multiple microbloggers, a tricky proposition in any social media. “Let’s use #Bissau09. Everybody agrees with that?” It can get ugly and, even if it doesn’t, the process is awkward (especially for new users).
  • Even if agreement has been reached, there might be discrepancies in the way hashtags are typed. “Was it #TwestivalMtl or #TwestivalMontreal, I forgot.”
  • In terms of language economy, it’s unsurprising that the same hashtag would be used for different things. Is “#pcmtl” about Podcamp Montreal, about personal computers in Montreal, about the PCM Transcoding Library…? (A short sketch after this list illustrates this ambiguity, along with the variant-spelling problem.)
  • Hashtags are frequently misunderstood by many microbloggers. Just this week, a tweep of mine (a “peep” on Twitter) asked about them after having been on Twitter for months.
  • While there are multiple ways to track hashtags (including through SMS, in some regions), there is no way to further specify the tracked updates (for instance, by user).
  • The distinction between a hashtag and a keyword is too subtle to be really useful. Twitter Search, for instance, lumps the two together.
  • Hashtags take time to type. Even if microbloggers aren’t necessarily typing frantically, the time taken to type all those hashtags seems counterproductive and may even distract microbloggers.
  • Repetitively typing the same string is a very specific kind of task which seems to go against the microblogging ethos, if not the cognitive processes associated with microblogging.
  • The number of characters in a hashtag decreases the amount of text in every update. When all you have is 140 characters at a time, the thirteen characters in “#TwestivalMtl” constitute almost 10% of your update.
  • If the same hashtag is used by a large number of people, the visual effect can be that this hashtag is actually dominating the microblogging stream. Since there currently isn’t a way to ignore updates containing a certain hashtag, this effect may even discourage people from using a microblogging service.
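
Here is the short sketch mentioned in the list above: a toy hashtag tracker, in Python, showing how variant spellings fail to match while unrelated topics collide on the same tag. The updates and tags are invented for illustration.

    # Toy example: updates and tags are invented; the point is the matching problem.
    import re
    from collections import Counter

    HASHTAG = re.compile(r"#(\w+)")

    updates = [
        "Great session! #TwestivalMtl",
        "On my way to #TwestivalMontreal",
        "New release of the PCM Transcoding Library #pcmtl",
        "Podcamp Montreal starts tomorrow #pcmtl",
    ]

    tags = Counter(tag.lower() for update in updates for tag in HASHTAG.findall(update))
    print(tags)
    # Counter({'pcmtl': 2, 'twestivalmtl': 1, 'twestivalmontreal': 1})
    # Lowercasing handles case, but "#TwestivalMtl" and "#TwestivalMontreal" still
    # count as two different events, while two unrelated topics collide on "#pcmtl".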

There are multiple solutions to these issues, of course. Some of them are surely discussed among developers of microblogging systems. And my notion of event-specific microblogs isn’t geared toward solving these issues. But I do think separate instances make more sense than hashtags, especially in terms of specific events.

My friend’s objections to my event microblogging idea had something to do with visibility. It seems that this friend wants all updates to be visible, regardless of the context. While I don’t disagree with this, I would claim that it would still be useful to “opt out” of certain discussions when people we follow are involved. If I know that Sean is participating in a PHP conference and that most of his updates will be about PHP for a period of time, I would enjoy being able to hide PHP-related updates for a specific period of time. The reason I talk about this specific case is simple: a friend of mine expressed some frustration with the large number of updates made by participants in Podcamp Montreal (myself included). Partly in reaction to this, he stopped following me on Twitter and only resumed following me after Podcamp Montreal had ended. In this case, my friend could have hidden Podcamp Montreal updates and still have received other updates from the same microbloggers.
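
A minimal sketch of that “opt out for a while” behaviour, assuming updates carried some machine-readable event marker (which is precisely what hashtags only approximate). The field names and the event label are my own assumptions.

    # Sketch only: update fields and the event label are assumptions.
    from datetime import datetime

    def visible(update, muted_event, mute_start, mute_end):
        """Hide updates tagged with the muted event during the mute window,
        but keep everything else from the same people."""
        in_window = mute_start <= update["time"] <= mute_end
        return not (in_window and muted_event in update.get("events", []))

    mute_start = datetime(2008, 9, 20)
    mute_end = datetime(2008, 9, 21, 23, 59)
    updates = [
        {"author": "alex", "text": "Session on podcasting ethics",
         "events": ["podcampmtl"], "time": datetime(2008, 9, 20, 14, 0)},
        {"author": "alex", "text": "Great coffee this morning",
         "events": [], "time": datetime(2008, 9, 20, 9, 0)},
    ]
    print([u["text"] for u in updates
           if visible(u, "podcampmtl", mute_start, mute_end)])
    # -> ['Great coffee this morning']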

To a certain extent, event-specific instances are a bit similar to “rooms” in MMORPGs and other forms of real-time, many-to-many, text-based communication such as the nostalgia-inducing Internet Relay Chat. Despite Dave Winer’s strong claim to the contrary (and his attempt at defining microblogging away from IRC), a microblogging instance could, in fact, act as a de facto chatroom when such a structure is needed, taking advantage of the work done on microblogging over the past year (which seems to have advanced more rapidly than work on chatrooms has over the past fifteen years). Instead of setting up an IRC channel, a Web-based chatroom, or even a session on MSN Messenger, users could use their microblogging platform of choice and either decide to follow all updates related to a given event or simply not “opt out” of following those updates (depending on their preferences). Updates related to multiple events would be visible simultaneously (which isn’t really the case with IRC or chatrooms) and there could be ways to make event-specific updates more prominent. In fact, there would be easy ways to keep real-time statistics on those updates and get a bird’s-eye view of those conversations.

And there’s a point about event-specific microblogging which is likely to both displease “alpha geeks” and convince corporate users: updates about some events could be “protected” in the sense that they would not appear in the public stream in real time. The simplest case for this could be a company-wide meeting during which backchannel is allowed and even expected “within the walls” of the event. The “nothing should leave this room” attitude seems contradictory to social media in general, but many cases can be made for “confidential microblogging.” Microblogged conversations can easily be archived and these archives could be made public at a later date. Event-specific microblogging allows for some control of the “permeability” of the boundaries surrounding the event. “But why would people use microblogging instead of simply talking to one another?,” you ask. Several quick answers: participants aren’t in the same room, vocal communication is mostly single-channel, large groups of people are unlikely to communicate efficiently through oral means only, several things are more efficiently done through writing, written updates are easier to track and archive…
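
As a sketch of how such “confidential microblogging” could work, an update might carry a protection flag and an archive release date, and only enter the public stream once the archive is opened. All field names here are assumptions made for illustration.

    # Sketch only: field names are assumptions about how an event instance might
    # mark protected updates and their eventual release.
    from datetime import datetime

    def publicly_visible(update, now):
        """Protected updates stay 'within the walls' until the archive is released."""
        if not update.get("protected"):
            return True
        release = update.get("archive_release")
        return release is not None and now >= release

    meeting_update = {
        "text": "Q3 numbers look better than expected",
        "protected": True,
        "archive_release": datetime(2009, 6, 1),
    }
    print(publicly_visible(meeting_update, datetime(2009, 3, 1)))  # -> False
    print(publicly_visible(meeting_update, datetime(2009, 7, 1)))  # -> True (archive made public)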

There are many other things I’d like to say about event-based microblogging but this post is already long. There’s one thing I want to explain, which connects back to the social network dimension of microblogging.

Events can be simplistically conceived as social contexts which bring people together. (Yes, duh!) Participants in a given event constitute a “community of experience” regardless of the personal connections between them. They may be strangers, enemies, relatives, acquaintances, friends, etc. But they all share something. “Participation,” in this case, can be relatively passive and the difference between key participants (say, volunteers and lecturers in a conference) and attendees is relatively moot, at a certain level of analysis. The key, here, is the set of connections between people at the event.

These connections are a very powerful component of social networks. We typically meet people through “events,” albeit informal ones. Some events are explicitly meant to connect people who have something in common. In some circles, “networking” refers to something like this. The temporal dimension of social connections is an important one. By analogy to philosophy of language, the “first meeting” (and the set of “first impressions”) constitute the “baptism” of the personal (or social) connection. In social media especially, the nature of social connections tends to be monovalent enough that this “baptism event” gains special significance.

The online construction of social networks relies on a finite number of dimensions, including personal characteristics described in a profile, indirect connections (FOAF), shared interests, textual content, geographical location, and participation in certain activities. Depending on a variety of personal factors, people may be quite inclusive or rather exclusive, based on those dimensions. “I follow back everyone who lives in Austin” or “Only people I have met in person can belong to my inner circle.” The sophistication with which online personal connections are negotiated, along such dimensions, is a thing of beauty. In view of this sophistication, tools used in social media seem relatively crude and underdeveloped.

Going back to the (un)conference concept, the usefulness of having access to a list of all participants in a given event seems quite obvious. In an open event like BarCamp, it could greatly facilitate the event’s logistics. In a closed event with paid access, it could be linked to registration (despite geek resistance, closed events serve a purpose; one could even imagine events where attendance is free but the microblogging backchannel incurs a cost). In some events, everybody would be visible to everybody else. In others, there could be a sort of ACL for diverse types of participants. In some cases, people could be allowed to “lurk” without being seen while in others radical transparency could be enforced. For public events with all participants visible, lists of participants could be archived and used for several purposes (such as assessing which sessions in a conference are more popular or “tracking” event regulars).
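
A toy version of that access-control idea, with per-event roles deciding who appears on the participant list and who may lurk unseen. The role names and rules are assumptions made for illustration, not a proposal for any specific platform.

    # Sketch only: role names and rules are assumptions, not a real platform's ACL.
    VISIBLE_ROLES = {"organizer", "speaker", "attendee"}   # listed publicly
    # anyone else (e.g. a "lurker") may read without appearing on the list

    def participant_list(event, viewer_role="attendee"):
        """Return the participant names a viewer with the given role may see."""
        if viewer_role == "organizer":
            # organizers might see everyone, including lurkers, for logistics
            return [p["name"] for p in event["participants"]]
        return [p["name"] for p in event["participants"] if p["role"] in VISIBLE_ROLES]

    barcamp = {
        "name": "BarCampAustin",
        "participants": [
            {"name": "Alex", "role": "speaker"},
            {"name": "Sam", "role": "attendee"},
            {"name": "Kim", "role": "lurker"},
        ],
    }
    print(participant_list(barcamp))               # -> ['Alex', 'Sam']
    print(participant_list(barcamp, "organizer"))  # -> ['Alex', 'Sam', 'Kim']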

One reason I keep thinking about event-specific microblogging is that I occasionally use microblogging like others use business cards. In a geek crowd, I may ask for someone’s Twitter username in order to establish a connection with that person. Typically, I will start following that person on Twitter and find opportunities to communicate with that person later on. Given the possibility for one-way relationships, it establishes a social connection without requiring personal involvement. In fact, that person may easily ignore me without the danger of a face threat.

If there were event-specific instances from microblogging platforms, we could manage connections and profiles in a more sophisticated way. For instance, someone could use a barebones profile for contacts made during an impersonal event and a full-fledged profile for contacts made during a more “intimate” event. After noticing a friend using an event-specific business card with an event-specific email address, I got to think that this event microblogging idea might serve as a way to fill a social need.

 

More than most of my other blogposts, I expect comments on this one. Objections are obviously welcomed, especially if they’re made thoughtfully (like my PodMtl friend made them). Suggestions would be especially useful. Or even questions about diverse points that I haven’t addressed (several of which I can already think about).

So…

 

What do you think of this idea of event-based microblogging? Would you use a microblogging instance linked to an event, say at an unconference? Can you think of fun features an event-based microblogging instance could have? If you think about similar ideas you’ve seen proposed online, care to share some links?

 

Thanks in advance!

Back in Mac: Low End Edition

I’m happy to go “back in Mac,” even on a low end machine.

Today, I’m buying an old Mac mini G4 1.25GHz. Yes, a low end computer from 2005. It’ll be great to be back in Mac after spending most of my computer life on XP for three years.

This mini is slower than my XP desktop (emachines H3070). But that doesn’t really matter for what I want to do.

There’s something to be said about computers being “fast enough.” Gamers and engineers may not grok this concept, since they always want more. But there’s a point at which computers don’t really need to be faster, for some categories of uses.

Car analogies are often made, in computer discussions, and this case seems fairly obvious. Some cars are still designed to “push the envelope,” in terms of performance. Yet most cars, including some relatively inexpensive ones, are already fast enough to run on highways beyond the speed limits in North America. Even in Europe, most drivers don’t tend to push their cars to the limit. Something vaguely similar happens with computers, though there are major differences. For instance, the difference in cost between fast driving and normal driving is a factor with cars while it isn’t so much of a factor with computers. With computers, cooling and battery life (on laptops) do matter but, even if those issues were completely solved, there’s a limit to the power needed for casual computer use.

This isn’t contradicting Moore’s Law directly. Chips do increase exponentially in speed-to-cost ratio. But the effects aren’t felt the same way through all uses of computers, especially if we think about casual use of desktop and laptop “personal computers.” Computer chips in other devices (from handheld devices to cars or DVD players) benefit from Moore’s Law, but these are not what we usually mean by “computer,” in daily use.
The common way to put it is something like “you don’t need a fast machine to do email and word processing.”

The main reason I needed a Mac is that I’ll be using iMovie to do simple video editing. Video editing does push the limits of a slow computer and I’ll notice those limits very readily. But it’ll still work, and that’s quite interesting to think about, in terms of the history of personal computing. A Mac mini G4 is a slug, in comparison with even the current Mac mini Core 2 Duo. But it’s fast enough for even some tasks which, in historical terms, have been processor-intensive.

None of this is meant to say that the “need for speed” among computer users is completely manufactured. As computers become more powerful, some applications of computing technologies which were nearly impossible at slower speeds become easy to do. In fact, there certainly are things we can’t even imagine yet which will become easy to do in the future, thanks to improvements in computer chip performance. Those who play processor-intensive games always want faster machines and they certainly feel the “need for speed.” But, it seems to me, the quest for raw speed isn’t the core of personal computing, anymore.

This all reminds me of the Material Culture course I was teaching in the Fall: the Social Construction of Technology, Actor-Network Theory, the Social Shaping of Technology, etc.

So, a low end computer makes sense.

While iMovie is the main reason I decided to get a Mac at this point, I’ve been longing for Macs for three years. There were times during which I was able to use somebody else’s Mac for extended periods of time but this Mac mini G4 will be the first Mac to which I’ll have full-time access since late 2005, when my iBook G3 died.

As before, I’m happy to be “back in Mac.” I could handle life on XP, but it never felt that comfortable and I haven’t been able to adapt my workflow to the way the Windows world works. I could (and probably should) have worked on Linux, but I’m not sure it would have made my life complete either.

Some things I’m happy to go back to:

  • OmniOutliner
  • GarageBand
  • Keynote
  • Quicksilver
  • Nisus Thesaurus
  • Dictionary
  • Preview
  • Terminal
  • TextEdit
  • BibDesk
  • iCal
  • Address Book
  • Mail
  • TAMS Analyzer
  • iChat

Now I need to install some RAM in this puppy.

My Year in Social Media

In some ways, this post is a belated follow-up to my last blogpost about some of my blog statistics:

Almost 30k « Disparate.

In the two years since I published that post, I’ve received over 100 000 visits on this blog and I’ve diversified my social media activities.

Altogether, 2008 has been an important year, for me, in terms of social media. I began the year in Austin, TX and moved back to Quebec in late April. Many things have happened in my personal life and several of them have been tied to my social media activities.

The most important part of my social media life, through 2008 as through any year, is the contact I have with diverse people. I’ve met a rather large number of people in 2008 and some of these people have become quite important in my life. In fact, there are people I have met in 2008 whose impact on my life makes it feel as though we have been friends for quite a while. Many of these contacts have happened through social media or, at least, they have been mediated online. As a “people person,” a social butterfly, a humanist, and a social scientist, I care more about these people I’ve met than about the tools I’ve used.

Obviously, most of the contacts I’ve had through the year were with people I already knew. And my relationship with many of these people has changed quite significantly through the year. As is obvious for anyone who knows me, 2008 has been an important year in my personal life. A period of transition. My guess is that 2009 will be even more important, personally.

But this post is about my social media activities. Especially about (micro)blogging and about social networking, in my case. I also did a couple of things in terms of podcasting and online video, but my main activities online tend to be textual. This might change a bit in 2009, but probably not much. I expect 2009 to be an “incremental evolution” in terms of my social media activities. In fact, I mostly want to intensify my involvement in social media spheres, in continuity with what I’ve been doing in 2008.

So it’s the perfect occasion to think back about 2008.

Perhaps my main highlight of 2008 in terms of social media is Twitter. You can say I’m a late adopter of Twitter. I’ve known about it since it came out and I probably joined Twitter a while ago but I really started using it in preparation for SXSWi and BarCampAustin, in early March of this year. As I wanted to integrate myself into Austin’s geek scene and Twitter clearly had some importance in that scene, I thought I’d “play along.” Also, I didn’t have a badge for SXSWi but I knew I could learn about off-festival events through Twitter. And Twitter has become rather important, for me.

For one thing, it allows me to make a distinction between actual blogposts and short thoughts. I’ve probably been posting fewer blog entries since I became active on Twitter and my blogposts are probably longer, on average, than they were before. In a way, I feel it enhances my blogging experience.

Twitter also allows me to “take notes in public,” a practice I find surprisingly useful. For instance, when I go to some kind of presentation (academic or otherwise) I use Twitter to record my thoughts on both the event and the content. This practice is my version of “liveblogging” and I enjoy it. On several occasions, these liveblogging sessions have been rather helpful. Some “tweeps” (Twitter+peeps) dislike this kind of liveblogging practice and claim that “Twitter isn’t meant for this,” but I’ve had more positive experiences through liveblogging on Twitter than negative ones.

The device which makes all of this liveblogging possible, for me, is the iPod touch I received from a friend in June of this year. It has had important implications for my online life and, to a certain extent, the ‘touch has become my primary computer. The iTunes App Store, which opened its doors in July, has changed the game for me as I was able to get a number of dedicated applications, some of which I use several times a day. I’ve blogged about several things related to the iPod touch and the whole process has changed my perspective on social media in general. Of course, an iPhone would be an even more useful tool for me: SMS, GPS, camera, and ubiquitous Internet are all useful features in connection to social media. But, for now, the iPod touch does the trick. Especially through Twitter and Facebook.

One tool I started using quite frequently through the year is Ping.fm. I use it to post to: Twitter, Identi.ca, Facebook, LinkedIn, Brightkite, Jaiku, FriendFeed, Blogger, and WordPress.com (on another blog). I receive the most feedback on Facebook and Twitter but I occasionally get feedback through the other services (including through Pownce, which was recently sold). One thing I notice through this cross-posting practice is that, on these different services, the same activity has a range of implications. For instance, while I’m mostly active on Twitter, I actually get more out of Facebook postings (status updates, posted items, etc.). And reactions on different services tend to be rather different, as the relationships I have with people who provide that feedback tend to range from indirect acquaintance to “best friend forever.” Given my social science background, I find these differences quite interesting to think about.

One thing I’ve noticed on Twitter is that my “ranking among tweeps” has increased very significantly. On Twinfluence, my rank has gone as high as the 86th percentile (though it recently went down to the 79th percentile) while, on Twitter Grader, my “Twitter grade” is now at a rather unbelievable 98.1%. I don’t tend to care much about “measures of influence” but I find these ratings quite interesting. One reason is that they rely on relatively sophisticated concepts from social sciences. Another reason is that I’m intrigued by what causes increases in my ranking on those services. In this case, I think the measures give me way too much credit at this point but I also think that my “influence” is found outside of Twitter.

One “sphere of influence” which remained important for me through 2008 is Facebook. While Facebook had a more central role in my life through 2007, it now represents a stable part of my social media involvement. One thing which tends to happen is that first contacts happen through Twitter (I often use it as the equivalent of a business card during events) and Facebook represents a second step in the relationship. In a way, this distinction foregrounds the obvious concept of “intimacy” in social media. Twitter is public, ties are weak. Facebook is intimate, ties are stronger. On the other hand, there seems to be much more clustering among my tweeps than among my Facebook contacts, in part because my connection to local geek scenes in Austin and Montreal happens primarily through Twitter.

Through Facebook I was able to organize a fun little brunch with a few friends from elementary school. Though this brunch may not have been the most important event of 2008, for me, I’ve learnt a lot about the power of social media through contacting these friends, meeting them, and thinking about the whole affair.

In a way, Twitter and Facebook have helped me expand my social media activities in diverse directions. But most of the important events in my social media life in 2008 have been happening offline. Several of these events were unconferences and informal events happening around conferences.

My two favourite events of the year, in terms of social media, were BarCampAustin and PodCamp Montreal. Participating in (and observing) both events has had some rather profound implications in my social media life. These two unconferences were somewhat different but both were probably equally useful to me. One regret I have is that it’s unlikely that I’ll be able to attend BarCampAustinIV now that I’ve left Austin.

Other events have happened throughout 2008 which I find important in terms of social media. These include regular meetings like Yulblog, Yulbiz, and PodMtl. There are many other events which aren’t necessarily tied to social media but that I find interesting from a social media perspective. The recent Infopresse360 conference on innovation (with Malcolm Gladwell as keynote speaker) and a rather large number of informal meetups with people I’ve known through social media would qualify.

Despite the diversification of my social media life through 2008, blogging remains my most important social media activity. I now consider myself a full-fledged blogger and I think that my blog is representative of something about me.

Simply put, I’m proud to be a blogger. 

In 2008, a few things have happened through my blog which, I think, are rather significant. One is that someone who found me through Google contacted me directly about a contract in private-sector ethnography. As I’m currently going through professional reorientation, I take this contract to be rather significant. It’s actually possible that the Google result this person noticed wasn’t directly about my blog (the ranking of my diverse online profiles tends to shift around fairly regularly) but I still associate online profiles with blogging.

A set of blog-related occurrences which I find significant has to do with the fact that my blog has been at the centre of a number of discussions with diverse people including podcasters and other social media people. My guess is that some of these discussions may lead to some interesting things for me in 2009.

Through 2008, this blog has become more anthropological. For several reasons, I wish to maintain it as a disparate blog, a blog about disparate topics. But it still participates in my gaining some recognition as an anthroblogger. One reason is that anthrobloggers are now more closely connected than before. Recently, anthroblogger Daniel Lende has sent a call for nominations for the best of the anthro blogosphere which he then posted as both a “round up” and a series of prizes. Before that, Savage Minds had organized an “awards ceremony” for an academic conference. And, perhaps the most important dimension of my own blog being recognized in the anthroblogosphere, I have been discussing a number of things with Concordia-based anthrobloggers Owen Wiltshire and Maximilian Forte.

Still, anthropology isn’t the most prominent topic on this blog. In fact, my anthro-related posts tend to receive relatively little attention, outside of discussions with colleagues.

Since I conceive of this post as a follow-up on posts about statistics, I’ve gone through some of my stats here on Disparate. Upgrades to WordPress.com also allow me to get a more detailed picture of what has been happening on this blog.

Through 2008, I’ve received over 55 131 hits on this blog, about 11% more than in 2007 for an average of 151 hits a day (I actually thought it was more but there are some days during which I receive relatively few hits, especially during weekends). The month I received the most hits was February 2007 with 5 967 hits but February and March 2008 were relatively close. The day I received the most hits was October 28, 2008, with 310 hits. This was the day after Myriade opened.
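
(For what it’s worth, a quick check of that daily average, assuming the yearly total reported by the stats page; the variable name is mine.)

    hits_2008 = 55131                # total reported by the stats page
    print(round(hits_2008 / 366))    # -> 151 hits a day, on average (2008 had 366 days)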

These numbers aren’t so significant. For one thing, hits don’t imply that people have read anything on my blog. Since all of my blogs are ad-free, I haven’t tried to increase traffic to this blog. But it’s still interesting to notice a few things.

The most obvious thing is that hits to rather silly posts are much more frequent than hits to posts I actually care about.

For instance, here are my six blogposts with the most hits:

  • Facebook Celebs and Fakes (5 782 hits)
  • emachines Power Supply (4 800 hits)
  • Recording at 44.1 kHz, 16b with iPod 5G? (2 834 hits)
  • Blogspot v. WordPress.com, Blogger v. Wo (2 571 hits)
  • GERD and Stress (2 377 hits)
  • University Rankings and Diversity (2 219 hits)

And for 2008:

  • Facebook Celebs and Fakes (3 984 hits)
  • emachines Power Supply (2 265 hits)
  • AT&T Yahoo Pro DSL to Belkin WiFi (1 527 hits)
  • GERD and Stress (1 430 hits)
  • Blogspot v. WordPress.com, Blogger v. Wo (1 151 hits)
  • University Rankings and Diversity (995 hits)

The Facebook post is one I wrote very quickly, in July 2007. It was a quick reaction to something I had heard. Obviously, the post’s title is the single reason for that post’s popularity. I get an average of 11 hits a day on that post for 4 001 hits in 2008. If I wanted to increase traffic, I’d post as many of these as possible.

The emachines post is my first post on this new blog (but I did import posts from my previous blog), back in January 2006. It seems to have helped a few people and gets regular traffic (six hits a day, in 2008). It’s not my most thoughtful post but it has its place. It’s still funny to notice that traffic to this blogpost increases even though one would assume it’s less relevant.

Rather unsurprisingly, my post about then-upcoming recording capabilities on the iPod 5G, from March 2006, is getting very few hits. But, for a while, it did get a number of hits (six a day in 2006) and I was a bit puzzled by that.

The AT&T post is my most popular post written in 2008. It was a simple troubleshooting session, like the aforementioned emachines post. These posts might be useful for some people and I occasionally get feedback from people about them. Another practical post regularly getting a few hits is about an inflatable mattress with built-in pump which came without clear instructions.

My post about blogging platforms was in fact a repost of a comment I made on somebody else’s blog entry (though the original seems to be lost). From what I can see, it was most popular from June, 2007 through May, 2008. Since it was first posted, WordPress.com has been updated quite a bit and Blogger/Blogspot seems to have pretty much stalled. My comment/blogpost on the issue is fairly straightforward and it has put me in touch with some other bloggers.

The other two blogposts getting the most hits in 2008 are closer to things about which I care. Both entries were written in mid-2006 and are still relevant. The rankings post is short on content, but it serves as an “anchor” for some things I like to discuss in terms of educational institutions. The GERD post is among my most personal posts on this blog, especially in English. It’s one of the posts for which I received the most feedback. My perspective on the issue hasn’t changed much in the meantime.

Privilege: Library Edition

When I came out against privilege, over a month ago, I wasn’t thinking about libraries. But, last week, while running some errands at three local libraries (within an hour), I got to think about library privileges.

During that day, I first started thinking about library privileges because I was renewing my CREPUQ card at Concordia. With that card, graduate students and faculty members at a university in Quebec are able to get library privileges at other universities, a nice “perk” that we have. While renewing my card, I was told (or, more probably, reminded) that the card now gives me borrowing privileges at any university library in Canada through CURBA (Canadian University Reciprocal Borrowing Agreement).

My gut reaction: “Aw-sum!” (I was having a fun day).

It got me thinking about what it means to be an academic in Canada. Because I’ve also spent part of my still short academic career in the United States, I tend to compare the Canadian academe to US academic contexts. And while there are some impressive academic consortia in the US, I don’t think that any of them may offer as wide a set of library privileges as this one. If my count is accurate, there are 77 institutions involved in CURBA. University systems and consortia in the US typically include somewhere between ten and thirty institutions, usually within the same state or region. Even if members of both the “UC System” and “CalState” have similar borrowing privileges, it would only mean 33 institutions, less than half of CURBA (though the population of California is about 20% more than that of Canada as a whole). Some important university consortia through which I’ve had some privileges were the CIC (Committee on Institutional Cooperation), a group of twelve Midwestern universities, and the BLC (Boston Library Consortium), a group of twenty universities in New England. Even with full borrowing privileges in all three groups of university libraries, an academic would only have access to library material from 65 institutions.

Of course, the number of institutions isn’t that relevant if the libraries themselves have few books. But my guess is that the average size of a Canadian university’s library collection is quite comparable to its US equivalents, including in such well-endowed institutions as those in the aforementioned consortia and university systems. What’s more, I would guess that there might be a broader range of references across Canadian universities than in any region of the US. Not to mention that BANQ (Quebec’s national library and archives) are part of CURBA and that their collections overlap very little with a typical university library.

So, I was thinking about access to an extremely wide range of references given to graduate students and faculty members throughout Canada. We get this very nice perk, this impressive privilege, and we pretty much take it for granted.

Which eventually got me to think about my problem with privilege. Privilege implies a type of hierarchy with which I tend to be uneasy. Even (or especially) when I benefit from a top position. “That’s all great for us but what about other people?”

In this case, there are obvious “Others” like undergraduate students at Canadian institutions,  Canadian non-academics, and scholars at non-Canadian institutions. These are very disparate groups but they are all denied something.

Canadian undergrads are the most direct “victims”: they participate in Canada’s academe, like graduate students and faculty members, yet their access to resources is severely limited by comparison to those of us with CURBA privileges. Something about this strikes me as rather unfair. Don’t undergrads need access as much as we do? Is there really such a wide gap between someone working on an honours thesis at the end of a bachelor’s degree and someone starting work on a master’s thesis that the latter requires much wider access than the former? Of course, the main rationale behind this discrepancy in access to library material probably has to do with sheer numbers: there are many undergraduate students “fighting for the same resources” and there are relatively few graduate students and faculty members who need access to the same resources. Or something like that. It makes sense but it’s still a point of tension, as any matter of privilege.

The second set of “victims” includes Canadians who happen to not be affiliated directly with an academic institution. While it may seem that their need for academic resources is more limited than that of students, many people in this category have a more unquenchable “thirst for knowledge” than many an academic. In fact, there are people in this category who could probably do a lot of academically-relevant work “if only they had access.” I mostly mean people who have an academic background of some sort but who are currently unaffiliated with formal institutions. But the “broader public” counts, especially when a specific topic becomes relevant to them. These are people who take advantage of public libraries but, as mentioned in the BANQ case, public and university libraries don’t tend to overlap much. For instance, it’s quite unlikely that someone without academic library privileges would have been able to borrow Visual Information Processing (Chase, William 1973), a proceedings book that I used as a source for a recent blogpost on expertise. Of course, “the public” is usually allowed to browse books in most university libraries in North America (apart from Harvard). But, depending on other practical factors, borrowing books can be much more efficient than browsing them in a library. I tend to hear from diverse people who would enjoy some kind of academic status for this very reason: library privileges matter.

A third category of “victims” of CURBA privileges are non-Canadian academics. Since most of them may only contribute indirectly to Canadian society, why should they have access to Canadian resources? Like any social context, the national academe defines insiders and outsiders. While academics are typically inclusive, this type of restriction seems to make sense. Yet many academics outside of Canada could benefit from access to resources broadly available to Canadian academics. In some cases, there are special agreements to allow outside scholars to get temporary access to local, regional, or national resources. Rather frequently, these agreements come with special funding, the outside academic being a special visitor, sometimes with even better access than some local academics. I have very limited knowledge of these agreements (apart from infrequent discussions with colleagues who benefitted from them) but my sense is that they are costly, cumbersome, and restrictive. Access to local resources is even more exclusive a privilege in this case than in the CURBA case.

Which brings me to my main point about the issue: we all need open access.

When I originally thought about how impressive CURBA privileges were, I was thinking through the logic of the physical library. In a physical library, resources are scarce, access to resources needs to be controlled, and library privileges have a high value. In fact, it costs an impressive amount of money to run a physical library. The money universities invest in their libraries is relatively “inelastic” and must figure quite prominently in their budgets. The “return” on that investment seems to me a bit hard to measure: is it a competitive advantage, does a better-endowed library make a university more cost-effective, do university libraries ever “recoup” any portion of the amounts spent?

Contrast all of this with a “virtual” library. My guess is that an online collection of texts costs less to maintain than a physical library by any possible measure. Because digital data may be copied at will, the notion of “scarcity” makes little sense online. Distributing millions of copies of a digital text doesn’t make the original text unavailable to anyone. As long as the distribution system is designed properly, the “transaction costs” in distributing a text of any length are probably much less than those associated with borrowing a book.  And the differences between “browsing” and “borrowing,” which do appear significant with physical books, seem irrelevant with digital texts.

These are all well-known points about online distribution. And they all seem to lead to the same conclusion: “information wants to be free.” Not “free as in beer.” Maybe not even “free as in speech.” But “free as in unchained.”

Open access to academic resources is still a hot topic. Though I do consider myself an advocate of “OA” (the “Open Access movement”), what I mean here isn’t so much about OA as opposed to TA (“toll-access”) in the case of academic journals. Physical copies of periodicals may usually not be borrowed, regardless of library privileges, and online resources are typically excluded from borrowing agreements between institutions. The connection between OA and my perspective on library privileges is that I think the same solution could solve both issues.

I’ve been thinking about a “global library” for a while. Like others, the Library of Alexandria serves as a model but texts would be online. It sounds utopian but my main notion, there, is that “library privileges” would be granted to anyone. Not only senior scholars at accredited academic institutions. Anyone. Of course, the burden of maintaining that global library would also be shared by anyone.

There are many related models, apart from the Library of Alexandria: French «Encyclopédistes» through the Enlightenment, public libraries, national libraries (including the Library of Congress), Tim Berners-Lee’s original “World Wide Web” concept, Brewster Kahle’s Internet Archive, Google Books, etc. Though these models differ, they all point to the same basic idea: a “universal” collection with the potential for “universal” access. In historical perspective, this core notion of a “universal library” seems relatively stable.

Of course, there are many obstacles to a “global” or “universal” library, including issues having to do with conflicts between social groups across the Globe and the current state of so-called “intellectual property.” These are all very tricky issues and I don’t think they can be solved in any number of blogposts. The main thing I’ve been thinking about, in this case, is the implications of a global library in terms of privileges.

Come to think of it, it’s possible that much of the resistance to a global library has to do with privilege: unlike me, some people enjoy privilege.

My Problem With Journalism

I hate having an axe to grind. Really, I do. “It’s unlike me.” When I catch myself grinding an axe, I “get on my own case.” I can be quite harsh with my own self.

But I’ve been trained to voice my concerns. And I’ve been perceiving an important social problem for a while.

So I “can’t keep quiet about it.”

If everything goes really well, posting this blog entry might be liberating enough that I will no longer have any axe to grind. Even if it doesn’t go as well as I hope, it’ll be useful to keep this post around so that people can understand my position.

Because I don’t necessarily want people to agree with me. I mostly want them to understand “where I come from.”

So, here goes:

Journalism may have outlived its usefulness.

Like several other “-isms” (including nationalism, colonialism, imperialism, and racism) journalism is counterproductive in the current state of society.

This isn’t an ethical stance, though there are ethical positions which go with it. It’s a statement about the anachronistic nature of journalism. As per functional analysis, everything in society needs a function if it is to be maintained. What has been known as journalism is now taking on new functions. Eventually, “journalism as we know it” should, logically, make way for new forms.

What these new forms might be, I won’t elaborate in this post. I have multiple ideas, especially given well-publicised interests in social media. But this post isn’t about “the future of journalism.”

It’s about the end of journalism.

Or, at least, my looking forward to the end of journalism.

Now, I’m not saying that journalists are bad people and that they should just lose their jobs. I do think that those who were trained as journalists need to retool themselves, but this post isn’t about that either.

It’s about an axe I’ve been grinding.

See, I can admit it, I’ve been making some rather negative comments about diverse behaviours and statements, by media people. It has even become a habit of mine to allow myself to comment on something a journalist has said, if I feel that there is an issue.

Yes, I know: journalists are people too, they deserve my respect.

And I do respect them, the same way I respect every human being. I just won’t give them the satisfaction of my putting them on a pedestal. In my mind, journalists are people: just like anybody else. They deserve no special treatment. And several of them have been arrogant enough that I can’t help turning their arrogance back to them.

Still, it’s not about journalists as people. It’s about journalism “as an occupation.” And as a system. An outdated system.

Speaking of dates, some context…

I was born in 1972 and, originally, I was quite taken by journalism.

By age twelve, I was pretty much a news junkie. Seriously! I was “consuming” a lot of media at that point. And I was “into” media. Mostly television and radio, with some print mixed in, as well as lots of literary work for context: this is when I first read French and Russian authors from the late 19th and early 20th centuries.

I kept thinking about what was happening in The World. Back in 1984, the Cold War was a major issue. To a French-Canadian tween, this mostly meant thinking about the fact that there were (allegedly) US and USSR “bombs pointed at us,” for reasons beyond our direct control.

“Caring about The World” also meant thinking about all sorts of problems happening across The Globe. Especially poverty, hunger, diseases, and wars. I distinctly remember caring about the famine in Ethiopia. And when We Are the World started playing everywhere, I felt like something was finally happening.

This was one of my first steps toward cynicism. And I’m happy it occurred at age twelve because it allowed me to eventually “snap out of it.” Oh, sure, I can still be a cynic on occasion. But my cynicism is contextual. I’m not sure things would have been as happiness-inducing for me if it hadn’t been for that early start in cynicism.

Because, you see, The World lost interest quite rapidly in the plight of Ethiopians. I distinctly remember asking myself, after the media frenzy died out, what had happened to Ethiopians in the meantime. I’m sure there was some report at the time claiming that the famine was over and that the situation was “back to normal.” But I didn’t hear anything about it, and I was looking. As a twelve-year-old French-Canadian with no access to a modem, I had no direct access to information about the situation in Ethiopia.

Ethiopia still remained as a symbol, to me, of an issue to be solved. It’s not the direct cause of my later becoming an africanist. But, come to think of it, there might be a connection, deeper down than I had been looking.

So, by the end of the Ethiopian famine of 1984-85, I was “losing my faith in” journalism.

I clearly haven’t gained a new faith in journalism. And it all makes me feel quite good, actually. I simply don’t need that kind of faith. I was already training myself to be a critical thinker. Sounds self-serving? Well, sorry. I’m just being honest. What’s a blog if the author isn’t honest and genuine?

Flash forward to 1991, when I started formal training in anthropology. The feeling was exhilarating. I finally felt like I belonged. My statement at the time was to the effect that “I wasn’t meant for anthropology: anthropology was meant for me!” And I was learning quite a bit about/from The World. At that point, it already did mean “The Whole Wide World,” even though my knowledge of that World was fairly limited. And it was a haven of critical thinking.

Ideal, I tell you. Moan all you want, it felt like the ideal place at the ideal time.

And, during the summer of 1993, it all happened: I learnt about the existence of the “Internet.” And it changed my life. Seriously, the ‘Net did have a large part to play in important changes in my life.

That event, my discovery of the ‘Net, also has a connection to journalism. The person who described the Internet to me was Kevin Tuite, one of my linguistic anthropology teachers at Université de Montréal. As far as I can remember, Kevin was mostly describing Usenet. But the potential for “relatively unmediated communication” was already a big selling point. Kevin talked about the fact that members of the Caucasian diaspora were able to use the Internet to discuss, with their relatives and friends back in the Caucasus, issues pertaining to these independent republics after the fall of the USSR. All this while media coverage was sketchy at best (it sounded like journalism still had a hard time coping with the new realities).

As you can imagine, I was more than intrigued and I applied for an account as soon as possible. In the meantime, I bought a 2400 baud modem, joined some local BBSes, and got to chat about the Internet with several friends, some of whom already had accounts. Got my first email account just before the semester started, in August, 1993. I can still see traces of that account, but only since April, 1994 (I guess I wasn’t using my address in my signature before this). I’ve been an enthusiastic user of diverse Internet-based means of communication since then.

But coming back to journalism, specifically…

Journalism missed the switch.

During the past fifteen years, I’ve been amazed at how clueless members of mainstream media institutions have been about “the power of the Internet.” Back then, during Wired Magazine’s first year as a print magazine, we (some friends and I) were already commenting that print journalists should look at what was coming. Eventually, they would need to adapt. “The Internet changes everything,” I thought.

No, I didn’t mean that the Internet would cause any of the significant changes that we have been seeing around us. I tend to be against technological determinism (and other McLuhanesque tendencies). Not that I prefer sociological determinism, yet I can’t help but think that, from ARPAnet to the current state of the Internet, most of the important changes have been primarily social: if the Internet became something, it’s because people are making it so, not because of some inexorable technological development.

My enthusiastic perspective on the Internet was largely motivated by the notion that it would allow people to go beyond the model from the journalism era. Honestly, I could see the end of “journalism as we knew it.” And I’m surprised, fifteen years later, that journalism has been among the slowest institutions to adapt.

In a sense, my main problem with journalism is that it maintains a very stratified structure which gives too much weight to the credibility of specific individuals. Editors and journalists, who are part of the “medium” in the old models of communication, have taken on a gatekeeping role despite the fact that they rarely are much more proficient thinkers than the people who read them. “Gatekeepers” even constitute a “textbook case” in sociology, especially in conflict theory. Though I can easily perceive how “constructed” that gatekeeping model may be, I can readily relate to what it entails in terms of journalism.

There’s a type of arrogance embedded in journalistic self-perception: “we’re journalists/editors so we know better than you; you need us to process information for you.” Regardless of how much I may disagree with some of his words and actions, I take solace in the fact that Murdoch, a key figure in today’s mainstream media, spoke directly to this arrogance. Of course, he might have been pandering. But the very fact that he would even acknowledge journalistic arrogance is, in my mind, quite helpful.

I think the days of fully stratified gatekeeping (a “top-down approach” to information filtering) are over. Now that information is easily available and that knowledge is constructed socially, any “filtering” method can be distributed. I’m not really thinking of a “cream rises to the top” model. An analogy with water sources going through multiple layers of mountain rock would be more appropriate to a Swiss citizen such as myself. But the model I have in mind is more about what Bakhtin called “polyvocality” and what has become an ethical position on “giving voice to the other.” Journalism has taken voice away from people. I have in mind a distributed mode of knowledge construction which gives everyone enough voice to have long-distance effects.

At the risk of sounding too abstract (it’s actually very clear in my mind, but it requires a long description), it’s a blend of ideas like the social butterfly effect, a post-encyclopedic world, and cultural awareness. All of these, in my mind, contribute to a heightened form of critical thinking which, I feel, journalism has led us away from.

The social butterfly effect is fairly easy to understand, especially now that social networks are so prominent. Basically, it’s the “butterfly effect” from chaos theory applied to social networks. In this context, a “social butterfly” is a node in multiple networks of varying degrees of density and clustering. Because such a “social butterfly” can bring things (ideas, especially) from one such network to another, I argue that her or his ultimate influence (in aggregate) is larger than that of someone who sits at the core of a highly clustered network. Yes, it’s related to “weak ties” and other network classics. But it’s a bit more specific, at least in my mind. In terms of journalism, the social butterfly effect implies that the way knowledge is constructed need not come from a singular source or channel.
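
To make that network intuition a bit more concrete, here’s a quick sketch in Python, using the networkx library. It’s only an illustration I’m adding here, not data or a real model: three made-up dense clusters, plus one node with a single loose tie into each. That bridging node ends up sitting on most of the paths between clusters, which is roughly what I mean by a social butterfly’s aggregate influence:

```python
# Illustrative only: clusters and names are invented for this sketch.
import networkx as nx

G = nx.Graph()

# Three dense clusters ("communities") of ten people each, fully connected.
clusters = []
for c in range(3):
    members = [f"c{c}_p{i}" for i in range(10)]
    clusters.append(members)
    G.add_edges_from(
        (a, b) for i, a in enumerate(members) for b in members[i + 1:]
    )

# The hypothetical "social butterfly" belongs to all three clusters,
# but only loosely: a single tie into each one.
butterfly = "butterfly"
for members in clusters:
    G.add_edge(butterfly, members[0])

# Compare the butterfly with a core member of a single dense cluster.
bc = nx.betweenness_centrality(G)
print("butterfly   :", round(bc[butterfly], 3))
print("core insider:", round(bc["c0_p5"], 3))
```

If you run it, the “butterfly” gets a betweenness score far above any single-cluster insider, even though it only has three ties in total. That, in a toy version, is the “weak ties with long-distance effects” point.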

The “encyclopedic world” I have in mind is that of our good friends from the French Enlightenment: Diderot and the gang. At that time, there was a notion that the sum of all knowledge could be contained in the Encyclopédie. Of course, I’m simplifying. But such a notion is still discussed fairly frequently. The world in which we now live has clearly challenged this encyclopedic notion of exhaustiveness. Sure, certain people hold on to that notion. But it’s not taken for granted as “uncontroversial.” Actually, those who hold on to it tend to respond rather positively to the journalistic perspective on human events. As should be obvious, I think the days of that encyclopedic worldview are numbered and that “journalism as we know it” will die at the same time. Though it seems to be built on an “encyclopedia” frame, Wikipedia clearly benefits from a distributed model of knowledge management. In this sense, Wikipedia is less anachronistic than Britannica. Wikipedia also tends to be more insightful than Britannica.

The cultural awareness point may sound like an ethnographer’s pipe dream. But I perceive a clear connection between Globalization and a certain form of cultural awareness in information and knowledge management. This is probably where the Global Voices model can come in. One of the most useful representations of that model comes from Chris Lydon’s Open Source conversation with Solana Larsen and Ethan Zuckerman. Simply put, I feel that this model challenges journalism’s ethnocentrism.

Obviously, I have many other things to say about journalism (as well as about its correlate, nationalism).

But I do feel liberated already. So I’ll leave it at that.

Enthused Tech

Yesterday, I held a WiZiQ session on the use of online tech in higher education:

Enthusing Higher Education: Getting Universities and Colleges to Play with Online Tools and Services

Slideshare

(Full multimedia recording available here)

During the session, Nellie Deutsch shared the following link:

Diffusion of Innovations, by Everett Rogers (1995)

Haven’t read Rogers’s book, but it sounds like an easy-to-understand version of ideas which have been quite clear in Boasian disciplines (cultural anthropology, folkloristics, cultural ecology…) for a while. But, in this sometimes obsessive quest for innovation, it might in fact be useful to go back to basic ideas about the social mechanisms which can be observed in the adoption of new tools and techniques. That’s actually the thinking behind this relatively recent blogpost of mine:

Technology Adoption and Active Reading

My emphasis during the WiZiQ session was on enthusiasm. I tend to think a lot about occasions in which thinking about the possibilities afforded by technology gets people “psyched up.” In a way, this is exactly how I can define myself as a tech enthusiast: I get easily psyched up in the context of discussions about technology.

What’s funny is that I’m no gadget freak. I don’t care about the tool. I just love to dream up possibilities. And I sincerely think that I’m not alone. We might even guess that a similar dream-induced excitement animates true gadget freaks, who must have the latest tool. Early adopters are a big part of geek culture and, small as it is, geek culture remains a niche.

Because I know I’ll keep on talking about these things on other occasions, I can “leave it at that,” for now.

RERO‘s my battle cry.

TBC

Crazy App Idea: Happy Meter

I keep getting ideas for apps I’d like to see on Apple’s App Store for iPod touch and iPhone. This one may sound a bit weird but I think it could be fun. An app where you can record your mood and optionally broadcast it to friends. It could become rather sophisticated, actually. And I think it can have interesting consequences.

The idea mostly comes from Philippe Lemay, a psychologist friend of mine and fellow PDA fan. Haven’t talked to him in a while but I was just thinking about something he did, a number of years ago (in the mid-1990s). As part of an academic project, Philippe helped develop a PDA-based research program whereby subjects would record different things about their state of mind at intervals during the day. Apart from the neatness of the data gathering technique, this whole concept stayed with me. As a non-psychologist, I personally get the strong impression that recording your moods frequently during the day can actually be a very useful thing to do in terms of mental health.

And I really like the PDA angle. Since I think of the App Store as transforming Apple’s touch devices into full-fledged PDAs, the connection is rather strong between Philippe’s work at that time and the current state of App Store development.

Since that project of Philippe’s, a number of things have been going on which might help refine the “happy meter” concept.

One is that “lifecasting” became rather big, especially among certain groups of Netizens (typically younger people, but also many members of geek culture). Though the lifecasting concept applies mostly to video streams, there are connections with many other trends in online culture. The connection with vidcasting specifically (and podcasting generally) is rather obvious. But there are other connections. For instance, with mo-, photo-, or microblogging. Or even with all the “mood” apps on Facebook.

Speaking of Facebook as a platform, I think it meshes especially well with touch devices.

So, “happy meter” could be part of a broader app which does other things: updating Facebook status, posting tweets, broadcasting location, sending personal blogposts, listing scores in a Brain Age type game, etc.

Yet I think the “happy meter” could be useful on its own, as a way to track your own mood. “Turns out, my mood was improving pretty quickly on that day.” “Sounds like I didn’t let things affect me too much despite all sorts of things I was going through.”

As a mood-tracker, the “happy meter” should be extremely efficient. Because it’s easy, I’m thinking of sliders. One main slider for general mood and different sliders for different moods and emotions. It would also be possible to extend the “entry form” on occasion, when the user wants to record more data about their mental state.

Of course, everything would be saved automatically and “sent to the cloud” on occasion. There could be a way to selectively broadcast some slider values. The app could conceivably send reminders to the user to update their mood at regular intervals. It could even serve as a “break reminder” feature. Though there are limitations on OS X iPhone in terms of interapplication communication, it’d be even neater if the app were able to record other things happening on the touch device at the same time, such as music which is playing or some apps which have been used.

Now, very obviously, there are lots of privacy issues involved. But what social networking services have taught us is that users can have pretty sophisticated notions of privacy management, if they’re given the chance. For instance, adept Facebook users may seem to indiscriminately post just about everything about themselves but are often very clear about what they want to “let out,” in context. So, clearly, every type of broadcasting should be controlled by the user. No opt-out here.
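
Just to make the idea more tangible, here’s a rough sketch in Python of the data model I’m imagining: timestamped slider values plus a strictly opt-in list of which sliders ever get broadcast. None of this is a real API; the class and slider names are made up for illustration:

```python
# Purely illustrative sketch: hypothetical names, no real App Store API.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List


@dataclass
class MoodEntry:
    timestamp: datetime
    sliders: Dict[str, float]   # e.g. {"general": 0.8, "energy": 0.4}
    note: str = ""              # optional extended "entry form"


@dataclass
class HappyMeter:
    entries: List[MoodEntry] = field(default_factory=list)
    shared_sliders: List[str] = field(default_factory=list)  # opt-in only

    def record(self, **sliders: float) -> MoodEntry:
        """Save a new entry locally, every time the user moves the sliders."""
        entry = MoodEntry(timestamp=datetime.now(), sliders=dict(sliders))
        self.entries.append(entry)
        return entry

    def broadcast_view(self, entry: MoodEntry) -> Dict[str, float]:
        """Only sliders the user explicitly opted in to sharing ever leave
        the device; everything else stays local."""
        return {name: value for name, value in entry.sliders.items()
                if name in self.shared_sliders}


meter = HappyMeter(shared_sliders=["general"])
entry = meter.record(general=0.8, energy=0.4, stress=0.2)
print(meter.broadcast_view(entry))   # {'general': 0.8}
```

The opt-in list is exactly the “no opt-out” principle above: nothing leaves the device unless the user has explicitly put that slider on the shared list.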

I know this all sounds crazy. And it all might be a very bad idea. But the thing about letting my mind wander is that it helps me remain happy.

Visualizing Touch Devices in Education

Took me a while before I watched this concept video about iPhone use on campus.

Connected: The Movie – Abilene Christian University

Sure, it’s a bit campy. Sure, some features aren’t available on the iPhone yet. But the basic concepts are pretty much what I had in mind.

Among things I like in the video:

  • The very notion of student empowerment runs at the centre of it.
  • Many of the class-related applications presented show an interest in the constructivist dimensions of learning.
  • Material is made available before class. Face-to-face time is for engaging in the material, not rehashing it.
  • The technology is presented as a way to ease the bureaucratic aspects of university life, relieving a burden on students (and, presumably, on everyone else involved).
  • The “iPhone as ID” concept is simple yet powerful, in context.
  • Social networks (namely Facebook and MySpace, in the video) are embedded in the campus experience.
  • Blended learning (called “hybrid” in the video) is conceived as an option, not as an obligation.
  • Use of the technology is specifically perceived as going beyond geek culture.
  • The scenarios (use cases) are quite realistic in terms of typical campus life in the United States.
  • While “getting an iPhone” is mentioned as a perk, it’s perfectly possible to imagine technology as a levelling factor within educational institutions, lowering some costs while raising the bar for pedagogical standards.
  • The shift from “eLearning” to “mLearning” is rather obvious.
  • ACU already does iTunes U.
  • The video is released under a Creative Commons license.

Of course, there are many directions things can go, from here. Not all of them are in line with the ACU dream scenario. But I’m quite hopeful, judging from some apparently random facts: that Apple may sell iPhones through universities, that Apple has plans for iPhone use on campuses, that many of the “enterprise features” of iPhone 2.0 could work in institutions of higher education, that the Steve Jobs keynote made several mentions of education, that Apple bundles iPod touch with Macs, that the OLPC XOXO is now conceived more as a touch handheld than as a laptop, that (although delayed) Google’s Android platform can participate in the same usage scenarios, and that browser-based computing apparently has a bright future.

Waiting for Other Touch Devices?

Though I’m interpreting Apple’s current back-to-school special to imply that we might not see radically new iPod touch models until September, I’m still hoping that there will be a variety of touch devices available in the not-so-distant future, whether or not Apple makes them.

Turns out, the rumour mill has some items related to my wish, including this one:

AppleInsider | Larger Apple multi-touch devices move beyond prototype stage

This could be excellent news for the device category as a whole and for Apple itself. As explained before, I’m especially enthusiastic about touch devices in educational contexts.

I’ve been lusting over an iPod touch since it was announced. I sincerely think that an iPod touch will significantly enhance my life. As strange as it may sound, especially given the fact I’m no gadget freak, I think frequently about the iPod touch. Think Wayne, in Wayne’s World 2, going to a music store to try a guitar (and being denied the privilege to play Stairway to Heaven). That’s almost me and the iPod touch. When I go to an Apple Store, I spend precious minutes with a touch.

Given my current pattern of computer use, the fact that I have no access to a laptop at this point, and the availability of WiFi connections at some interesting spots, I think an iPod touch will enable me to spend much less time in front of this desktop, spend much more time outside, and focus on my general well-being.

One important feature the touch has, which can have a significant effect on my life, is instant-on. My desktop still takes minutes to wake up from “Stand by.” Several times during the day, the main reason I wake my desktop is to make sure I haven’t received important email messages. (I don’t have push email.) For a number of reasons, what starts out as simple email-checking frequently ends up being a more elaborate browsing session. An iPod touch would greatly reduce the need for those extended sessions and let me “do other things with my life.”

Another reason a touch would be important in my life at this point is that I no longer have access to a working MP3 player. While I don’t technically need any portable media player to be happy, getting my first iPod just a few years ago was an important change in my life. I’ll still miss my late iRiver’s recording capabilities, but it’s now possible to get microphone input on the iPod touch. Eventually, the iPod touch could become a very attractive tool for fieldwork recordings. Or for podcasting. Given my audio orientation, a recording-capable iPod touch could be quite useful. Even more so than an iPod Classic with recording capabilities.

There are a number of other things which should make the iPod touch very useful in my life. A set of them have to do with expected features and applications. One is Omni Group’s intention to release their OmniFocus task management software through the iPhone SDK. As an enthusiastic user of OmniOutliner for most of the time I’ve spent on Mac OS X laptops, I can just imagine how useful OmniFocus could be on an iPod touch. Getting Things Done, the handheld version. It could help me streamline my whole workflow, the way OO used to do. In other words: OF on an iPod touch could be this fieldworker’s dream come true.

There are also applications to be released for Apple’s Touch devices which may be less “utilitarian” but still quite exciting. Including the Trism game. In terms of both “appropriate use of the platform” and pricing, Trism scores high on my list. I see it as an excellent example of what casual gaming can be like. One practical aspect of casual gaming, especially on such a flexible device as the iPod touch, is that it can greatly decrease stress levels by giving users “something to do while they wait.” I’ve had that experience with other handhelds. Whether it’s riding the bus or waiting for a computer to wake up from stand by, having something to do with your hands makes the situation just a tad bit more pleasant.

I’m also expecting some new features to eventually be released through software, including some advanced podcatching features like wireless synchronization of podcasts and, one can dream, a way to interact directly with podcast content. Despite having been an avid podcast listener for years, I think podcasts aren’t nearly “interactive” enough. Software on a touch device could solve this. But that part is wishful thinking. I tend to do a lot of wishlists. Sometimes, my daydreams become realities.

The cool thing is, it looks as though I’ll be able to get my own touch device in the near future. w00t! 😀

Even if Apple does release new Touch devices, the device I’m most likely to get is an iPod touch. Chances are that I might be able to get a used 8GB touch for a decent price. Especially if, as is expected for next Monday, Apple officially announces the iPhone for Canada (possibly with a very attractive data plan). As a friend was telling me, once Canadians are able to get their hands on an iPhone directly in Canada, there’ll likely be a number of used iPod touches for sale. With a larger supply of used iPod touches and a presumably lower demand for the same, we can expect a lower price.

Another reason I might get an iPod touch is that a friend of mine has been talking about helping me with this purchase. Though I feel a bit awkward about accepting this kind of help, I’m very enthusiastic at the prospect.

Watch this space for more on my touch life. 😉

Well-Rounded Bloggers

While I keep saying journalists have a tough time putting journalism in perspective, it seems that some blogging journalists are able to do it.

Case in point, ZDNet Editor in Chief Larry Dignan:

Anatomy of a ‘Blogging will kill you’ story: Why I didn’t make the cut | Between the Lines | ZDNet.com

I didn’t read the original NYT piece. On purpose. As I’ve tried to establish, I sometimes run away from things “everybody has read.” Typically, in the U.S., this means something which appeared in the NYT. To the extent that, for some people, “if it’s not in the Times, it didn’t happen.” (Such an attitude is especially tricky when you’re talking about, say, parts of Africa which aren’t at war.)

This time, I’m especially glad I read Dignan’s piece instead of the NYT one because I get the gist of the “story” and Dignan provides the kind of insight I enjoy.

Basic message: blogging can be as stressful as any job yet it’s possible to have a well-balanced life as a blogger.

Simple, useful, personal, insightful, and probably more accurate than the original piece.

Oh, sure. It’s nothing new. It’s not a major revelation for most people that it’s important to think about work/life balance.

Still… As it so happens, this specific piece helped me think about my own blogging activities in a somewhat different light. No, it’s not my job (though I do wish I had a writing job). And I don’t typically stress over it. I’m just thinking about where blogging fits in my life. And that’s helpful.

Even if it means yet another blogpost about blogging.

They Dropped The Other Shoe

[Disclaimer: I’m not necessarily an Apple fanboy but I have been an enthusiastic Mac user since 1987 and have owned several Apple products, from an iPod to a QuickTake camera. I also think that technology is having a big impact on arts, media, and entertainment.]

Just watched Apple’s "Showtime" Special Event. Didn’t really read or even listen to anything much about it yet. During that event, Apple CEO Steve Jobs introduced new versions of all the iPod models, a new version of iTunes, and the addition of movies to the iTunes store. In addition, Jobs gave a sneak peek of an upcoming box to link iTunes with televisions and stereo systems.

People are likely to have been disappointed by the announcements. They’re probably saying that Steve Jobs’s famous "Reality Distortion Field" isn’t working, or that he lost his "mojo." They might even wonder about his health. Again…

Not that the new products are really boring, but there tend to be high expectations surrounding Apple announcements. This one is no different as people expected wireless capabilities on iPods and recording capabilities on the new "media centre" box, which was in fact part of the expected new products from Apple.

But this event is significant in another way. Through it, Apple explained their strategy, revealed a number of years ago as the Digital Hub. What some have called "convergence," quite a few years ago. Nothing really new. It’s just coming into full focus.

Though we may never know how much of it unfolded as planned, Apple’s media/tech strategy may appear rather prescient in retrospect. IIRC, it started in 1996, during Gil Amelio’s tenure. Or, more probably, in 1997 during the switch between Amelio and Jobs. Even by, say, 1999, that strategy was still considered a bold move. That was before the first iPod which, itself, was before iTunes, the iTunes Music Store, and most other current media-centric technologies at Apple. It was also at a time when user-generated content was relatively unimportant. In other words, that was during the "Web 1.0" Internet bubble, before the "Web 2.0" craze for blogs, podcasts, and "social networking."

Apple isn’t the only corporation involved in the convergence between technology and the world of "content" (arts, media, entertainment). But it has played a key role. Whatever his success as a CEO, Steve Jobs has influenced the direction of change and, to an extent, shaped a part of digital life to his own liking. While he’s clearly not clueless, his vision of the link between "content" and technology is quite specific. It does integrate user-generated content of "varying degrees of professionalism" (which he joked about during his presentation) but it gives precedence to the "content industry" (involving such powerful groups and lobbies as WIPO, NAB, MPAA, RIAA, etc.). Jobs’s position at Pixar makes him a part of that industry. Which is quite different from what arts and expressive culture can be.

Jobs invites musicians on stage with him (John Mayer, Wynton Marsalis, John Legend). He respects musicians and he might even appreciate their work. But his view of their work is that they produce content to be consumed. For Jobs, music tracks, audiobooks, television episodes, movies, and music videos are all "contents" to be enjoyed by consumers. Now, the consumer can enjoy content "anywhere," as Apple is "in your den, in your living-room, in your car, and in your pocket." But what about public spaces? Concert halls, churches, coffee shops, parks, public libraries, classrooms, etc.? Oh! Apple can be there too! Yeah, of course. But those are not part of the primary vision. In Apple’s view, consumers all have their own iTunes accounts, media libraries, preferences, and content-consuming habits. A nuclear family may count as a unit to a certain extent (as Bob Iger pointed out in his "cameo appearance" during Jobs’s event). But the default mode is private consumption.

And there’s nothing wrong with that. Even the coolest things online are often based on the same model. It’s just that it’s not the only way to do things. Music, for instance, can be performed in public. In fact, it can be a collaborative process. The performers themselves need not be professionals. There’s no need for an audience, even. And there’s no need to see it as "intellectual property." Music is not a product. It’s a process by which human beings organize sound.

Ah, well…


Music, Food, Industries, Piracy

[Image: 200×125 PNG]

Noticed it in Steal This Film. A very appropriate message. Process over product. Music is not a commodity. Food does not grow on profits.

Blogged with Flock

RIAA: Still Clueless

Speaking of clues, Edgar Bronfman and his ilk still ain’t got none.

LimeWire in court: one thing leads to another

Nice Ars Technica intro:

Observe the indigenous RIAA in its native environment. Fresh off a kill, its thoughts turn immediately to its next meal… thus the woolly tusked RIAA embodies the cycle of life.