
WordPress as Content Directory: Getting Somewhere

{I tend to ramble a bit. If you just want a step-by-step tutorial, you can skip to here.}

Woohoo!

I feel like I’ve reached a milestone in a project I’ve had in mind ever since I learnt about Custom Post Types in WordPress 3.0: using WordPress as a content directory.

The concept may not be obvious to anyone else, but it’s very clear to me. And it’s probably even clearer to anyone with some level of WordPress skills (I’m still something of a WP newbie).

Basically, I’d like to set something up through WordPress to make it easy to create, review, and publish entries in content databases. WordPress is now a full Content Management System, and the kind of “content management” I’d like to enable amounts to a directory system.

Why WordPress? Almost glad you asked.

These days, several of the projects on which I work revolve around WordPress. By pure coincidence. Or because WordPress is “teh awsum.” No idea how representative my sample is. But I got to work on WordPress for (among other things): an academic association, an adult learners’ week, an institute for citizenship and social change, and some of my own learning-related projects.

There are people out there arguing about the relative value of WordPress and other Content Management Systems. Sometimes, WordPress may fall short of people’s expectations. Sometimes, the pro-WordPress rhetoric is strong enough to sound like fanboism. But the matter goes beyond marketshare, opinions, and preferences.

In my case, WordPress just happens to be a rather central part of my life, these days. To me, it’s both a question of WordPress being “the right tool for the job” and the work I end up doing being appropriate for WordPress treatment. More than a simple causality (“I use WordPress because of the projects I do” or “I do these projects because I use WordPress”), it’s a complex interaction which involves diverse tools, my skillset, my social networks, and my interests.

Of course, WordPress isn’t perfect, nor is it ideal for every situation. There are cases in which it might make much more sense to use another tool (Twitter, TikiWiki, Facebook, Moodle, Tumblr, Drupal..). And there are several things I wish WordPress did more elegantly (such as integrating all dimensions in a single tool). But I frequently end up with WordPress.

Here are some things I like about WordPress:

This last one is where the choice of WordPress for content directories starts making the most sense. Not only is it easy for me to use and build on WordPress but the learning curves are such that it’s easy for me to teach WordPress to others.

A nice example is the post editing interface (same in the software and service). It’s powerful, flexible, and robust, but it’s also very easy to use. It takes a few minutes to learn and is quite sufficient to do a lot of work.

This is exactly where I’m getting to the core idea for my content directories.

I emailed the following description to the digital content editor for the academic organization for which I want to create such content directories:

You know the post editing interface? What if instead of editing posts, someone could edit other types of contents, like syllabi, calls for papers, and teaching resources? What if fields were pretty much like the form I had created for [a committee]? What if submissions could be made by people with a specific role? What if submissions could then be reviewed by other people, with another role? What if display of these items were standardised?

Not exactly sure how clear my vision was in her head, but it’s very clear for me. And it came from different things I’ve seen about custom post types in WordPress 3.0.

For instance, the following post has been quite inspiring:

I almost had a drift-off moment.

But I wasn’t able to wrap my head around all the necessary elements. I read a number of things about custom post types and tried a few experiments. But I always got stuck at some point.

Recently, a valuable piece of the puzzle was provided by Kyle Jones (whose blog I follow because of his work on WordPress/BuddyPress in learning, a focus I share).

Setting up a Staff Directory using WordPress Custom Post Types and Plugins | The Corkboard.

As I discussed in the comments to this post, it contained almost everything I needed to make this work. But the two problems Jones mentioned were major hurdles, for me.

After reading that post, though, I decided to investigate further. I eventually got some material which helped me a bit, but it still wasn’t sufficient. Until tonight, I kept running into obstacles which made the process quite difficult.

Then, while trying to solve a problem I was having with Jones’s code, I stumbled upon the following:

Rock-Solid WordPress 3.0 Themes using Custom Post Types | Blancer.com Tutorials and projects.

This post was useful enough that I created a shortlink for it, so I could have it on my iPad and follow along: http://bit.ly/RockSolidCustomWP

By itself, it might not have been sufficient for me to really understand the whole process. But following that tutorial, I replaced the first bits of code with the neat plugins mentioned by Jones in his own tutorial: More Types, More Taxonomies, and More Fields.

I’ve now played with this a few times and can provide an actual tutorial. I’m doing the whole thing “from scratch” and will write down all the steps.

This is with the WordPress 3.0 blogging software installed on a Bluehost account. (The WordPress.com blogging service doesn’t support custom post types.) I use the default Twenty Ten theme as a parent theme.

Since I use WordPress Multisite, I’m creating a new test blog (in Super Admin->Sites, “Add New”). Of course, this wasn’t required, but it helps me make sure the process is reproducible.

Since I had already installed the three “More” plugins (though they’re not “network activated”), I go to the Plugins menu to activate each of them.

I can now create the new “Product” type, based on that Blancer tutorial. To do so, I go to the “More Types” Settings menu, I click on “Add New Post Type,” and I fill in the following information: post type names (singular and plural) and the thumbnail feature. Other options are set by default.

I also set the “Permalink base” in Advanced settings. Not sure it’s required but it seems to make sense.

I click on the “Save” button at the bottom of the page (forgot to do this, the last time).
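For those who prefer code to plugin settings, what “More Types” generates under the hood is essentially a `register_post_type()` call. Here’s a minimal sketch of the equivalent (the argument values are my assumptions based on the options above, not the plugin’s exact output):

```php
<?php
// Hypothetical equivalent of the "More Types" settings: a public
// "Product" post type with thumbnail support and a permalink base.
add_action( 'init', 'register_product_post_type' );

function register_product_post_type() {
	register_post_type( 'product', array(
		'labels' => array(
			'name'          => 'Products',  // plural post type name
			'singular_name' => 'Product',   // singular post type name
		),
		'public'   => true,
		'supports' => array( 'title', 'editor', 'thumbnail' ), // thumbnail feature
		'rewrite'  => array( 'slug' => 'product' ),            // "Permalink base"
	) );
}
```

The `rewrite` slug corresponds to the “Permalink base” set in the plugin’s Advanced settings.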

I then go to the “More Fields” settings menu to create a custom box for the post editing interface.

I add the box title and change the “Use with post types” options (no use in having this in posts).

(Didn’t forget to click “save,” this time!)

I can now add the “Price” field. To do so, I need to click on the “Edit” link next to the “Product Options” box I just created and then click “Add New Field.”

I add the “Field title” and “Custom field key”:

I set the “Field type” to Number.

I also set the slug for this field.
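Behind the scenes, “More Fields” stores each field as regular post meta under the custom field key, which is why theme code can later read the price through `get_post_custom()`. The same value can also be fetched more directly (a sketch, assuming the key “price” as set above):

```php
<?php
// Read the "price" value saved by the More Fields box for the current post.
// With $single = true, get_post_meta() returns the value itself, not an array.
$price = get_post_meta( get_the_ID(), 'price', true );
```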

I then go to the “More Taxonomies” settings menu to add a new product classification.

I click “Add New Taxonomy,” and fill in taxonomy names, allow permalinks, add slug, and show tag cloud.

I also specify that this taxonomy is only used for the “Product” type.

(Save!)
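Again, the plugin is a front end for a core API call, in this case `register_taxonomy()`. A rough equivalent of these settings (the names and arguments are my assumptions based on the options above):

```php
<?php
// Hypothetical sketch of what "More Taxonomies" registers: a "Catalog"
// classification attached only to the "product" post type.
add_action( 'init', 'register_catalog_taxonomy' );

function register_catalog_taxonomy() {
	register_taxonomy( 'catalog', 'product', array(
		'labels' => array(
			'name'          => 'Catalogs',
			'singular_name' => 'Catalog',
		),
		'hierarchical'  => true,                         // allows parent terms
		'rewrite'       => array( 'slug' => 'catalog' ), // permalink slug
		'show_tagcloud' => true,
	) );
}
```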

Now, the rest is more directly taken from the Blancer tutorial. But instead of copy-paste, I added the files directly to a Twenty Ten child theme. The files are available in this archive.

Here’s the style.css code:

/*
Theme Name: Product Directory
Theme URI: http://enkerli.com/
Description: A product directory child theme based on Kyle Jones, Blancer, and Twenty Ten
Author: Alexandre Enkerli
Version: 0.1
Template: twentyten
*/

@import url("../twentyten/style.css");

The code for functions.php:

<?php
/**
 * ProductDir functions and definitions
 *
 * @package WordPress
 * @subpackage Product_Directory
 * @since Product Directory 0.1
 */

/* Custom Columns */
add_filter("manage_edit-product_columns", "prod_edit_columns");
add_action("manage_posts_custom_column", "prod_custom_columns");

function prod_edit_columns($columns){
		$columns = array(
			"cb" => "<input type=\"checkbox\" />",
			"title" => "Product Title",
			"description" => "Description",
			"price" => "Price",
			"catalog" => "Catalog",
		);

		return $columns;
}

function prod_custom_columns($column){
		global $post;
		switch ($column)
		{
			case "description":
				the_excerpt();
				break;
			case "price":
				$custom = get_post_custom();
				echo $custom["price"][0];
				break;
			case "catalog":
				echo get_the_term_list($post->ID, 'catalog', '', ', ','');
				break;
		}
}
?>

And the code in single-product.php:

<?php
/**
 * Template Name: Product - Single
 * The Template for displaying all single products.
 *
 * @package WordPress
 * @subpackage Product_Dir
 * @since Product Directory 1.0
 */

get_header(); ?>
<div id="container">
<div id="content">
<?php the_post(); ?>

<?php
	$custom = get_post_custom($post->ID);
	$price = "$". $custom["price"][0];
?>
<div id="post-<?php the_ID(); ?>">
<h1 class="entry-title"><?php the_title(); ?> - <?=$price?></h1>
<div class="entry-meta">
<div class="entry-content">
<div style="width: 30%; float: left;">
			<?php the_post_thumbnail( array(100,100) ); ?>
			<?php the_content(); ?></div>
<div style="width: 10%; float: right;">
			Price
<?=$price?></div>
</div>
</div>
</div>
</div><!-- #content -->
</div><!-- #container -->

<?php get_footer(); ?>

That’s it!

Well, almost..

One thing is that I have to activate my new child theme.

So, I go to the “Themes” Super Admin menu and enable the Product Directory theme (this step isn’t needed with single-site WordPress).

I then activate the theme in Appearance->Themes (in my case, on the second page).

One thing I’ve learnt the hard way is that the permalink structure may not work if I don’t go and “nudge it.” So I go to the “Permalinks” Settings menu:

And I click on “Save Changes” without changing anything. (I know, it’s counterintuitive. And it’s even possible that it could work without this step. But I spent enough time scratching my head about this one that I find it important.)
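As far as I understand it, saving the Permalinks screen forces WordPress to regenerate its rewrite rules, which is how the new “product” permalink base gets picked up. The programmatic equivalent, if you’d rather not rely on the dashboard (use sparingly, as it’s an expensive operation):

```php
<?php
// Rebuild the rewrite rules so custom post type permalinks resolve.
// Equivalent to clicking "Save Changes" on the Permalinks settings screen.
flush_rewrite_rules();
```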

Now, I’m done. I can create new product posts by clicking on the “Add New” Products menu.

I can then fill in the product details, using the main WYSIWYG box as a description, the “price” field as a price, the “featured image” as the product image, and a taxonomy as a classification (by clicking “Add new” for any tag I want to add, and choosing a parent for some of them).

Now, in the product management interface (available in Products->Products), I can see the proper columns.

Here’s what the product page looks like:

And I’ve accomplished my mission.

The whole process can be achieved rather quickly, once you know what you’re doing. As I’ve been told (by the ever-so-helpful Justin Tadlock of Theme Hybrid fame, among other things), it’s important to get the data down first. While I agree with the statement and its implications, I needed to understand how to build these things from start to finish.

In fact, getting the data right is made relatively easy by my background as an ethnographer with a strong interest in cognitive anthropology, ethnosemantics, folk taxonomies (aka “folksonomies”), ethnography of communication, and ethnoscience. In other words, “getting the data” is part of my expertise.

The more technical aspects, however, were a bit difficult. I understood most of the principles and could trace several puzzle pieces, but there was a fair deal I didn’t know or hadn’t done myself. Putting together bits and pieces from diverse tutorials and posts didn’t work so well, because it wasn’t always clear what went where or what had to remain unchanged in the code. I struggled with many details, such as the fact that Kyle Jones’s code for custom columns wasn’t working: first because it was incorrectly copied, then because I was using it on a post type which was “officially” based on pages (instead of posts). Having forgotten the part about “touching” the Permalinks settings, I was unable to get a satisfying output using Jones’s explanations (the fact that he doesn’t use titles didn’t really help, in this specific case). So it was much harder for me to figure all this out than it now is for me to build content directories.

I still have some technical issues to face. Some which are near essential, such as a way to create archive templates for custom post types. Other issues have to do with features I’d like my content directories to have, such as clearly defined roles (the “More Plugins” support roles, but I still need to find out how to define them in WordPress). Yet other issues are likely to come up as I start building content directories, install them in specific contexts, teach people how to use them, observe how they’re being used and, most importantly, get feedback about their use.

But I’m past a certain point in my self-learning journey. I’ve built my confidence (an important but often dismissed component of gaining expertise and experience). I found proper resources. I understood what components were minimally necessary or required. I succeeded in implementing the system and testing it. And I’ve written enough about the whole process that things are even clearer for me.

And, who knows, I may get feedback, questions, or advice..


Minds of All Sizes Think Alike

Or «les esprits de toutes tailles se rencontrent».

This post is a response to the following post about Social Network Analysis (SNA), social change, and communication.

…My heart’s in Accra » Shortcuts in the social graph.

I have too many disparate things to say about that post to make it into a neat and tidy “quickie,” yet I feel like I should probably be working on other things. So we’ll see how this goes.

First, a bit of context..

[This “bit of context” may be a bit long so, please bear with me. Or you could get straight to the point, if you don’t think you can bear the context bit.]

I’ve never met Ethan Zuckerman (@EthanZ), who wrote the post to which I’m responding. And I don’t think we’ve had any extended conversation in the past. Further, I doubt that I’m on his radar. He’s probably seen my name, since I’ve commented on some of his posts and some of his contacts may have had references to me through social media. But I very much doubt that he’s ever mentioned me to anyone. I’m not noticeable to him.

I, on the other hand, have mentioned Zuckerman on several occasions. Latest time I remember was in class, a few weeks ago. It’s a course on Africa and I was giving students a list of online sources with relevance to our work. Zuckerman’s connection to Africa may not be his main thing, despite his blog’s name, but it’s part of the reason I got interested in his work, a few years ago.

In fact, there’s something embarrassing, here.. I so associate Zuckerman with Africa that my mind can’t help but link him to Erik Hersman, aka White African. I did meet Hersman. [To be exact, I met Erik at BarCampAustin, which is quite possibly the conference-like event which has had the most influence on me, in the past few years (I go to a lot of these events).] When I did meet Hersman, I made a faux-pas in associating him with Zuckerman. Good-natured as he seemed to be, Hersman smiled as he corrected me.

EthanZ and I have other contacts in common. Jeremy Clarke, for instance, who co-organizes WordCamp Montreal and has been quite active in Montreal’s geek scene. Jeremy’s also a developer for Global Voices, a blogging community that Zuckerman co-founded. I’m assuming Clarke and Zuckerman know each other.

Another mutual contact is Christopher Lydon, host of Radio Open Source. Chris and I have exchanged a few emails, and Zuckerman has been on ROS on a few occasions.

According to Facebook, Zuckerman and I have four contacts in common. Apart from Clarke and Hersman, there’s P. Kerim Friedman and Gerd Leonhard. Kerim is a fellow linguistic anthropologist and we’ve collaborated on the official Society for Linguistic Anthropology (SLA) site. I got in touch with Leonhard through “Music 2.0” issues, as he was interviewed by Charles McEnerney on Well-Rounded Radio.

On LinkedIn, Zuckerman is part of my third degree, with McEnerney as one of my first-degree contacts who could connect me to Zuckerman, through Zuckerman’s contacts.

(Yes, I’m fully aware of the fact that I haven’t named a single woman in this list. Nor someone who doesn’t write in English with some frequency, for that matter.)

By this time, my guess is that you may be either annoyed or confused. “Surely, he can’t be that obsessed with Zuckerman as to stalk him in every network.”

No, I’m not at all obsessed with Ethan Zuckerman in any way, shape, or form. Though I mention him on occasion and I might have a good conversation with him if the occasion arises, I wouldn’t go hang out in Cambridge just in case I might meet him. Though I certainly respect his work, I wouldn’t treat him as my “idol” or anything like that. In other words, he isn’t a focus in my life.

And that’s a key point, to me.

In certain contexts, when social networks are discussed, too much is made of the importance of individuals. Yet, there’s something to be said about relative importance.

In his “shortcuts” post, Zuckerman talks about a special kind of individuals. Those who are able to bypass something of a clustering effect happening in many human networks. Malcolm Gladwell (probably “inspired” by somebody else) has used “connectors” to label a fairly similar category of people and, given Gladwell’s notoriety in some circles, the name has resonance in some contexts (mostly “business-focused people,” I would say, with a clear idea in my mind of the groupthink worldview implied).

In one of my earliest blogposts, I talked about an effect happening through a similar mechanism, calling it the “Social Butterfly Effect” (SBE). I still like it, as a concept. Now, I admit that it focuses on a certain type of individuals. But it’s more about their position in “the grand scheme of things” than about who they are, though I do associate myself with this “type.”

The basic idea is quite simple. People who participate in different (sub)networks, who make such (sub)networks sparser, are having unpredictable and unmeasurable effects on what is transmitted through the network(s).

On one hand, it’s linked to my fragmentary/naïve understanding of the Butterfly Effect in the study of climate and as a component of Chaos Theory.

On the other hand, it’s related to Granovetter’s well-known notion of “weak ties.” And it seems like Granovetter is making something of a comeback, as we discuss different mechanisms behind social change.

Interestingly, much of what is being said about weak ties, these past few weeks, relates to Gladwell’s flamebait apparent lack of insight in describing current social processes. Sounds like Gladwell may be too caught up in the importance of individuals to truly grok the power of networks.

Case in point.. One of the most useful pieces I’ve read about weak ties, recently, was Jonah Lehrer’s direct response to Gladwell:

Weak Ties, Twitter and Revolution | Wired Science | Wired.com.

Reading Lehrer’s piece, one gets the clear impression that Gladwell hadn’t “done his homework” on Granovetter before launching his trolling “controversial” piece on activism.

But I digress. Slightly.

Like the Gladwell-specific coverage, Zuckerman’s blogpost is also about social change and he’s already responded to Gladwell. One way to put it is that, as a figure, Gladwell has shaped the discussion in a way similar to a magnetic field orienting iron filings around it. Since it’s a localized effect having to do with polarization, the analogy is fairly useful, as analogies go.

Which brings me to groupthink, the apparent target of Zuckerman’s piece.

Still haven’t read Irving Janis but I’ve been quite interested in groupthink for a while. Awareness of the concept is something I immediately recognize, praise, and associate with critical thinking.

In fact, it’s one of several things I was pleasantly surprised to find in an introductory sociology WikiBook I ended up using in my  “Intro. to Society” course, last year. Critical thinking was the main theme of that course, and this short section was quite fitting in the overall discussion.

So, what of groupthink and networks? Zuckerman sounds worried:

This is interesting to me because I’m intrigued – and worried – by information flows through social networks. If we’re getting more (not lots yet, but more) information through social networks and less through curated media like newspapers, do we run the risk of encountering only information that our friends have access to? Are we likely to be overinformed about some conversations and underinformed about others? And could this isolation lead to ideological polarization, as Cass Sunstein and others suggest? And if those fears are true, is there anything we can do to rewire social networks so that we’re getting richer, more diverse information?

Similar questions have animated many discussions in media-focused circles, especially in those contexts where the relative value (and meaning) of “old vs. new media” may be debated. At about the same time as I started blogging, I remember discussing things with a statistician friend about the polarization effect of media, strong confirmation bias in reading news stories, and political lateralization.

In the United States, especially, there’s a narrative (heard loud and clear) that people who disagree on some basic ideas are unable to hear one another. “Shockingly,” some say, “conservatives and liberals read different things.” Or “those on (the) two sides of (the) debate understand things in completely different ways.” It even reminds me of the connotations of Tannen’s book title, You Just Don’t Understand. Irreconcilable differences. (And the first time I mention a woman in this decidedly imbalanced post.)

While, as a French-Canadian ethnographer, my perspective is quite different from Zuckerman’s, I can’t help but sympathize with the feeling. Not that I associate groupthink with a risk in social media (au contraire!). But, like Zuckerman, I wish to find ways to move beyond these boundaries we impose on ourselves.

Zuckerman specifically discusses the attempt by Onnik Krikorian (@OneWMPhoto) to connect Armenians (at least those in Hayastan) and Azeris, with Facebook “affording” Krikorian some measure of success. This case is now well-known in media-centric circles and it has almost become shorthand for the power of social media. Given a personal interest in Armenians (at least in the Diaspora), my reaction to Krikorian’s success is less related to the media aspect than to the personal one.

At a personal level, boundaries may seem difficult to surmount but they can also be fairly porous and even blurry. Identity may be negotiated. Individuals crossing boundaries may be perceived in diverse ways, some of which have little to do with other people crossing the same boundaries. Things are lived directly, from friendships to wars, from breakups to reconciliations. Significant events happen regardless of the way  they’re being perceived across boundaries.

Not that boundaries don’t matter but they don’t necessarily circumscribe what happens in “personal lives.” To use a seemingly arbitrary example, code-switching doesn’t “feel” strange at an individual level. It’s only when people insist on separating languages using fairly artificial criteria that alternance between them sounds awkward.

In other words, people cross boundaries all the time and “there’s nothing to it.”

Boundaries have quite a different aspect, at the macrolevel implied by the journalistic worldview (with nation-based checkbox democracy at its core and business-savvy professionalization as its mission). To “macros” like journos and politicos, boundaries look like borders, appearing clearly on maps (including mind ones) and implying important disconnects. The border between Armenia and Azerbaijan is a boundary separating two groups and the conflicts between these two groups reify that boundary. Reaching out across the border is a diplomatic process and necessitates finding the right individuals for the task. Most of the important statuses are ascribed, which may sound horrible to some holding neoliberal ideas about freewill and “individual freedoms.”

Though it’s quite common for networked activities to be somewhat constrained by boundaries, a key feature of networks is that they’re typically boundless. Sure, there are networks which are artificially isolated from the rest. The main example I can find is that of a computer virology laboratory.

Because, technically, you only need one link between two networks to transform them into a single network. So, it’s quite possible to perceive Verizon’s wireless network as a distinct entity, limited by the national boundaries of the U.S. of A. But the simple fact that someone can use Verizon’s network to contact someone in Ségou shows that the network isn’t isolated. Simple, but important to point out.

Especially since we’re talking about a number of things happening on a single network: The Internet. (Yes, there is such a thing as Internet2 and there are some technical distinctions at stake. But we’re still talking about an interconnected world.)

As is well-known, there are significant clusters in this One Network. McLuhan’s once-popular “Global Village” fallacy used to hide this, but we now fully realize that language barriers, national borders, and political lateralization go with “low-bandwidth communication,” in some spots of The Network. “Gs don’t talk to Cs so even though they’re part of the same network, there’s a weak spot, there.” In a Shannon/Weaver view, it sounds quite important to identify these weak spots. “Africa is only connected to North America via a few lines so access is limited, making things difficult for Africans.” Makes sense.

But going back to weak ties, connectors, Zuckerman’s shortcuts, and my own social butterflies, the picture may be a little bit more fleshed out.

Actually, the image I have in mind has, on one side, a wire mesh serving as the floor of an anechoic chamber  and on the other some laser beams going in pseudorandom directions as in Entrapment or Mission Impossible. In the wire mesh, weaker spots might cause a person to fall through and land on those artificial stalagmites. With the laser beams, the pseudorandom structure makes it more difficult to “find a path through the maze.” Though some (engineers) may see the mesh as the ideal structure for any network, there’s something humanly fascinating about the pseudorandom structure of social networks.

Obviously, I have many other ideas in mind. For instance, I wanted to mention “Isabel Wilkerson’s Leaderless March that Remade America.” Or go back to that intro soci Wikibook to talk about some very simple and well-understood ideas about social movements, which often seem to be lacking in discussions of social change. I even wanted to recount some anecdotes of neat network effects in my own life, such as the serendipity coming from discussing disparate subjects with unlike people, or the misleading impression that measuring individualized influence is a way to understand social media. Not to mention a whole part I had in my mind about Actor Network Theory, non-human actors, and material culture (the other course I currently teach).

But I feel like going back to more time-sensitive things.

Still, I should probably say a few words about this post’s title.

My mother and I were discussing parallel inventions and polygenesis with the specific theme of moving away from the focus on individualized credit. My favourite example, and one I wish Gladwell (!) had used in Outliers (I actually asked him about it) is that of Gregor Mendel and the “rediscovery” of his laws by de Vries, Correns, and Tschermak. A semi-Marxian version of the synchronous polygenesis part might hold that “ideas are in the air” or that the timing of such discoveries and inventions has to do with zeitgeist. A neoliberal version could be the “great minds think alike” expression or its French equivalent «Les grands esprits se rencontrent» (“The great spirits meet each other”). Due to my reluctance in sizing up minds, I’d have a hard time using that as a title. In the past, I used a similar title to refer to another form of serendipity:

To me, most normally constituted minds are “great,” so I still could have used the expression as a title. But an advantage of tweaking an expression is that it brings attention to what it implies.

In this case, the “thinking alike” may be a form of groupthink.

 


Getting Started with Sorbet, Yogurt, Juice

Got a sorbet maker yesterday and I made my first batch today.

Had sent a message on Chowhound:

Store-Bought Simple Syrup in Sorbet? – Home Cooking – Chowhound.

Received some useful answers but didn’t notice them until after I made my sorbet.

Here’s my follow-up:

Hadn’t noticed replies were added (thought I’d receive notifications). Thanks a lot for all the useful advice!

And… it did work.

My lemon-ginger sorbet was a bit soft on its way out of the machine, but the flavour profile is exactly what I wanted.

I used almost a liter of this store-bought syrup with more than a half-liter of a ginger-lemon concoction I made (lemon juice and food-processed peeled ginger). All of this blended together. The resulting liquid was more than the 1.5 quarts my sorbet-maker can hold, so I reserved a portion to mix with a syrup made from ginger peel infused in a brown sugar and water solution. I also made a simple syrup to which I added a good quantity of lemon zest. These two syrups I pressure-cooked and will use in later batches.

Judging the amount of sugar may be tricky but, in this case, I decided to go by taste. It’s not sweet enough according to some who tried it but it’s exactly what I wanted. Having this simple syrup on hand (chilled) was quite helpful, as I could adjust directly by adding syrup to the mix.

One thing is for sure, I’ll be doing an apple-ginger sorbet soon. The ginger syrup I made just cries out “apple sauce sorbet.” Especially the solids (which I didn’t keep in the syrup). I might even add some homemade hard cider that I like.

As for consistency, it’s not even a problem but I get the impression that the sorbet will get firmer as it spends time in the freezer. It’s been there for almost two hours already and I should be able to leave it there for another two hours before I bring it out (actually, traveling with it). The machine’s book mentions two hours in the freezer for a firmer consistency and I’ve seen several mentions of “ripening” so it sounds like it’d make sense to do this.

Also, the lemon-ginger mixture I used wasn’t chilled, prior to use. It may have had an impact on the firmness, I guess…

As a first attempt at sorbet-making, it’s quite convincing. I’ve had a few food-related hobbies, in the recent past, and sorbet-making might easily take some space among them, especially if results are this satisfying without effort. I was homebrewing beer until recently (and will probably try beer sorbets, as I’ve tasted some nice ones made by friends and I have a lot of leftover beer from the time I was still brewing). By comparison to homebrewing, sorbet-making seems to be a (proverbial) “piece of cake.”

Dunno if such a long tirade violates any Chow forum rule but I just wanted to share my first experience.

Thanks again for all your help!

Alex

Among links given in this short thread was the sorbet section of the French Cooking guide on About.com. I’ve already had good experiences with About’s BBQ Guide. So my preliminary impression of these sorbet recipes is rather positive. And, in fact, they’re quite inspiring.

A sampling:

  • Apple and Calvados
  • Beet
  • Cardamom/Pear
  • Pomegranate/Cranberry
  • Spiced Apple Cider

It’s very clear that, with sorbet, the only limit is your imagination. I’ll certainly make some savoury sorbets, including this spiced tomato one. Got a fairly large number of ideas for interesting combinations. But, perhaps unsurprisingly, it seems that pretty much everything has been tried. Hibiscus flowers, hard cider, horchata, teas…

And using premade syrups is probably a good strategy, especially if they’re mixed with fresh purées, possibly made from frozen fruit. Being able to adjust sweetness is a nice advantage. One comment in that Chowhound thread did mention a kind of “homemade hydrometer” technique (a clean egg floating in the liquid…), the notion apparently being that the quantity of sugar matters in and of itself (for texture and such) and that flavour can be adjusted with other ingredients, including acids.

One reason I like sorbets so much is that I’m lactose intolerant. More specifically: I discovered fairly recently that I was lactose intolerant. So I’m not completely weaned from ice cream. It’s not the ideal time to start making sorbet as the weather isn’t that warm, at this point. But I’ve never had an objection to sorbet in cold weather.

As happened with other hobbies, I’ve been having some rather crazy ideas. And chances are that this won’t be my last sorbet making machine.

Nor will it only be a sorbet machine. While I have no desire to make ice cream in it, it’s already planned as a “froyo” maker and a “frozen soy-based dessert” maker. In some cases, I actually preferred frozen yogurt and frozen tofu to ice cream (maybe because my body was telling me to avoid lactose). And I’m getting a yogurt maker soon, which will be involved in all sorts of yogurt-based experiments from “yogurt cheese” (lebneh) to soy-milk “yogurt” and even whey-enhanced food (from the byproduct of lebneh). So, surely, frozen yogurt will be involved.

And I didn’t mention my juicer, here (though I did mention it elsewhere). Not too long ago, I was using a juicer on a semi-regular basis and remember how nice the results were, sometimes using unlikely combinations (cucumber/pineapple being a friend’s favourite which was relatively convincing). A juicer will also be useful in preparing sorbets, I would guess. Sure, it’s probably a good idea to have a thicker base than juice for a firm sorbet, but I might actually add banana, guava, or fig to some sorbets. Besides, the solids left behind by the juice extraction can be made into interesting things too and possibly added back to the sorbet base. I can easily imagine how it’d work with apples and some vegetables.

An advantage of all of this is that it’ll directly increase the quantity of fruits and vegetables I consume. Juices are satisfying and can be made into soups (which I also like). Yogurt itself I find quite appropriate in my diet. And there surely are ways to have low-sugar sorbet. These are all things I enjoy on their own. And they’re all extremely easy to make (I’ve already made yogurt and juice, so I don’t foresee any big surprise). And they all fit in a lactose-free (or, at least, low-lactose) diet.

Food is fun.


Academics and Their Publics

Misunderstood by Raffi Asdourian

Academics are misunderstood.

Almost by definition.

Pretty much any academic eventually feels that s/he is misunderstood. Misunderstandings about core notions in just about any academic field are behind some of the most common pet peeves among academics.

In other words, there’s nothing as transdisciplinary as misunderstanding.

It can happen in the close proximity of a given department (“colleagues in my department misunderstand my work”). It can happen through disciplinary boundaries (“people in that field have always misunderstood our field”). And, it can happen generally: “Nobody gets us.”

It’s not paranoia and it’s probably not self-victimization. But there almost seems to be a form of “onedownmanship” at stake, with academics from different disciplines claiming that they’re more misunderstood than others. In fact, I personally get the feeling that ethnographers are among the most misunderstood people around, but even short discussions with friends in other fields (including mathematics) have helped me see that, basically, we’re all misunderstood at the same “level”; there are simply variations in the ways we’re misunderstood. For instance, anthropologists in general are mistaken for what they aren’t, based on partial understanding by the general population.

An example from my own experience, related to my decision to call myself an “informal ethnographer.” When you tell people you’re an anthropologist, they form an image in their minds which is very likely to be inaccurate. But they do typically have an image in their minds. On the other hand, very few people have any idea about what “ethnography” means, so they’re less likely to form an opinion of what you do from prior knowledge. They may puzzle over the term and try to take a guess as to what “ethnographer” might mean but, in my experience, calling myself an “ethnographer” has been a more efficient way to be understood than calling myself an “anthropologist.”

This may all sound like nitpicking but, from the inside, it’s quite impactful. Linguists are frequently asked about the number of languages they speak. Mathematicians are taken to be number freaks. Psychologists are perceived through the filters of “pop psych.” There are many stereotypes associated with engineers. Etc.

These misunderstandings have an impact on anyone’s work. Not only can they be demoralizing and affect one’s sense of self-worth, but they can influence funding decisions as well as the use of research results. These misunderstandings can undermine learning across disciplines. In survey courses, basic misunderstandings can make things very difficult for everyone. At a rather basic level, academics fight misunderstandings more than they fight ignorance.

The main reason I’m discussing this is that I’ve been given several occasions to think about the interface between the Ivory Tower and the rest of the world. It’s been a major theme in my blogposts about intellectuals, especially the ones in French. Two years ago, for instance, I wrote a post in French about popularizers. A bit more recently, I’ve been blogging about specific instances of misunderstandings associated with popularizers, including Malcolm Gladwell’s approach to expertise. Last year, I did a podcast episode about ethnography and the Ivory Tower. And, just within the past few weeks, I’ve been reading a few things which all seem to me to connect with this same issue: common misunderstandings about academic work. The connections are my own, and may not be so obvious to anyone else. But they’re part of my motivations to blog about this important issue.

In no particular order:

But, of course, I think about many other things. Including (again, in no particular order):

One discussion I remember, which seems to fit, included comments about Germaine Dieterlen by a friend who also did research in West Africa. Can’t remember the specifics but the gist of my friend’s comment was that “you get to respect work by the likes of Germaine Dieterlen once you start doing field research in the region.” In my academic background, appreciation of Germaine Dieterlen’s work may not be unconditional, but it doesn’t necessarily rely on extensive work in the field. In other words, while some parts of Dieterlen’s work may be controversial and it’s extremely likely that she “got a lot of things wrong,” her work seems to be taken seriously by several French-speaking africanists I’ve met. And not only do I respect everyone but I would likely praise someone who was able to work in the field for so long. She’s not my heroine (I don’t really have heroes) or my role-model, but it wouldn’t have occurred to me that respect for her wasn’t widespread. If it had seemed that Dieterlen’s work had been misunderstood, my reflex would possibly have been to rehabilitate her.

In fact, there’s a strong academic tradition of rehabilitating deceased scholars. The first example which comes to mind is a series of articles (PDF, in French) and book chapters by UWO linguistic anthropologist Regna Darnell about “Benjamin Lee Whorf as a key figure in linguistic anthropology.” Of course, saying that these texts by Darnell constitute a rehabilitation of Whorf reveals a type of evaluation of her work. But that evaluation comes from a third person, not from me. The likely reason for this case coming to my mind is that the so-called “Sapir-Whorf Hypothesis” is among the most misunderstood notions from linguistic anthropology. Moreover, both Whorf and Sapir are frequently misunderstood, which can make matters difficult for many linguistic anthropologists talking with people outside the discipline.

The opposite process is also common: the “slaughtering” of “sacred cows.” (First heard about sacred cows through an article by ethnomusicologist Marcia Herndon.) In some significant ways, any scholar (alive or not) can be the object of not only critiques and criticisms but a kind of off-handed dismissal. Though this often happens within an academic context, the effects are especially lasting outside of academia. In other words, any scholar’s name is likely to be “sullied,” at one point or another. Typically, there seems to be a correlation between the popularity of a scholar and the likelihood of her/his reputation being significantly tarnished at some point in time. While there may still be people who treat Darwin, Freud, Nietzsche, Socrates, Einstein, or Rousseau as near divinities, there are people who will avoid any discussion about anything they’ve done or said. One way to put it is that they’re all misunderstood. Another way to put it is that their main insights have seeped through “common knowledge” but that their individual reputations have decreased.

Perhaps the most difficult case to discuss is that of Marx (Karl, not Harpo). Textbooks in introductory sociology typically have him as a key figure in the discipline and it seems clear that his insight on social issues was fundamental in social sciences. But, outside of some key academic contexts, his name is associated with a large series of social events about which people tend to have rather negative reactions. Even more so than for Paul de Man or Martin Heidegger, Marx’s work is entangled in public opinion about his ideas. Haven’t checked for examples but I’m quite sure that Marx’s work is banned in a number of academic contexts. However, even some of Marx’s most ardent opponents are likely to agree with several aspects of Marx’s work and it’s sometimes funny how Marxian some anti-Marxists may be.

But I digress…

Typically, the “slaughtering of sacred cows” relates to disciplinary boundaries instead of social ones. At least, there’s a significant difference between your discipline’s own “sacred cows” and what you perceive another discipline’s “sacred cows” to be. Within a discipline, the process of dismissing a prior scholar’s work is almost œdipean (speaking of Freud). But dismissal of another discipline’s key figures is tantamount to a rejection of that other discipline. It’s one thing for a physicist to show that Newton was an alchemist. It’d be another thing entirely for a social scientist to deconstruct James Watson’s comments about race or for a theologian to argue with Darwin. Though discussions may have to do with individuals, the effects of the latter can widen gaps between scholarly disciplines.

And speaking of disciplinarity, there’s a whole set of issues having to do with discussions “outside of someone’s area of expertise.” On one side, comments made by academics about issues outside of their individual areas of expertise can be very tricky and can occasionally contribute to core misunderstandings. The fear of “talking through one’s hat” is quite significant, in no small part because a scholar’s prestige and esteem may greatly decrease as a result of some blatantly inaccurate statements (although some award-winning scholars seem not to be overly impacted by such issues).

On the other side, scholars who have to impart expert knowledge to people outside of their discipline often have to “water down” or “boil down” their ideas, in effect oversimplifying these issues and concepts. Partly because of status (prestige and esteem), lowering standards is also very tricky. In some ways, this second situation may be more interesting. And it seems unavoidable.

How can you prevent misunderstandings when people may not have the necessary background to understand what you’re saying?

This question may reveal a rather specific attitude: “it’s their fault if they don’t understand.” Such an attitude may even be widespread. Seems to me, it’s not rare to hear someone gloating about other people “getting it wrong,” with the suggestion that “we got it right.” As part of negotiations surrounding expert status, such an attitude could even be a pretty rational approach. If you’re trying to position yourself as an expert and don’t suffer from an “impostor syndrome,” you can easily get the impression that non-specialists have it all wrong and that only experts like you can get to the truth. Yes, I’m being somewhat sarcastic and caricatural, here. Academics aren’t frequently that dismissive of other people’s difficulties understanding what seem like simple concepts. But, in the gap between academics and the general population, a special type of intellectual snobbery can sometimes be found.

Obviously, I have a lot more to say about misunderstood academics. For instance, I wanted to address specific issues related to each of the links above. I also had pet peeves about widespread use of concepts and issues like “communities” and “Eskimo words for snow” about which I sometimes need to vent. And I originally wanted this post to be about “cultural awareness,” which ends up being a core aspect of my work. I even had what I might consider a “neat” bit about public opinion. Not to mention my whole discussion of academic obfuscation (remind me about “we-ness and distinction”).

But this is probably long enough and the timing is right for me to do something else.

I’ll end with an unverified anecdote that I like. This anecdote speaks to snobbery toward academics.

[It’s one of those anecdotes which was mentioned in a course I took a long time ago. Even if it’s completely fallacious, it’s still inspiring, like a tale, cautionary or otherwise.]

As the story goes (at least, what I remember of it), some ethnographers had been doing fieldwork in an Australian cultural context and were focusing their research on a complex kinship system known in that context. Through collaboration with “key informants,” the ethnographers eventually succeeded in understanding some key aspects of this kinship system.

As should be expected, these kinship-focused ethnographers wrote accounts of this kinship system at the end of their field research and became known as specialists of this system.

After a while, the fieldworkers went back to the field and met with the same people who had described this kinship system during the initial field trip. Through these discussions with their “key informants,” the ethnographers ended up hearing about a radically different kinship system from the one about which they had learnt, written, and taught.

The local informants then told the ethnographers: “We would have told you earlier about this but we didn’t think you were able to understand it.”


Moving On

[I’m typically not very good at going back to drafts and I don’t have much time to write this. But I can RERO this. It’s an iterative process in any case….]

Been thinking about different things which all relate to the same theme: changing course, seizing opportunities, shifting focus, adapting to new situations, starting over, getting a clean slate… Moving on.

One reason is that I recently decided to end my ethnography podcast. Not that major a decision and rather easy to make. Basically, I had stopped doing it but I had yet to officially end it. I had to make it clear, in my mind, that it’s not part of the things I’m doing, these days. Not that it was a big thing in my life but I had set reminders every month that I had to record a podcast episode. It worked for ten episodes (in ten months) but, once I had missed one episode, the reminder was nagging me more than anything else.

In this sense, “moving on” is realistic/pragmatic. Found something similar in Getting Things Done, by David Allen.

It’s also similar to something Larry Lessig called “email bankruptcy,” as a step toward enhanced productivity.

In fact, even financial bankruptcy can relate to this, in some contexts. In Canada, at least, bankruptcy is most adequately described as a solution to a problem, not the problem itself. I’ve known some people who were able to completely rebuild their finances after declaring bankruptcy, sometimes even getting a better credit rating than someone who hadn’t gone bankrupt. I know how strongly some people may react to this concept of bankruptcy (based on principle, resentment, fears, hopes…). It’s an extreme example of what I mean by “moving on.” It goes well with the notion, quite common in North American cultural contexts, that you always deserve a second chance (but that you should do things yourself).

Of course, similar things happen with divorces which, similarly, can often be considered as solutions to a problem rather than the problem itself. No matter how difficult or how bad divorce might be, it’s a way to start over. In some sense, it’s a less extreme example than the bankruptcy one. But it may still generate negative vibes or stir negative emotions.

Because what I’m thinking about has more to do with “turning over a new leaf.” And taking the “leap of faith” which will make you go where you feel more comfortable. I’m especially thinking about all sorts of cases of people who decided to make radical changes in their professional or personal lives, often leaving a lot behind. Whether they were forced to implement such changes or decided to jump because they simply wanted to, all of the cases I remember have had positive outcomes.

It reminds me of a good friend of mine with whom I went through music school, in college. When he finished college, he decided to follow the music path and registered for the conservatory. But, pretty quickly, he realized that it wasn’t for him. Even though he had been intensely “in music” for several years, within days of entering the conservatory, he saw that music wasn’t to remain the central focus of his career. Through a conversation with a high school friend (who later became his wife and the mother of his children), he found out that it wasn’t too late for him to register for university courses. He had been thinking about phys. ed., and thought it might be a nice opportunity to try that path. He’s been a phys. ed. teacher for a number of years. We had lunch together last year and he seems very happy with his career choice. He also sounds like a very dedicated and effective phys. ed. teacher.

In my last podcast episode, I mentioned a few things about my views of this “change of course.” Including what has become something of an expression, for me: “Done with fish.” Comes from the movie Adaptation. The quote is found here (preceded by a bit of profanity). Basically, John Laroche, who was passionately dedicated to fish, decided to completely avoid anything having to do with fish. I can relate to this at some rather deep level.

I’m also thinking about the negative consequences of “sticking with” something which isn’t working, shifting too late or too quickly, implementing changes in inappropriate ways. Plenty of examples there. Most of the ones which come to my mind have to do with business settings. One which would require quite a bit of “explaining” is my perception of Google’s strategy with Wave. Put briefly (with the hope of revisiting this issue), I think Google made bad decisions with Wave, including killing it both too late and too early (no, I don’t see this as a contradiction; but I don’t have time to explain it). They also, I feel, botched a few transitions, in this. And, more importantly, I’d say that they failed to adapt the product to what was needed.

And the trigger for several of my reflections on this “moving on” idea have to do with this kind of adaptation (fun that the movie of that name should be involved, eh?). Twitter could be an inspiration, in this case. Not only did they, like Flickr, start through a switch away from another project, but Twitter was able to transform users’ habits into the basis for some key features. Hashtags and “@replies” are well-known examples. But you could even say that most of the things they’ve been announcing have been related to the way people use their tools.

So, in a way, it’s about the balance between vision and responsiveness. Vision is often discussed and it sounds to some people like a key thing in any “task-based group” (from a team to a corporation). But the way a team can switch from one project to the next based on feedback (from users or other stakeholders) seems underrated. Although there is some talk about the “startup mentality” in many contexts, including Google and Apple. Words which fit this semantic field include: “agile,” “flexible,” “pivot,” “lean,” and “nimble” (the latter word seemed to increase in currency after being used by Barack Obama in a speech).

Anyhoo… Gotta go.

But, just before I go: I am moving on with some things (including my podfade but also a shift away from homebrewing). But the key things in my life are very stable, especially my sentimental life.


Values of Content

Wannabe Guru: “There’s no such thing as free content.”

Literature Major: “Content’s a tale / Told by an idiot, full of sound and fury, / Signifying nothing.”

Arts Major: “Content Is in the Eye of the Beholder.”

Entertainer: “There’s no content / like show content / like no content I know.”

Journalist: “Content is my job and I deserve to be paid for what I make, the exact same way that a baker is paid for selling bread. What other people called ‘content’ isn’t really content since it hasn’t been vetted by professionals like my editor. So my role is to create content so that my editor can distribute it through exclusive channels. Other people’s content becomes my content when I secure the rights to it through the use of a clearance service. Comments by people I interview only become content after they sign a release. Everything else is noise.”

Economist: “There are four ways to get paid for content: a) subscription; b) advertising; c) private or public sponsorship; d) sale on media. Since advertising and sponsorship are two aspects of the same model and since consumers spend money on either subscription or media sales, there are two basic models.”

Functionalist (Sociology): “Content serves different goals, both manifest and latent.”

Conflict-Theorist (Sociology): “Providing free content is a way for the ruling class to make the audience into a commodity.”

Interactionist (Sociology): “Content provides meaning to social selves.”

Moralist: “Content Yourself.”

Buddhist: “Content breeds contentment.”

Christian: “Content begat content.”

Geek: “Content Wants to be Free.”

Judge: “Our mission is to balance the rights of content creators with those of content consumers.”

Cop: “There are three forms of content: content that is appropriate for everyone, content which is only appropriate to certain people, and content which isn’t appropriate for anyone.”

Teenage Boy: “Where can I find naked pictures of that cute girl in my class?”

Teenage Girl: “How can I get in touch with that dreamy guy in that video?”


What Not to Tweet

Here’s a list I tweeted earlier.

Twenty Things You Should Never, Ever Tweet for Fear of Retaliation from the Tweet Police

  1. Lists. Too difficult to follow.
  2. Do’s and don’ts. Who died and made you bandleader?
  3. Personal thoughts. Nobody cares what anyone else thinks, anyway.
  4. Anything in a foreign language. It confuses everyone.
  5. Personal opinions. You may offend someone.
  6. Jokes. Same reason as #5.
  7. Links. Too dangerous, since some could be malicious.
  8. Anything in “the second degree.” The bareness of context prevents careful reading.
  9. Anything insightful. Who do you think you are?
  10. Personal replies. Can’t you get a room?
  11–20. What @oatmeal said you shouldn’t tweet. If it’s funny, it must be true.

In case it wasn’t clear… Yes, I mean this as sarcasm. One of my pet peeves is to hear people tell others what to do or not to do, without appropriate context. It’s often perceived to be funny or useful but, to be honest, it just rubs me the wrong way. Sure, they’re allowed to do it. I won’t prevent them. I don’t even think they should stop, that’s really not for me to decide. It’s just that, being honest with myself, I realize how negative an effect it has on me. It actually reaches waaaaay down into something I don’t care to visit very often.

The Oatmeal can be quite funny. Reading a few of these comics, recently, I literally LOLed. And this one probably pleased a lot of people, because it described some of their own pet peeves. Besides, it’s an old comic, probably coming from a time when tweets were really considered to be answers to the original Twitter prompt: “What are you doing?” (i.e., before the change to the somewhat more open “What’s happening?”). But I’ve heard enough expressions of what people should or shouldn’t do with a specific social media system that I felt the need to vent. So, that was the equivalent of a rant (and this post is closer to an actual rant).

I mean, there’s a huge difference between saying “these are the kinds of uses for which I think Twitter is the appropriate tool” and the flat-out dismissal of what others have done. While Twitter is old news, as social media go, it’s still unfolding and much of its strength comes from the fact that we don’t actually have a rigid notion of what it should be.

Not that there aren’t uses of Twitter I dislike. In fact, for much of 2009, I felt it was becoming too commercial for my taste. I felt there was too much promotion of commercial entities and products, and that it was relatively difficult to avoid such promotional tweets if one were to follow the reciprocation principle (“I really should make sure I follow those who follow me, even if a large proportion of them are just trying to increase their follower counts”). But none of this means that “Twitter isn’t for commercial promotion.” Structurally, Twitter almost seems to be made for such uses. Conceptually, it comes from the same “broadcast” view of communication, shared by many marketers, advertisers, PR experts, and movie producers. As social media tools go, Twitter is among the most appropriate ones to use to broadly distribute focused messages without having to build social relationships. So, no matter how annoyed I may get at these tweets and at commercial Twitterers, it’d be inaccurate to say that “Twitter isn’t for that.” Besides, “Twitter, Inc.” has adopted commercial promotion as a major part of its “business model.” No matter what one feels about this (say, that it’s not very creative or that it will help distinguish between commercial tweets and the rest of Twitter traffic), it seems to imply that Twitter is indeed about commercial promotion as much as it is about “shar[ing] and discover[ing] what’s happening now.”

The same couldn’t be said about other forms of tweeting that others may dislike. It’d be much harder to make a case for, say, conference liveblogging as being an essential part of what Twitter is about. In fact, some well-known and quite vocal people have made pronouncements about how inappropriate, in their minds, such a practice was. To me, much of it sounds like attempts at rationalizing a matter of individual preference. Some may dislike it but Twitter does make a very interesting platform for liveblogging conferences. Sure, we’ve heard about the negative consequences of the Twitter backchannel at some high-profile events. And there are some technical dimensions of Twitter which make liveblogging potentially more annoying, to some users, than if it were on another platform. But claiming that Twitter isn’t for liveblogging reveals a rather rigid perspective of what social media can be. Again, one of the major strengths in Twitter is its flexibility. From “mentions” and “hashtags” to “retweets” and metadata, the platform has been developing over time based on usage patterns.

For one thing, it’s now much more conversational than it was in 2007, and some Twitter advocates are quite proud of that. So one might think that Twitter is for conversation. But, at least in my experience, Twitter isn’t that effective a tool for two-way communication let alone for conversations involving more than two people. So, if we’re to use conversation to evaluate Twitter (as its development may suggest we should do), it seems not to be that successful.

In this blog version of my list, I added a header with a mention of the “Tweet Police.” I mean it in the way that people talk about the “Fashion Police,” which immediately makes me think about “fashion victims,” the beauty myth, the objectification of the human body, the social pressure to conform to some almost-arbitrary canons, the power struggles between those who decide what’s fashionable and those who need to dress fashionably to be accepted in some social contexts, etc. Basically, it leads to rather unpleasant thoughts. In a way, my mention of the “Tweet Police” is a strategy to “fight this demon” by showing how absurd it may become. Sure, it’d be a very tricky strategy if it were about getting everyone to just “get the message.” But, in this case, it’s about doing something which feels good. It’s my birthday, so I allow myself to do this.


Jazz and Identity: Comment on Lydon’s Iyer Interview

Radio Open Source » Blog Archive » Vijay Iyer’s Life in Music: “Striving is the Back Story…”.

Sounds like it will be a while before the United States becomes a truly post-racial society.

Iyer can define himself as American and he can even one-up other US citizens in Americanness, but he’s still defined by his having “a Brahmin Indian name and heritage, and a Yale degree in physics.”

Something by which I was taken aback, at IU Bloomington ten years ago, is the fact that those who were considered to be “of color” (as if colour were the factor!) were expected to mostly talk about their “race” whereas those who were considered “white” were expected to remain silent when notions of “race” and ethnicity came up for discussion. Granted, ethnicity and “race” were frequently discussed, so it was possible to hear the voices of those “of color” on a semi-regular basis. Still, part of my culture shock while living in the Midwest was the conspicuous silence of students with brilliant ideas who happened to be considered African-American.

Something similar happened with gender, on occasion, in that women were strongly encouraged to speak out…when a gender angle was needed. Thankfully, some of these women (at least, among those whose “racial” identity was perceived as neutral) did speak up, regardless of topic. But there was still an expectation that when they did, their perspective was intimately gendered.

Of course, some gender lines were blurred: the gender ratio among faculty members was relatively balanced (probably more women than men), the chair of the department was a woman for a time, and one department secretary was a man. But women’s behaviours were frequently interpreted in a gender-specific way, while men were often treated as almost genderless. Male privilege manifested itself in the fact that it was apparently difficult for women not to be gender-conscious.

Those of us who were “international students” had the possibility to decide when our identities were germane to the discussion. At least, I was able to push my «différence» when I so pleased, often by becoming the token Francophone in discussions about Francophone scholars, yet being able not to play the “Frenchie card” when I didn’t find it necessary. At the same time, my behaviour may have been deemed brash and a fellow student teased me by calling me “Mr. Snottyhead.” As an instructor later told me, “it’s just that, since you’re Canadian, we didn’t expect you to be so different.” (My response: “I know some Canadians who would despise that comment. But since I’m Québécois, it doesn’t matter.”) This was in reference to a seminar with twenty students, including seven “internationals”: one Zimbabwean, one Swiss-German, two Koreans, one Japanese, one Kenyan, and one “Québécois of Swiss heritage.” In this same graduate seminar, the instructor expected everyone to know of Johnny Appleseed and of John Denver.

Again, a culture shock. Especially for someone coming from a context in which the ethnic identity of the majority is frequently discussed and in which cultural identity is often “achieved” instead of being ascribed. This isn’t to say that Quebec society is devoid of similar issues. Everybody knows, Quebec has more than its fair share of identity-based problems. The fact of the matter is, Quebec society is entangled in all sorts of complex identity issues, and for many of those, Quebec may appear underprepared. The point is precisely that, in Quebec, identity politics is a matter for everyone. Nobody has the luxury to treat their identity as “neutral.”

Going back to Iyer… It’s remarkable that his thoughtful comments on Jazz end up associated more with his background than with his overall approach. As if what he had to say were of a different kind than those from Roy Haynes or Robin Kelley. As if Iyer had more in common with Koo Nimo than with, say, Sonny Rollins. Given Lydon’s journalistic background, it’s probably significant that the Iyer conversation carried the “Life in Music” name of the show’s music biography series yet got “filed under” the show’s “Year of India” series. I kid you not.

And this is what we hear at the end of each episode’s intro:

This is Open Source, from the Watson Institute at Brown University. An American conversation with Global attitude, we call it.

Guess the “American” part was taken by Jazz itself, so Iyer was assigned the “Global” one. Kind of wishing the roles were reversed, though Iyer had rehearsed his part.

But enough symbolic interactionism. For now.

During Lydon’s interview with Iyer, I kept being reminded of a conversation (in Brookline) with fellow Canadian-ethnomusicologist-and-Jazz-musician Tanya Kalmanovitch. Kalmanovitch had fantastic insight to share on identity politics at play through the international (yet not post-national) Jazz scene. In fact, methinks she’d make a great Open Source guest. She lives in Brooklyn but works as assistant chair of contemporary improv at NEC, in B-Town, so Lydon could probably meet her locally.

Anyhoo…

In some ways, Jazz is more racialized and ethnicized now than it was when Howie Becker published Outsiders. (Hey, I did hint symbolic interactionism’d be back!) It’s also very national, gendered, compartmentalized… In a word: modern. Of course, Jazz (or something like it) shall play a role in postmodernity. But only if it sheds its modernist trappings. We should hear out Kevin Mahogany’s (swung) comments about a popular misconception:

Some cats work from nine to five
Change their life for line of jive
Never had foresight to see
Where the changes had to be
Thought that they had heard the word
Thought it all died after Bird
But we’re still swingin’

The following anecdote seems à propos.

Branford Marsalis quartet on stage outside at the Indy Jazz Fest 1999. Some dude in the audience started heckling the band: “Play something we know!” Marsalis, not losing his cool, engaged the heckler in a conversation on Jazz history, pushing the envelope, playing the way you want to play, and expected behaviour during shows. Though the audience sounded divided when Marsalis advised the heckler to go to Chaka Khan’s show on the next stage over, if that was more to the heckler’s liking, there wasn’t a major shift in the crowd and, hopefully, most people understood how respectful Marsalis’s comments really were. What was especially precious was when Marsalis asked the heckler: “We’re cool, man?”

It’s nothing personal.


In Phase

Lissajous curve

Something which happens to me on a rather regular basis (and about which I blogged before) is that I’ll hear about something right after thinking about it. For instance, if I think about the fact that a given tool should exist, it may be announced right at that moment.

Hey, I was just thinking about this!

The effect is a bit strange but it’s quite easy to explain. It feels like a “premonition,” but it probably has more to do with “being in phase.” In some cases, it may also be that I heard about that something but hadn’t registered the information. I know it happens a lot and it might not be too hard to trace back. But I prefer thinking about phase.

And, yes, I am thinking about phase difference in waves. Not in a very precise sense, but the image still works, for me. Especially with the Lissajous representation, as above.

See, I don’t particularly want to be “ahead of the curve” and I don’t particularly mind being “behind the curve.” But when I’m right “in the curve,” something interesting happens. I’m “in the now.”

I originally thought about being “in tune” and it could also be about “in sync” or even “matching impedances.” But I still like the waves analogy. Especially since, when two waves are in phase, they reinforce one another. As analogies go, it’s not only a beautiful one, but a powerful one. And, yes, I do think about my sweetheart.
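That reinforcement is easy to verify numerically. Here’s a minimal sketch (plain Python, with illustrative values only) of two identical sine waves superposed at different phase offsets:

```python
import math

# Superpose two unit-amplitude sine waves separated by a phase offset.
def superpose(phase_offset, t):
    return math.sin(t) + math.sin(t + phase_offset)

t = math.pi / 2  # a sample point where sin(t) peaks at 1

# In phase (offset 0): the waves reinforce, doubling the amplitude.
print(superpose(0.0, t))      # 2.0

# Out of phase (offset pi): the waves cancel.
print(superpose(math.pi, t))  # ~0.0 (within floating-point error)
```

Nothing fancy: just the textbook behaviour, but it’s the property that makes the analogy appealing.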

One reason I like the concept of phase difference is that I think through sound. My first exposure to the concept comes from courses in musical acoustics, almost twenty years ago. It wasn’t the main thing I’d remember from the course and it’s not something I investigated at any point since. Like I keep telling students, some things hit you long after you’ve heard about them in a course. Lifelong learning and “landminds” are based on such elements, even tiny unimportant ones. Phase difference is one such thing.

And it’s no big deal, of course. It’s not like I spent days thinking about these concepts. But I’ve been feeling like writing, lately, and this is as good an opportunity as any.

The trigger for this particular thing is rather silly and is probably explained more accurately, come to think of it, by “unconsciously registering” something before consciously registering it.

Was having breakfast and started thinking about the importance of being environmentally responsible, the paradox of “consumption as freedom,” the consequences of some lifestyle choices including carfree living, etc. This stream of thought led me, not unexpectedly, to the perspectives on climate change, people’s perception of scientific evidence, and the so-called ClimateGate. I care a lot about critical thinking, regardless of whether or not I agree with a certain idea, so I think the email controversy shows the importance of transparency. So far, nothing unexpected. Within a couple of minutes, I had covered a few of the subjects du jour. And that’s what struck me, because right then, I (over)heard a radio host introduce a guest whose talk is titled:

What is the role of climate scientists in the climate change debate?

Obviously, Tremblay addressed ClimateGate quite directly. So my thoughts were “in phase” with Tremblay’s.

A few minutes prior to (over)hearing this introduction, I (over)heard a comment about topics of social conversations at different points in recent history. According to screenwriter Fabienne Larouche, issues covered in the first seasons of her “flagship” tv series are still at the forefront in Quebec society today, fourteen years later. So I was probably even more “in tune” with the notion of being “in phase.” Especially with my society.

I said “(over)heard” because I wasn’t really listening to that radio show. It was just playing in the background and I wasn’t paying much attention. I don’t tend to listen to live radio but I do listen to some radio recordings as podcasts. One reason I like doing so is that I can pay much closer attention to what I hear. Another is that I can listen to what I want when I feel like listening to it, which means that I can prepare myself for a heady topic or choose some tech-fluff to wind down after a course. There’s also the serendipity of listening to very disparate programmes in the same listening session, as if I were “turning the dial” after each show on a worldwide radio (I often switch between French and English and/or between European and North American sources). For a while now, I’ve been listening to podcasts at double-speed, which helps me focus on what’s most significant.

(In Jazz, we talk about “top notes,” meaning the ones which are more prominent. It’s easier to focus on them at double-speed than at normal speed so “double-times” have an interesting cognitive effect.)

So, I felt “in phase.” As mentioned, it probably has much more to do with having passively heard things without paying attention yet letting it “seep into my brain” to create connections between a few subjects which get me to the same point as what comes later. A large part of this is well-known in psychology, especially in terms of cognition. We start noticing things when they enter into a schema we have in our mind. These things we start noticing were there all along so the “discovery” is only in our mind (in the sense that it wouldn’t be a discovery for others). When we learn a new word, for instance, we start hearing it everywhere.

But there are also words which start being used by everyone because they have been diffused largely at a given point in time. An actual neologism can travel quickly and a word in our passive vocabulary can also come to prominence, especially in mainstream media. Clearly, this is an issue of interest to psychologists, folklorists, and media analysts alike. I’m enough of a folklorist and media observer to think about the social processes behind the diffusion of terms regardless of what psychologists think.

A few months back, I got the impression that the word “nimble” had suddenly increased in currency after it was used in a speech by the current PotUS. Since I’m a non-native speaker of English, I’m likely to be accused of noticing the word because it’s part of my own passive vocabulary. I have examples in French, though some are with words which were new to me, at the time («peoplisation», «battante»…). I probably won’t be able to defend myself from those who say that it’s just a matter of my own exposure to those terms. Though there are ways to analyze the currency of a given term, I’m not sure I trust this type of analysis a lot more than my gut feeling, at least in terms of realtime trends.

Which makes me think of “memetics.” Not in the strict sense that Dawkins would like us to use. But in the way popular culture cares about the propagation of “units of thought.” I recently read a fascinating blogpost (in French) about memetics from this perspective, playing Dawkins against himself. As coincidences keep happening (or, more accurately, as I’m acutely tuned to find coincidences everywhere), I’ve been having a discussion about Mahir‘s personal homepage (aka “I kiss you”); Mahir became an “Internet celebrity” through this process which is now called memetic. The reason his page was noticed isn’t that it was so unique. But it had this je ne sais quoi which captured the imagination, at the time (the latter part of the “Dot-Com Bubble”). As some literary critics and many other humanists teach us, it’s not the item itself which counts, it’s how we receive it (yes, I tend to be on the “reception” and “eye of the beholder” side of things). Mahir was striking because he was, indeed, “out of phase” with the times.

As I think about phase, I keep hearing the other acoustic analogy: the tuning of sine waves. When a sine wave is very slightly “out of tune” with another, we hear a very slow oscillation (interference beats) until they produce resonance. There’s a direct relationship between beat tones and phase, but I think “in tune” and “in phase” remain separate analogies.
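The beat phenomenon follows from a trig identity: sin A + sin B = 2 sin((A+B)/2) cos((A−B)/2). Two sines at frequencies f1 and f2 sum to a fast oscillation at the mean frequency inside a slow envelope, heard as beats at |f1 − f2| per second. A quick sketch (the frequencies are just illustrative):

```python
import math

f1, f2 = 440.0, 442.0  # e.g. a tuning fork against a slightly sharp A440

def two_sines(t):
    """Direct sum of the two tones at time t (seconds)."""
    return math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)

def envelope_form(t):
    """Same signal, rewritten as fast carrier times slow envelope."""
    carrier = math.sin(math.pi * (f1 + f2) * t)       # mean frequency
    envelope = 2 * math.cos(math.pi * (f1 - f2) * t)  # slow beats
    return carrier * envelope

# The two forms agree at any instant...
assert abs(two_sines(0.123) - envelope_form(0.123)) < 1e-9
# ...and the audible beat rate is the difference of the frequencies.
print("beats per second:", abs(f1 - f2))  # 2.0
```

Which is why “in tune” (frequencies matching, beats slowing to zero) really is a different analogy from “in phase” (offsets matching at the same frequency).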

One reason I like to think about waves for these analogies is that I tend to perceive temporal change through these concepts. If we think of historical change through cycles, being “in phase” is a matter of matching two change processes until they’re aligned but the cycles may be in harmonic relationships. One can move twice as fast as society and still be “in phase” with it.

Sure, I’m overextending the analogies, and there’s something far-fetched about this. But that’s pretty much what I like about analogical thinking. As I’m under the weather, this kind of rambling is almost therapeutic.


No Office Export in Keynote/Numbers for iPad?

To be honest, I’m getting even more excited about the iPad. Not that we get that much more info about it, but:

For one thing, the Pages for iPad webpage is explicitly stating Word support:

Attach them to an email as Pages files for Mac, Microsoft Word files, or PDF documents.

Maybe this is because Steve Jobs himself promised it to Walt Mossberg?
Thing is, the equivalent pages about Keynote for iPad and about Numbers for iPad aren’t so explicit:

The presentations you create in Keynote on your iPad can be exported as Keynote files for Mac or PDF documents

and…

To share your work, export your spreadsheet as a Numbers file for Mac or PDF document

Not a huge issue, but it seems strange that Apple would have such an “export to Microsoft Office” feature on only one of the three “iWork for iPad” apps. Now, the differences in the way exports are described may not mean that Keynote won’t be able to export to Microsoft PowerPoint or that Numbers won’t be able to export to Microsoft Excel. After all, these texts may have been written at different times. But it does sound like PowerPoint and Excel will be import-only, on the iPad.

Which, again, may not be that big an issue. Maybe iWork.com will work well enough for people’s needs. And some other cloud-based tools do support Keynote. (Though Google Docs and Zoho Show don’t.)

The reason I care is simple: I do share most of my presentation files. Either to students (as resources on Moodle) or to the whole wide world (through Slideshare). My desktop outliner of choice, OmniOutliner, exports to Keynote and Microsoft Word. My ideal workflow would be to send, in parallel, presentation files to Keynote for display while on stage and to PowerPoint for sharing. The Word version could also be useful for sharing.

Speaking of presenting “slides” on stage, I’m also hoping that the “iPad Dock Connector to VGA Adapter” will support “presenter mode” at some point (though it doesn’t seem to be the case, right now). I also dream of a way to control an iPad presentation with some kind of remote. In fact, it’s not too hard to imagine it as an iPod touch app (maybe made by Appiction, down in ATX).

To be clear: my “presentation files” aren’t really about presenting so much as they are a way to package and organize items. Yes, I use bullet points. No, I don’t try to make the presentation sexy. My presentation files are acting like cue cards and like whiteboard snapshots. During a class, I use the “slides” as a way to keep track of where I planned the discussion to go. I can skip around, but it’s easier for me to get at least some students focused on what’s important (the actual depth of the discussion) because they know the structure (as “slides”) will be available online. Since I also podcast my lectures, it means that they can go back to all the material.

I also use “slides” to capture things we build in class, such as lists of themes from the readings or potential exam questions. Again, the “whiteboard” idea. I don’t typically do the same thing during a one-time talk (say, at an unconference). But I still want to share my “slides,” at some point.

So, in all of these situations, I need a file format for “slides.” I really wish there were a format which could work directly out of the browser and could be converted back and forth with other formats (especially Keynote, OpenOffice, and PowerPoint). I don’t need anything fancy. I don’t even care about transitions, animations, or even inserting pictures. But, despite some friends’ attempts at making me use open solutions, I end up having to use presentation files.

Unfortunately, at this point, PowerPoint is the de facto standard for presentation files. So I need it, somehow. Not that I really need PowerPoint itself. But it’s still the only format I can use to share “slides.”

So, if Keynote for iPad doesn’t export directly to PowerPoint, it means that I’ll have to find another way to make my workflow fit.

Ah, well…


I Hate Books

In a way, this is a followup to a discussion happening on Facebook after something I posted (available publicly on Twitter): “(Alexandre) wishes physical books a quick and painfree death. / aime la connaissance.”

As I expected, the reactions I received were from friends who are aghast: how dare I dismiss physical books? Have I no shame?

Apparently, no, not in this case.

And while I posted it as a quip, it’s the result of a rather long reflection. It’s not that I’m suddenly anti-books. It’s that I stopped buying several of the “pro-book” arguments a while ago.

Sure, sure. Books are the textbook case of technology which needs no improvement. eBooks can’t replace the experience of doing this or that with a book. But that’s what folkloristics defines as a functional shift. Like woven baskets which became objects of nostalgia, books are being maintained as the model for a very specific attitude toward knowledge construction based on monolithic authored texts vetted by gatekeepers and sold as access to information.

An important point, here, is that I’m not really thinking about fiction. I used to read two novel-length works a week (collections of short stories, plays…), for a period of about 10 years (ages 13 to 23). So, during that period, I probably read about 1,000 novels, ranging from Proust’s Recherche to Baricco’s Novecento and the five books of Rabelais’s Pantagruel series. This was after having read a fair deal of adolescent and young adult fiction. By today’s standards, I might be considered fairly well-read.

My life has changed a lot, since that time. I didn’t exactly stop reading fiction but my move through graduate school eventually shifted my reading time from fiction to academic texts. And I started writing more and more, online and offline.
At the same time, the Web was also making me shift from pointed longform texts to copious amounts of shortform text. Much more polyvocal than what Bakhtin himself would have imagined.

(I’ve also been shifting from French to English, during that time. But that’s almost another story. Or it’s another part of the story which can remain in the backdrop without being addressed directly at this point. Ask, if you’re curious.)
The increase in my writing activity is, itself, a shift in the way I think, act, talk… and get feedback. See, the fact that I talk and write a lot, in a variety of circumstances, also means that I get a lot of people to play along. There’s still a risk of groupthink, in specific contexts, but one couldn’t say I keep getting things from the same perspective. In fact, the very Facebook conversation which sparked this blogpost is an example, as the people responding there come from relatively distant backgrounds (though there are similarities) and were not specifically queried about this. Their reactions have a very specific value, to me. Sure, it comes in the form of writing. But it’s giving me even more of something I used to find in writing: insight. The stuff you can’t get through Google.

So, back to books.

I dislike physical books. I wish I didn’t have to use them to read what I want to read. I do have a much easier time with short reading sessions on a computer screen than with what would turn into rather long periods of time holding a book in my hands.

Physical books just don’t do it for me, anymore. The printing press is, like, soooo 1454!

Yes, books had “a good run.” No, nothing replaces them. That’s not the way it works. Movies didn’t replace theater, television didn’t replace radio, automobiles didn’t replace horses, photographs didn’t replace paintings, books didn’t replace orality. In fact, the technology itself doesn’t do much by itself. But social contexts recontextualize tools. If we take technology to be the set of both tools and the knowledge surrounding it, technology mostly goes through social processes, since tool repertoires and corresponding knowledge mostly shift in social contexts, not in their mere existence. Gutenberg’s Bible was a “game-changer” for social, as well as technical reasons.

And I do insist on orality. Journalists and other “communication is transmission of information” followers of Shannon & Weaver tend to portray writing as the annihilation of orality. How long after the invention of writing did Homer transfer an oral tradition to the written medium? Didn’t Albert Lord show the vitality of the epic well into the 20th Century? Isn’t a lot of our knowledge constructed through oral means? Is Internet writing that far, conceptually, from orality? Is literacy a simple on/off switch?

Not only did I maintain an interest in orality through the most book-focused moments of my life but I probably care more about orality now than I ever did. So I simply cannot accept the idea that books have simply replaced the human voice. It doesn’t add up.

My guess is that books won’t simply disappear either. There should still be a use for “coffee table books” and books as gifts or collectables. Records haven’t disappeared completely and CDs still have a few more days in dedicated stores. But, in general, we’re moving away from the “support medium” for “content” and more toward actual knowledge management in socially significant contexts.

In these contexts, books often make little sense. Reading books is passive, while these contexts are (hyper- and inter-)active.

Case in point (and the reason I felt compelled to post that Facebook/Twitter quip)…
I hear about a “just released” French book during a Swiss podcast. Of course, it’s taken a while to write and publish. So, by the time I heard about it, there was no way to participate in the construction of knowledge which led to it. It was already “set in stone” as an “opus.”

Looked for it at diverse bookstores. One bookstore could eventually order it. It’d take weeks and be quite costly (for something I’m mostly curious about, not depending on for something really important).

I eventually find it in the catalogue at BANQ. I reserve it. It wasn’t on the shelves, yet, so I had to wait until it was. It took from November to February. I eventually get a message that I have a couple of days to pick up my reservation but I wasn’t able to go. So it went back on the “just released” shelves. I had the full call number but books in that section aren’t in their call number sequence. I spent several minutes looking back and forth between eight shelves to eventually find out that there were four more shelves in the “humanities and social sciences” section. The book I was looking for was on one of those shelves.

So, I was able to borrow it.

Phew!

In the metro, I browse through it. Given my academic reflex, I look for the back matter first. No bibliography, no index, a ToC with rather obscure titles (at random: «Taylor toujours à l’œuvre»/”Taylor still at work,” which I’m assuming to be a reference to continuing taylorism). The book is written by two separate dudes but there’s no clear indication of who wrote what. There’s a preface (by somebody else) but no “acknowledgments” section, so it’s hard to see who’s in their network. Footnotes include full URLs to rather broad sites as well as “discussion with <an author’s name>.” The back cover starts off with references to French popular culture (including something about “RER D,” which would be difficult to search). Information about both authors fits in less than 40 words (including a list of publication titles).

The book itself is fairly large print and weighs almost a pound (422g, to be exact) for 327 pages (including front and back matter). Each page seems to hold about 50 characters per line, about 30 lines per page. So, about half a million characters or 3500 tweets (including spaces). At 5+1 characters per word, about 80,000 words (I have a 7,500-word blogpost, written in an afternoon). At about 250 words per minute, about five hours of reading. This book is listed at 19€ (about 27CAD).
There’s no direct way to do any “postprocessing” with the text: no speech synthesis for the visually impaired, no concordance analysis, no machine translation; even a simple search for occurrences of “Sarkozy” is impossible. Not to mention sharing quotes with students or annotating in an easy-to-retrieve fashion (à la Diigo).
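The back-of-envelope figures above are easy to reproduce, using the same assumptions (50 characters per line, 30 lines per page, 327 pages, 140-character tweets, 5+1 characters per word, 250 words per minute):

```python
chars = 50 * 30 * 327    # characters per page times page count
tweets = chars // 140    # tweet-sized chunks, spaces included
words = chars // 6       # 5 letters plus 1 space per word
hours = words / 250 / 60 # at 250 words per minute

print(chars)   # 490500: about half a million characters
print(tweets)  # 3503: about 3,500 tweets
print(words)   # 81750: about 80,000 words
print(hours)   # 5.45: about five hours of reading
```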

Like any book, it’s impossible to read in the dark and I actually have a hard time finding a spot where I can read with appropriate lighting.

Flipping through the book, I get the impression that there are some valuable things to spark discussions, but there’s also a whole lot of redundancy with frequent discussions on the topic (the Future of Journalism, or #FoJ, as a matter of fact). My guesstimate is that, out of 5 hours of reading, I’d get at most 20 pieces of insight that I’d have exactly no way to find elsewhere. Comparable books to which I listened as audiobooks, recently, had much less. In other words, I’d have at most 20 tweets worth of things to say from the book. Almost a 200:1 compression.
Direct discussion with the authors could produce much more insight. The radio interviews with these authors already contained a few insight hints, which predisposed me to look for more. But, so many months later, without the streams of thought which animated me at the time, I end up with something much less valuable than what I wanted to get, back in November.

Bottomline: Books aren’t necessarily “broken” as a tool. They just don’t fit my life, anymore.


Why I Need an iPad

I’m one of those who feel the iPad is the right tool for the job.

This is mostly meant as a reply to this blogthread. But it’s also more generally about my personal reaction to Apple’s iPad announcement.

Some background.

I’m an ethnographer and a teacher. I read a fair deal, write a lot of notes, and work in a variety of contexts. These days, I tend to spend a good amount of time in cafés and other public places where I like to work without being too isolated. I also commute using public transit, listen to lots of podcasts, and create my own. I’m also very aural.

I’ve used a number of PDAs, over the years, from a Newton MessagePad 130 (1997) to a variety of PalmOS devices (until 2008). In fact, some people readily associated me with PDA use.

As soon as I learnt about the iPod touch, I needed one. As soon as I heard about the SafariPad, I wanted one. I’ve been an intense ‘touch user since the iPhone OS 2.0 release and I’m a happy camper.

(A major reason I never bought an iPhone, apart from price, is that it requires a contract.)

In my experience, the ‘touch is the most appropriate device for all sorts of activities which are either part of another activity (reading during a commute) or are simply too short in duration to constitute an actual “computer session.” You don’t “sit down to work at your ‘touch” the way you might sit in front of a laptop or desktop screen. This works great for “looking up stuff” or “checking email.” It also makes a lot of sense during commutes in crowded buses or metros.

In those cases, the iPod touch is almost ideal. Ubiquitous access to Internet would be nice, but that’s not a deal-breaker. Alternative text-input methods would help in some cases, but I do end up being about as fast on my ‘touch as I was with Graffiti on PalmOS.

For other tasks, I have a Mac mini. Sure, it’s limited. But it does the job. In fact, I have no intention of switching for another desktop and I even have an eMachines collecting dust (it’s too noisy to make a good server).

What I miss, though, is a laptop. I used an iBook G3 for several years and loved it. A little while later, I was able to share a MacBook with somebody else and it was a wonderful experience. I even got to play with the OLPC XO for a few weeks. That one was not so pleasant an experience but it did give me a taste for netbooks. And it made me think about other types of iPhone-like devices. Especially in educational contexts. (As I mentioned, I’m a teacher.)

I’ve been laptop-less for a while, now. And though my ‘touch replaces it in many contexts, there are still times when I’d really need a laptop. And these have to do with what I might call “mobile sessions.”

For instance: liveblogging a conference or meeting. I’ve used my ‘touch for this very purpose on a good number of occasions. But it gets rather uncomfortable, after a while, and it’s not very fast. A laptop is better for this, with a keyboard and a larger form factor. But the iPad will be even better because of lower risks of RSI. A related example: just imagine TweetDeck on iPad.

Possibly my favourite example of a context in which the iPad will be ideal: presentations. Even before learning about the prospect of getting iWork on a tablet, presentations were a context in which I really missed a laptop.

Sure, in most cases, these days, there’s a computer (usually a desktop running XP) hooked to a projector. You just need to download your presentation file from Slideshare, show it from Prezi, or transfer it through USB. No biggie.

But it’s not the extra steps which change everything. It’s the uncertainty. Even if it’s often unfounded, I usually get worried that something might just not work, along the way. The slides might not show up the way you intended because something is missing on that computer, or that computer is simply running a different version of the presentation software. In fact, that software is typically Microsoft PowerPoint which, while convenient, fits much less in my workflow than does Apple Keynote.

The other big thing about presentations is the “presenter mode,” allowing you to get more content than (or different content from) what the audience sees. In most contexts where I’ve used someone else’s computer to do a presentation, the projector was mirroring the computer’s screen, not using it as a different space. PowerPoint has this convenient “presenter view” but very rarely did I see it as an available option on “the computer in the room.” I wish I could use my ‘touch to drive presentations, which I could do if I installed software on that “computer in the room.” But it’s not something that is likely to happen, in most cases.

A MacBook solves all of these problems, and it’s an obvious use for laptops. But how, then, is the iPad better? Basically because of interface. Switching slides on a laptop isn’t hard, but it’s more awkward than we realize. Even before watching the demo of Keynote on the iPad, I could simply imagine the actual pleasure of flipping through slides using a touch interface. The fit is “natural.”

I sincerely think that Keynote on the iPad will change a number of things, for me. Including the way I teach.

Then, there’s reading.

Now, I’m not one of those people who just can’t read on a computer screen. In fact, I even grade assignments directly from the screen. But I must admit that online reading hasn’t been ideal, for me. I’ve read full books as PDF files or dedicated formats on PalmOS, but it wasn’t so much fun, in terms of the reading process. And I’ve used my ‘touch to read things through Stanza or ReadItLater. But it doesn’t work so well for longer reading sessions. Even in terms of holding the ‘touch, it’s not so obvious. And, what’s funny, even a laptop isn’t that ideal, for me, as a reading device. In a sense, this is when the keyboard “gets in the way.”

Sure, I could get a Kindle. I’m not a big fan of dedicated devices and, at least on paper, I find the Kindle a bit limited for my needs. Especially in terms of sources. I’d like to be able to use documents in a variety of formats and put them in a reading list, for extended reading sessions. No, not “curled up in bed.” But maybe lying down in a sofa without external lighting. Given my experience with the ‘touch, the iPad is very likely the ideal device for this.

Then, there’s the overall “multi-touch device” thing. People have already been quite creative with the small touchscreen on iPhones and ‘touches, and I can just imagine what may be done with a larger screen. Lots has been said about differences in “screen real estate” in laptop or desktop screens. We all know it can make a big difference in terms of what you can display at the same time. In some cases, two screens aren’t even a luxury, for instance when you code and display a page at the same time (LaTeX, CSS…). Certainly, the same qualitative difference applies to multitouch devices. Probably even more so, since the display is also used for input. What Han found missing in the iPhone’s multitouch was the ability to use both hands. With the iPad, Han’s vision is finding its space.

Oh, sure, the iPad is very restricted. For instance, it’s easy to imagine how much more useful it’d be if it did support multitasking with third-party apps. And a front-facing camera is something I was expecting in the first iPhone. It would just make so much sense that a friend seems very disappointed by this lack of videoconferencing potential. But we’re probably talking about predetermined expectations, here. We’re comparing the iPad with something we had in mind.

Then, there’s the issue of the competition. Tablets have been released and some multitouch tablets have recently been announced. What makes the iPad better than these? Well, we could all get into the same OS wars as have been happening with laptops and desktops. In my case, the investment in applications, files, and expertise that I have made in a Mac ecosystem rendered my XP years relatively uncomfortable and made me appreciate returning to the Mac. My iPod touch fits right in that context. Oh, sure, I could use it with a Windows machine, which is in fact what I did for the first several months. But the relationship between the iPhone OS and Mac OS X is such that using devices in those two systems is much more efficient, in terms of my own workflow, than I could get while using XP and iPhone OS. There are some technical dimensions to this, such as the integration between iCal and the iPhone OS Calendar, or even the filesystem. But I’m actually thinking more about the cognitive dimensions of recognizing some of the same interface elements. “Look and feel” isn’t just about shiny and “purty.” It’s about interactions between a human brain, a complex sensorimotor apparatus, and a machine. Things go more quickly when you don’t have to think too much about where some tools are, as you’re working.

So my reasons for wanting an iPad aren’t about being dazzled by a revolutionary device. They are about the right tool for the job.


Judging Coffee and Beer: Answer to DoubleShot Coffee Company

DoubleShot Coffee Company: More Espresso Arguments.

I’m not in the coffee biz but I do involve myself in some coffee-related things, including barista championships (sensory judge at regional and national) and numerous discussions with coffee artisans. In other words, I’m nobody important.

In a way, I “come from” the worlds of beer and coffee homebrewing. In coffee circles, I like to introduce myself as a homeroaster and blogger.

(I’m mostly an ethnographer, meaning that I do what we call “participant-observation” as both an insider and an outsider.)

There seem to be several disconnects in today’s coffee world, despite a lot of communication across the Globe. Between the huge coffee corporations and the “specialty coffee” crowd. Between coffee growers and coffee lovers. Between professional and home baristas. Even, sometimes, between baristas from different parts of the world.
None of it is very surprising. But it’s sometimes a bit sad to hear people talk past one another.

I realize nothing I say may really help. And it may all be misinterpreted. That’s all part of the way things go and I accept that.

In the world of barista champions and the so-called “Third Wave,” emotions seem particularly high. Part of it might have to do with the fact that so many people interact on a rather regular basis. Makes for a very interesting craft, in some ways. But also for rather tense moments.

About judging…
My experience isn’t that extensive. I’ve judged at the Canadian Eastern Regional BC twice and at the Canadian BC once.
Still, I did notice a few things.

One is that there can be a lot of camaraderie/collegiality among BC participants. This can have a lot of beneficial effects on the quality of coffee served in different places as well as on the quality of the café experience itself, long after the championships. A certain cohesiveness which may come from friendly competition can do a lot for the diversity of coffee scenes.

Another thing I’ve noticed is that it’s really easy to be fair when judging under WBC regulations. It’s subjective in a very literal way since there’s tasting involved (tastebuds belong to the “subjects” of the sensory and head judges). But it simply has very little if anything to do with personal opinions, relationships, or “liking the person.” It’s remarkably easy to judge the performance, with a focus on what’s in the cup, as opposed to the person her-/himself or her/his values.

Sure, the championship setting is in many ways artificial and arbitrary. A little bit like rules for an organized sport. Or so many other contexts.

A competition like this has fairly little to do with what is likely to happen in “The Real World” (i.e., in a café). I might even say that applying a WBC-compatible approach in a café is likely to become a problem in many cases. A bit like working the lunch shift at a busy diner using ideas from the Iron Chef or getting into a street fight and using strict judo rules.

A while ago, I was working in French restaurants, as a «garde-manger» (assistant-chef). We often talked about (and I did meet a few) people who were just coming out of culinary institutes. In most cases, they were quite good at producing a good dish in true French cuisine style. But the consensus was that “they didn’t know how to work.”
People fresh out of culinary school didn’t really know how to handle a chaotic kitchen, order only the supplies required, pay attention to people’s tastes, adapt to differences in prices, etc. They could put up a good show and their dishes might have been exquisite. But they could also be overwhelmed with having to serve 60 customers in a regular shift or, indeed, not know what to do during a slow night. Restaurant owners weren’t that fond of hiring them, right away. They had to be “broken in” («rodés»).

Barista championships remind me of culinary institutes, in this way. Both can be useful in terms of skills, but experience is more diverse than that.

So, yes, WBC rules are probably artificial and arbitrary. But it’s easy to be remarkably consistent in applying these rules. And that should count for something. Just not for everything.

Sure, you may get some differences between one judge and the other. But those differences aren’t that difficult to understand and I didn’t see that they tended to have to do with “preferences,” personal issues, or anything of the sort. From what I noticed while judging, you simply don’t pay attention to the same things as when you savour coffee. And that’s fine. Cupping coffee isn’t the same thing as drinking it, either.

In my (admittedly very limited) judging experience, emphasis was put on providing useful feedback. The points matter a lot, of course, but the main thing is that the points make sense in view of the comments. In a way, it’s to ensure calibration (“you say ‘excellent’ but put a ‘3,’ which one is more accurate?”) but it’s also about the goals of the judging process. The textual comments are a way to help the barista pay attention to certain things. “Constructive criticism” is one way to put it. But it’s more than that. It’s a way to get something started.

Several of the competitors I’ve seen do come to ask judges for clarifications and many of them seemed open to discussion. A few mostly wanted justification and may have felt slighted. But I mostly noticed a rather thoughtful process of debriefing.

Having said that, there are competitors who are surprised by differences between two judges’ scores. “But both shots came from the same portafilter!” “Well, yes, but if you look at the video, you’ll notice that coffee didn’t flow the same way in both cups.” There are also those who simply doubt judges, no matter what. Wonder if they respect people who drink their espresso…

Coming from the beer world, I also notice differences with beer. In the beer world, there isn’t really an equivalent to the WBC in the sense that professional beer brewers don’t typically have competitions. But amateur homebrewers do. And it’s much stricter than the WBC in terms of certification. It requires a lot of rote memorization, difficult exams (I helped proctor two), judging points, etc.

I’ve been a vocal critic of the Beer Judge Certification Program. There seems to be an idea, there, that you can make the process completely neutral and that the knowledge necessary to judge beers is solid and well-established. One problem is that this certification program focuses too much on a series of (over a hundred) “styles” which are more of a context-specific interpretation of beer diversity than a straightforward classification of possible beers.
Also, the one thing they want to avoid the most (basing their evaluation on taste preferences) still creeps in. It’s probably no coincidence that, at certain events, beers which were winning “Best of Show” tended to be big, assertive beers instead of very subtle ones. Beer judges don’t want to be human, but they may still end up acting like humans.

At the same time, while there’s a good deal of debate over beer competition results and such, there doesn’t seem to be exactly the same kind of tension as in barista championships. Homebrewers take their results to heart and they may yell at each other over their scores. But, somehow, I see much less of a fracture, “there” than “here.” Perhaps because the stakes are very low (it’s a hobby, not a livelihood). Perhaps because beer is so different from coffee. Or maybe because there isn’t a sense of “Us vs. Them”: brewers judging a competition often enter beer in that same competition (but in a separate category from the ones they judge).
Actually, the main difference may be that beer judges can literally only judge what’s in the bottle. They don’t observe the brewers practicing their craft (this happens weeks prior), they simply judge the product. In a specific condition. In many ways, it’s very unfair. But it can help brewers understand where something went wrong.

Now, I’m not saying the WBC should become like the BJCP. For one thing, it just wouldn’t work. And there’s already a lot of investment in the current WBC format. And I’m really not saying the BJCP is better than the WBC as an inspiration, since I actually prefer the WBC-style championships. But I sense that there’s something going on in the coffee world which has more to do with interpersonal relationships and “attitudes” than with what’s in the cup.

All this time, those of us who don’t make a living through coffee but still live it with passion may be left out. And we do our own things. We may listen to coffee podcasts, witness personal conflicts between café owners, hear rants about the state of the “industry,” and visit a variety of cafés.
Yet, slowly but surely, we’re making our own way through coffee. Exploring its diversity, experimenting with different brewing methods, interacting with diverse people involved, even taking trips “to origin”…

Coffee is what unites us.


Installing BuddyPress on a Webhost

[Jump here for more technical details.]

A few months ago, I installed BuddyPress on my Mac to try it out. It was a bit of an involved process, so I documented it:

WordPress MU, BuddyPress, and bbPress on Local Machine « Disparate.

More recently, I decided to get a webhost. Both to run some tests and, eventually, to build something useful. BuddyPress seems like a good way to go at it, especially since it’s improved a lot, in the past several months.

In fact, the installation process is much simpler, now, and I ran into some difficulties because I was following my own instructions (though adapting the process to my webhost). So a new blogpost may be in order. My previous one was very (possibly too) detailed. This one is much simpler, technically.

One thing to make clear is that BuddyPress is a set of plugins meant for WordPress µ (“WordPress MU,” “WPMU,” “WPµ”), the multi-user version of the WordPress blogging platform. BP is meant as a way to make WPµ more “social,” with such useful features as flexible profiles, user-to-user relationships, and forums (through bbPress, yet another one of those independent projects based on WordPress).

While BuddyPress depends on WPµ and does follow a blogging logic, I’m thinking about it as a social platform. Once I build it into something practical, I’ll probably use the blogging features but, in a way, it’s more of a tool to engage people in online social activities. BuddyPress probably doesn’t work as a way to “build a community” from scratch. But I think it can be quite useful as a way to engage members of an existing community, even if this engagement follows a blogger’s version of a Pareto distribution (which, hopefully, is dissociated from elitist principles).

But I digress, of course. This blogpost is more about the practical issue of adding a BuddyPress installation to a webhost.

Webhosts have come a long way, recently. Especially in terms of shared webhosting focused on LAMP (or PHP/MySQL, more specifically) for blogs and content-management. I don’t have any data on this, but it seems to me that a lot of people these days rely on third-party webhosts instead of running their own servers when they want to build on their own blogging and content-management platforms. Of course, there are a lot more people who prefer to use preexisting blog and content-management systems. For instance, it seems that there are more bloggers on WordPress.com than on other WordPress installations. And WP.com blogs probably represent a small number of people in comparison to the number of people who visit these blogs. So, in a way, those who run their own WordPress installations are a minority in the group of active WordPress bloggers which, itself, is a minority of blog visitors. Again, let’s hope this “power distribution” is not a basis for elite theory!

Yes, another digression. I did tell you to skip, if you wanted the technical details!

I became part of the “self-hosted WordPress” community through a project on which I started work during the summer. It’s a website for an academic organization and I’m acting as the organization’s “Web Guru” (no, I didn’t choose the title). The site was already based on WordPress but I was rebuilding much of it in collaboration with the then-current “Digital Content Editor.” Through this project, I got to learn a lot about WordPress, themes, PHP, CSS, etc. And it was my first experience using a cPanel- (and Fantastico-)enabled webhost (BlueHost, at the time). It’s also how I decided to install WordPress on my local machine and did some amount of work from that machine.

But the local installation wasn’t an ideal solution for two reasons: a) I had to be in front of that local machine to work on this project; and b) it was much harder to show the results to the person with whom I was collaborating.

So, in the Fall, I decided to get my own staging server. After a few quick searches, I decided on HostGator, partly because it was available on a monthly basis. Since this staging server was meant as a temporary solution, HG was close to ideal. It was easy to set up as a PayPal “subscription,” wasn’t that expensive (9$/month), had adequate support, and included everything that I needed at that point to install a current version of WordPress and play with theme files (after importing content from the original site). I’m really glad I made that decision because it made a number of things easier, including working from different computers, and sending links to get feedback.

While monthly HostGator fees were reasonable, it was still a more expensive proposition than what I had in mind for a longer-term solution. So, recently, a few weeks after releasing the new version of the organization’s website, I decided to cancel my HostGator subscription. A decision I made without any regret or bad feeling. HostGator was good to me. It’s just that I didn’t have any reason to keep that account or to do anything major with the domain name I was using on HG.

Though only a few weeks have elapsed since I cancelled that account, I didn’t immediately set out to find a new webhost; I didn’t go straight from HostGator to another host.

But having my own webhost still remained at the back of my mind as something which might be useful. For instance, while not really making a staging server necessary, a new phase in the academic website project brought up a sandboxing idea. Also, I went to a “WordPress Montreal” meeting and got to think about further WordPress development/deployment, including using BuddyPress for my own needs (both as my own project and as a way to build my own knowledge of the platform) instead of it being part of an organization’s project. I was also thinking about other interesting platforms which necessitate a webhost.

(More on these other platforms at a later point in time. Bottom line is, I’m happy with the prospects.)

So I wanted a new webhost. I set out to do some comparison shopping, as I’m wont to do. In my (allegedly limited) experience, finding the ideal webhost is particularly difficult. For one thing, search results are cluttered with a variety of “unuseful” things such as rants, advertising, and limited comparisons. And it’s actually not that easy to give a new webhost a try. For starters, these hosting companies don’t necessarily have the most liberal refund policies you could imagine. And switching a domain name between different hosts and registrars is a complicated process through which a name may remain “hostage.” Had I realized what was involved, I might have used a domain name to which I have no attachment, or actually eschewed the whole domain transition and just tried the webhost without a dedicated domain name.

Doh!
Live and learn. I sure do. Loving almost every minute of it.

At any rate, I had a relatively hard time finding my webhost.

I really didn’t need “bells and whistles.” For instance, all the AdSense, shopping cart, and other business-oriented features which seem to be publicized by most webhosting companies hold no interest for me.

I didn’t even care so much about absolute degree of reliability or speed. What I’m to do with this host is fairly basic stuff. The core idea is to use my own host to bypass some limitations. For instance, WordPress.com doesn’t allow for plugins yet most of the WordPress fun has to do with plugins.

I did want an “unlimited” host, as much as possible. Not because I expect to have huge resource needs, but because I just didn’t want to have to monitor bandwidth.

I thought that my needs would be basic enough that any cPanel-enabled webhost would fit. As much as I could see, I needed FTP access to something which had PHP 5 and MySQL 5. I expected to install things myself, without use of the webhost’s scripts but I also thought the host would have some useful scripts. Although I had already registered the domain I wanted to use (through Name.com), I thought it might be useful to have a free domain in the webhosting package. Not that domain names are expensive, it’s more of a matter of convenience in terms of payment or setup.

I ended up with FatCow. But, honestly, I’d probably go with a different host if I were to start over (which I may do with another project).

I paid 88$ for two years of “unlimited” hosting, which is quite reasonable. And, on paper, FatCow has everything I need (and a bunch of things I don’t need). The missing parts aren’t anything major but have to do with minor annoyances. In other words, no real deal-breaker, here. But there are a few things I wish I had realized before I committed to FatCow with a domain name I actually want to use.

Something which was almost a deal-breaker for me is the fact that FatCow requires payment for any additional subdomain. And these aren’t cheap: the minimum is 5$/month for five subdomains, up to 25$/month for unlimited subdomains! Even at a “regular” price of 88$/year for the basic webhosting plan, the “unlimited subdomains” feature (included in some webhosting plans elsewhere) is more than three times more expensive than the core plan.

As I don’t absolutely need extra subdomains, this is mostly a minor irritant. But it’s one reason I’ll probably be using another webhost for other projects.

Other issues with FatCow are probably not enough to motivate a switch.

For instance, the PHP version installed on FatCow (5.2.1) is a few minor releases behind the one needed by some interesting web applications. No biggie, especially if PHP is updated in a relatively reasonable timeframe. But it still makes for a slight frustration.

The MySQL version seems recent enough, but FatCow uses non-standard tools to manage it, which makes for some confusion. Attempting to create some MySQL databases with obvious names (say “wordpress”) fails because the database allegedly exists (even though it doesn’t show up in the MySQL administration). In the same vein, the URL of the MySQL server is <username>.fatcowmysql.com instead of localhost, as most installers seem to expect. Easy to handle once you realize it, but another source of confusion.
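For instance, if you ever edit wp-config.php by hand, the database settings end up looking something like this. All the names and the password here are hypothetical placeholders; only the DB_HOST convention is the FatCow-specific part:

```php
<?php
// Database settings in wp-config.php — a sketch with placeholder values.
// On FatCow, DB_HOST is the account-specific hostname rather than the
// "localhost" that most installers assume.
define('DB_NAME', 'wpmu_db');                        // hypothetical database
define('DB_USER', 'wpmu_user');                      // hypothetical user
define('DB_PASSWORD', 'easy-to-remember-password');  // hypothetical password
define('DB_HOST', 'exampleuser.fatcowmysql.com');    // not "localhost"
```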

In terms of Fantastico-like simplified installation of webapps, FatCow uses InstallCentral, which looks like it might be its own Fantastico replacement. InstallCentral is decent enough as an installation tool and FatCow does provide some of the most popular blog and CMS platforms. But, in some cases, the application version installed by FatCow is old enough (2005!) that it requires multiple upgrades to get to a current version. Compared to other installation tools, FatCow’s InstallCentral doesn’t seem really efficient at keeping track of installed and released versions.

Something which is partly a neat feature and partly a potential issue is the way FatCow handles Apache-related security. This isn’t something which is so clear to me, so I might be wrong.

Accounts on both BlueHost and HostGator include a public_html directory where all sorts of things go, especially if they’re related to publicly-accessible content. This directory serves as the website’s root, so one expects content to be available there. The “index.html” or “index.php” file in this directory serves as the website’s frontpage. It’s fairly obvious, but it does require that one understand a few things about webservers. FatCow doesn’t seem to create a public_html directory in a user’s server space. Or, more accurately, it seems that the root directory (aka ‘/’) is in fact public_html. In this sense, a user doesn’t have to think about which directory to use to share things on the Web. But it also means that some higher-level directories aren’t available. I’ve already run into some issues with this and I’ll probably be looking for a workaround. I’m assuming there’s one. But it’s sometimes easier to use generally-applicable advice than to find a custom solution.

Further, in terms of access control… It seems that webapps typically make use of diverse directories and .htaccess files to manage some forms of access controls. Unix-style file permissions are also involved but the kind of access needed for a web app is somewhat different from the “User/Group/All” of Unix filesystems. AFAICT, FatCow does support those .htaccess files. But it has its own tools for building them. That can be a neat feature, as it makes it easier, for instance, to password-protect some directories. But it could also be the source of some confusion.
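As a rough illustration, the password-protection rules a panel tool like FatCow’s generates are presumably equivalent to the standard Apache recipe below. This is a generic sketch, not FatCow’s actual output, and the paths and names are hypothetical:

```apacheconf
# .htaccess placed in the directory to protect — a generic sketch.
AuthType Basic
AuthName "Members only"
AuthUserFile /home/users/exampleuser/.htpasswd
Require valid-user
```

The matching .htpasswd file (username/hashed-password pairs) is what a panel tool typically manages for you behind the scenes.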

There are other issues I have with FatCow, but it’s probably enough for now.

So… On to the installation process… 😉

It only takes a few minutes and is rather straightforward. This is the most verbose version of that process you could imagine…

Surprised? 😎

Disclaimer: I’m mostly documenting how I did it and there are some things about which I’m unclear. So it may not work for you. If it doesn’t, I may be able to help but I provide no guarantee that I will. I’m an anthropologist, not a Web development expert.

As always, YMMV.

A few instructions here are specific to FatCow, but the general process is probably valid on other hosts.

I’m presenting things in a sequence which should make sense. I used a slightly different order myself, but I think this one should still work. (If it doesn’t, drop me a comment!)

In these instructions, straight quotes (“”) are used to isolate elements from the rest of the text. They shouldn’t be typed or pasted.

I use “example.com” to refer to the domain on which the installation is done. In my case, it’s the domain name I transferred to FatCow from another registrar but it could probably be done without a dedicated domain (in which case it would be “<username>.fatcow.com” where “<username>” is your FatCow username).

I started with creating a MySQL database for WordPress MU. FatCow does have phpMyAdmin but the default tool in the cPanel is labeled “Manage MySQL.” It’s slightly easier to use for creating new databases than phpMyAdmin because it creates the database and initial user (with confirmed password) in a single, easy-to-understand dialog box.

So I created that new database, user, and password, noting down this information. Since that password appears in clear text at some point and can easily be changed through the same interface, I used one which was easy to remember but wasn’t one I use elsewhere.
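Behind its dialog box, a tool like Manage MySQL is presumably issuing something equivalent to these standard MySQL statements. All the names and the password are hypothetical placeholders:

```sql
-- Roughly what "create database + initial user" amounts to in plain SQL.
CREATE DATABASE wpmu_db;
CREATE USER 'wpmu_user'@'%' IDENTIFIED BY 'easy-to-remember-password';
GRANT ALL PRIVILEGES ON wpmu_db.* TO 'wpmu_user'@'%';
FLUSH PRIVILEGES;
```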
Then, I downloaded the following files to my local machine in order to upload them to my FatCow server space. The upload can be done through either FTP or FatCow’s FileManager. I tend to prefer FTP (via CyberDuck on the Mac or FileZilla on PC). But the FileManager does allow for easy uploads.
(Wish it could be more direct, using the HTTP links directly instead of downloading to upload. But I haven’t found a way to do it through either FTP or the FileManager.)
At any rate, here are the four files I transferred to my FatCow space, using .zip when there’s a choice (the .tar.gz “tarball” versions also work but require a couple of extra steps).
  1. WordPress MU (wordpress-mu-2.9.1.1.zip, in my case)
  2. Buddymatic (buddymatic.0.9.6.3.1.zip, in my case)
  3. EarlyMorning (only one version, it seems)
  4. EarlyMorning-BP (only one version, it seems)

Only the WordPress MU archive is needed to install BuddyPress. The last three files are needed for EarlyMorning, a BuddyPress theme that I found particularly neat. It’s perfectly possible to install BuddyPress without this specific theme. (Although, doing so, you need to install a BuddyPress-compatible theme, if only by moving some folders to make the default theme available, as I explained in point 15 in that previous tutorial.) Buddymatic itself is a theme framework which includes some child themes, so you don’t need to install EarlyMorning. But installing it is easy enough that I’m adding instructions related to that theme.

These files can be uploaded anywhere in my FatCow space. I uploaded them to a kind of test/upload directory, just to make it clear, for me.

A major FatCow idiosyncrasy is its FileManager (actually called “FileManager Beta” in the documentation but showing up as “FileManager” in the cPanel). From my experience with both BlueHost and HostGator (two well-known webhosting companies), I can say that FC’s FileManager is quite limited. One thing it doesn’t do is uncompress archives. So I have to resort to the “Archive Gateway,” which is surprisingly slow and cumbersome.

At any rate, I used that Archive Gateway to uncompress the four files. WordPress µ first (in the root directory or “/”), then both Buddymatic and EarlyMorning in “/wordpress-mu/wp-content/themes” (you can choose the output directory for zip and tar files), and finally EarlyMorning-BP (anywhere, individual files are moved later). To uncompress each file, select it in the dropdown menu (it can be located in any subdirectory, Archive Gateway looks everywhere), add the output directory in the appropriate field in the case of Buddymatic or EarlyMorning, and press “Extract/Uncompress”. Wait to see a message (in green) at the top of the window saying that the file has been uncompressed successfully.

Then, in the FileManager, the contents of the EarlyMorning-BP directory have to be moved to “/wordpress-mu/wp-content/themes/earlymorning”. (I thought they could be uncompressed there directly, but that created an extra folder.) To move those files in the FileManager, I browse to that earlymorning-bp directory, click on the checkbox to select all, click on the “Move” button (fourth from right, marked with a blue folder), and add the output path: /wordpress-mu/wp-content/themes/earlymorning

These files are tweaks to make the EarlyMorning theme work with BuddyPress.

Then, I had to change two files, through the FileManager (it could also be done with an FTP client).

One change is to EarlyMorning’s style.css:

/wordpress-mu/wp-content/themes/earlymorning/style.css

There, “Template: thematic” has to be changed to “Template: buddymatic” (so, “the” should be changed to “buddy”).
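In other words, the theme header comment at the top of style.css should end up reading something like this (other header fields omitted; the Template line is the only one that changes):

```css
/*
Theme Name: EarlyMorning
Template: buddymatic
*/
```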

That change is needed because the EarlyMorning theme is a child theme of the “Thematic” WordPress parent theme. Buddymatic is a BuddyPress-savvy version of Thematic and this changes the child-parent relation from Thematic to Buddymatic.

The other change is in the Buddymatic “extensions”:

/wordpress-mu/wp-content/themes/buddymatic/library/extensions/buddypress_extensions.php

There, on line 39, “$bp->root_domain” should be changed to “bp_root_domain()”.

This change is needed because of something I’d consider a bug but that a commenter on another blog was kind enough to troubleshoot. Without this modification, the login button in BuddyPress wasn’t working because it was going to the website’s root (example.com/wp-login.php) instead of the WPµ installation (example.com/wordpress-mu/wp-login.php). I was quite happy to find this workaround but I’m not completely clear on the reason it works.

Then, something I did which might not be needed is to rename the “wordpress-mu” directory. Without that change, the BuddyPress installation would sit at “example.com/wordpress-mu,” which seems a bit cryptic for users. In my mind, “example.com/<name>,” where “<name>” is something meaningful like “social” or “community” works well enough for my needs. Because FatCow charges for subdomains, the “<name>.example.com” option would be costly.

(Of course, WPµ and BuddyPress could be installed in the site’s root and the frontpage for “example.com” could be the BuddyPress frontpage. But since I think of BuddyPress as an add-on to a more complete site, it seems better to have it as a level lower in the site’s hierarchy.)
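If you have shell access (locally, or on a host that offers it), that rename is a one-liner. Here’s a sketch using a stand-in directory; “social” is just an example of a friendlier name, and on FatCow itself the FileManager’s rename does the same job:

```shell
# Sketch: rename the uncompressed installation directory so BuddyPress
# lives at example.com/social instead of example.com/wordpress-mu.
mkdir -p wordpress-mu   # stand-in for the uncompressed WPµ archive
mv wordpress-mu social
ls -d social            # the install now answers under "social"
```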

With all of this done, the actual WPµ installation process can begin.

The first thing is to browse to that directory in which WPµ resides, either “example.com/wordpress-mu” or “example.com/<name>” with the “<name>” you chose. You’re then presented with the WordPress µ Installation screen.

Since FatCow charges for subdomains, it’s important to choose the following option: “Sub-directories (like example.com/blog1).” It’s actually by selecting the other option that I realized that FatCow restricted subdomains.

The Database Name, username and password are the ones you created initially with Manage MySQL. If you forgot that password, you can actually change it with that same tool.

An important FatCow-specific point, here, is that “Database Host” should be “<username>.fatcowmysql.com” (where “<username>” is your FatCow username). In my experience, other webhosts use “localhost” and WPµ defaults to that.
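The installer writes that value into wp-config.php; on FatCow, the generated line should end up looking something like this (username illustrative):

```php
<?php
// wp-config.php, as generated by the WPµ installer
define('DB_HOST', 'username.fatcowmysql.com'); // most hosts would use 'localhost' here
```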

You’re asked to give a name to your blog. In a way, though, if you think of BuddyPress as more of a platform than a blogging system, that name should be rather general. As you’re installing “WordPress Multi-User,” you’ll be able to create many blogs with more specific names, if you want. But the name you’re entering here is for BuddyPress as a whole. As with <name> in “example.com/<name>” (instead of “example.com/wordpress-mu”), it’s a matter of personal opinion.

Something I noticed with the EarlyMorning theme is that it’s a good idea to keep the main blog’s name relatively short. I used thirteen characters and it seemed to fit quite well.

Once you’re done filling in this page, WPµ is installed in a flash. You’re then presented with some information about your installation. It’s probably a good idea to note down some of that information, including the full paths to your installation and the administrator’s password.

But the first thing you should do, as soon as you log in with “admin” as username and the password provided, is probably to change that administrator password. (In fact, frequent advice in the WordPress community is to create a new administrator account with a username other than “admin” and then delete the “admin” account. Given some security issues with WordPress in the past, it seems like a good piece of advice. But I won’t describe it here. I did do it in my installation and it’s quite easy to do in WPµ.)

Then, you should probably enable plugins here:

example.com/<name>/wp-admin/wpmu-options.php#menu

(From what I understand, it might be possible to install BuddyPress without enabling plugins, since you’re logged in as the administrator, but it still makes sense to enable them and it happens to be what I did.)

You can also change a few other options, but these can be set at another point.

One option which is probably useful is this one:

Allow new registrations:
- Disabled
- Enabled. Blogs and user accounts can be created.
- Only user accounts can be created.

Obviously, it’s not necessary. But in the interest of opening up the BuddyPress installation to the wider world without worrying too much about a proliferation of blogs, it might make sense. You may end up with some fake user accounts, but that shouldn’t be a difficult problem to solve.

Now comes the installation of the BuddyPress plugin itself. You can do so by going here:

example.com/<name>/wp-admin/plugin-install.php

Then do a search for “BuddyPress” as a term. The plugin you want was authored by “The BuddyPress Community.” (In my case, version 1.1.3.) Click the “Install” link to bring up the installation dialog, then click “Install Now” to actually install the plugin.

Once the install is done, click the “Activate” link to complete the basic BuddyPress installation.

You now have a working installation of BuddyPress but the BuddyPress-savvy EarlyMorning isn’t enabled. So you need to go to “example.com/<name>/wp-admin/wpmu-themes.php” to enable both Buddymatic and EarlyMorning. You should then go to “example.com/<name>/wp-admin/themes.php” to activate the EarlyMorning theme.

Something which tripped me up because it’s now much easier than before is that forums (provided through bbPress) are now, literally, a one-click install. If you go here:

example.com/<name>/wp-admin/admin.php?page=bb-forums-setup

You can set up a new bbPress install (“Set up a new bbPress installation”) and everything will work wonderfully in terms of having forums fully integrated in BuddyPress. It’s so seamless that I wasn’t completely sure it had worked.

Besides this, I’d advise that you set up a few widgets for the BuddyPress frontpage. You do so through an easy-to-use drag-and-drop interface here:

example.com/<name>/wp-admin/widgets.php

I especially advise you to add the Twitter RSS widget because it seems to me to fit right in. If I’m not mistaken, the EarlyMorning theme contains specific elements to make this widget look good.

After that, you can just have fun with your new BuddyPress installation. The first thing I did was to register a new user. To do so, I logged out of my admin account and clicked on the Sign Up button. Since I “allow new registrations,” it’s a very simple process. In fact, this is one place where I think BuddyPress shines. Something I didn’t explain is that you can add a series of fields for that registration and the user profile which goes with it.

The whole process really shouldn’t take very long. In fact, the longest parts probably have to do with waiting for Archive Gateway.

The rest is “merely” to get people involved in your BuddyPress installation. It can happen relatively easily, if you already have a group of people trying to do things together online. But it can be much more complicated than any software installation process… 😉


Landing On His Feet: Nicolas Chourot

Listening to Nicolas Chourot‘s début album: First Landing (available on iTunes). Now, here’s someone who found his voice.

A few years ago, Nicolas Chourot played with us as part of Madou Diarra & Dakan, a group playing music created for Mali’s hunters’ associations.

Before Chourot joined us, I had been a member of Dakan for several years and my perspective on the group’s music was rather specific. As an ethnomusicologist working on the original context for hunters’ music, I frequently tried to maintain the connection with what makes Malian hunters so interesting, including a certain sense of continuity through widespread changes.

When Nicolas came up with his rather impressive equipment, I began to wonder how it would all fit. A very open-minded, respectful, and personable musician, Nicolas was able both to transform Dakan’s music from within and to adapt his playing to a rather distant performance style. Not an easy task for any musician, and Nicolas surely deserves to be commended for such a success.

After a while, Chourot and Dakan’s Madou Diarra parted ways. Still, Nicolas remained a member of the same informal music network as several people who had been in Dakan, including several of my good friends. And though I haven’t seen Nicolas in quite a while, he remains in my mind as someone whose playing and attitude toward music I enjoy.

Unfortunately, I was unable to attend Nicolas’s launch/show, on August 29. What’s strange is that it took me until today to finally buy Nicolas’s album. Not exactly sure why. Guess my mind was elsewhere. For months.

Ah, well… Sorry, Nicolas!

But I did finally get the album. And I’m really glad I did!

When I first heard Nicolas’s playing, I couldn’t help but think about Michel Cusson. I guess it was partly because both have been fusing Jazz and “World” versions of the electric guitar. But there was something else in Nicolas’s playing that I readily associated with Cusson. Never analyzed it. Nor am I planning to analyze it at any point. Despite my music school background and ethnomusicological training, I’ve rarely been one for formal analysis. But there’s something intriguing, there, as a connection. It’s not “imitation as sincerest form of flattery”: Chourot wasn’t copying Cusson. But it seemed like both were “drinking from the same spring,” so to speak.

In First Landing, this interpretation comes back to my mind.

See, not only does Chourot’s playing still have some Cussonisms, but I hear other voices connected to Cusson’s. Including that of Cusson’s former bandmate Alain Caron. And even Uzeb itself, the almost mythical band which brought Caron and Cusson together.

For a while, in the 1980s, Uzeb dominated a large part of Quebec’s local Jazz market. At the time, other Jazz players were struggling to get some recognition. As they do now. To an extent, Uzeb was a unique phenomenon in Quebec’s musical history since, despite their diversity and the quality of their work, Quebec’s Jazz musicians haven’t become mainstream again. Which might be a good thing but bears some reflection. What was so special about Uzeb? Why did it disappear? Can’t other Jazz acts fill the space left by Uzeb, after all these years?

I don’t think it’s what Nicolas is trying to do. But if he were, First Landing would be the way to go at it. It doesn’t “have all the ingredients.” That wouldn’t work. But, at the risk of sounding like an old cub scout, it has “the Uzeb spirit.”

Which brings me to other things I hear. Other bands with distinct, if indirect, Uzebian connections.

One is Jazzorange, which was a significant part of Lausanne’s Jazz scene when I was living there. My good friend Vincent Jaton introduced me to Jazzorange in 1994, and Uzeb’s alumni Caron and Cusson were definitely on my mind at the time.

Vincent, musician and producer extraordinaire, introduced me to a number of musicians and I owe him a huge debt for helping me along a path to musical (self-)discovery. Vincent’s own playing also shares a few things with what I hear in First Landing, but the connection with Jazzorange is more obvious, to me.

Another band I hear in connection to Chourot’s playing is Sixun. That French band, now 25 years old, is probably among the longest-lasting acts in this category of Jazz. Some Jazz ensembles are older (including one of my favourites, Oregon). But Sixun is a key example of what some people call “Jazz Fusion.”

Which is a term I avoided, as I mentioned diverse musicians. Not because I personally dislike the term. It’s as imprecise as any other term describing a “musical genre” (and as misleading as some of my pet peeves). But I’m not against its use, especially since there is a significant degree of agreement about several of the musicians I mention being classified (at least originally) as “Fusion.” Problem is, the term has also been associated with an attitude toward music which isn’t that conducive to thoughtful discussion. In some ways, “Fusion” is used for dismissal more than as a way to discuss musical similarities.

Still, there are musical features that I appreciate in a number of Jazz Fusion performances, some of which are found in some combination through the playing of several of the musicians I’m mentioning here.

Some things like the interactions between the bass and other instruments, some lyrical basslines, the fact that melodic lines may be doubled by the bass… Basically, much of it has to do with the bass. And, in Jazz, the bass is often key. As Darcey Leigh said to Dale Turner (Lonette McKee and Dexter Gordon’s characters in ‘Round Midnight):

You’re the one who taught me to listen to the bass instead of the drums.

Actually, there might be a key point about the way yours truly listens to bass players. Even though I’m something of a “frustrated bassist” (but happy saxophonist), I probably have a limited understanding of bass playing. To me, there’s a large variety of styles of bass playing, of course, but several players seem to sound a bit like one another. It’s not really a full classification that I have in my mind but I can’t help but hear similarities between bass performers. Like clusters.

Sometimes, these links may go outside of the music domain, strictly speaking. For instance, three of my favourite bassists are from Cameroon: Guy Langue, Richard Bona, and Étienne Mbappe. Not that I heard these musicians together: I noticed Mbappe as a member of ONJ in 1989, I first heard Bona as part of the Zawinul Syndicate in 1997, and I’ve been playing with Langue for a number of years (mostly with Madou Diarra & Dakan). Further, as I’m discovering British/Nigerian bass player Michael Olatuja, I get to extend what I hear as the Cameroonian connection to parts of West African music that I know a bit more about. Of course, I might be imagining things. But my imagination goes in certain directions.

Something similar happens to me with “Fusion” players. Alain Caron is known for his fretless bass sound and virtuosic playing, but it’s not really about that, I don’t think. It’s something about the way the bass is embedded in the rest of the band, with something of a Jazz/Rock element but also more connected to lyricism, complex melodic lines, and relatively “clean” playing. The last one may relate, somehow, to the Fusion stereotype of coldness and machine-like precision. But my broad impression of what I might call “Fusion bass” actually involves quite a bit of warmth. And humanness.

Going back to Chourot and other “Jazz Fusion” acts I’ve been thinking about, it’s quite possible that Gilles Deslauriers (who plays bass on Chourot’s First Landing) is the one who reminds me of other Fusion acts. No idea if Bob Laredo (Jazzorange), Michel Alibo (Sixun), Alain Caron (Uzeb), and Gilles Deslauriers really all have something in common. But my own subjective assessment of bass playing connects them in a special way.

The most important point, to me, is that even if this connection is idiosyncratic, it still helps me enjoy First Landing.

Nicolas Chourot and his friends from that album (including Gilles Deslauriers) are playing at O Patro Výš, next Saturday (January 23, 2010).


Homeroasting and Coffee Geekness

I’m a coffee geek. By which I mean that I have a geeky attitude to coffee. I’m passionate about the crafts and arts of coffee making, I seek coffee-related knowledge wherever I can find it, I can talk about coffee until people’s eyes glaze over (which happens more quickly than I’d guess possible), and I even dream about coffee gadgets. I’m not a typical gadget freak, as far as geek culture goes, but coffee is one area where I may invest in some gadgetry.

Perhaps my most visible acts of coffee geekery came in the form of updates I posted through diverse platforms about my home coffee brewing experiences. Did it from February to July. These posts contained cryptic details about diverse measurements, including water temperature and index of refraction. It probably contributed to people’s awareness of my coffee geek identity, which itself has been the source of fun things like a friend bringing me back coffee from Ethiopia.

But I digress, a bit. This is both about coffee geekness in general and about homeroasting in particular.

See, I bought myself this Hearthware i-Roast 2 dedicated homeroasting device. And I’m dreaming about coffee again.

Been homeroasting since December 2002, when I moved to Moncton, New Brunswick and was lucky enough to get in touch with Terry Montague of Down East Coffee.

Though I had been wishing to homeroast for a while before that and had become an intense coffee-lover fifteen years prior to contacting him, Terry is the one who enabled me to start roasting green coffee beans at home. He procured me a popcorn popper, sourced me some quality green beans, gave me some advice. And off I was.

Homeroasting is remarkably easy. And it makes a huge difference in one’s appreciation of coffee. People in the coffee industry, especially baristas and professional roasters, tend to talk about the “channel” going from the farmer to the “consumer.” In some ways, homeroasting gets the coffee-lover a few steps closer to the farmer, both by eliminating a few intermediaries in the channel and by making coffee into much less of a commodity. Once you’ve spent some time smelling the fumes emanated by different coffee varietals and looking carefully at individual beans, you can’t help but get a deeper appreciation for the farmer’s and even the picker’s work. When you roast 150g or less at a time, every coffee bean seems much more valuable. Further, as you experiment with different beans and roast profiles, you get to experience coffee in all of its splendour.

A popcorn popper may sound like a crude way to roast coffee. And it might be. Naysayers may be right in their appraisal of poppers as a coffee roasting method. You’re restricted in different ways and it seems impossible to produce exquisite coffee. But having roasted with a popper for seven years, I can say that my poppers gave me some of my most memorable coffee experiences. Including some of the most pleasant ones, like this organic Sumatra from Theta Ridge Coffee that I roasted in my campus apartment at IUSB and brewed using my beloved Brikka.

Over the years, I’ve roasted a large variety of coffee beans. I typically buy a pound each of three or four varietals and experiment with them for a while.

Mostly because I’ve been moving around quite a bit, I’ve been buying green coffee beans from a rather large variety of places. I try to buy them locally, as much as possible (those beans have travelled far enough and I’ve had enough problems with courier companies). But I did participate in a few mail orders or got beans shipped to me for some reason or another. Sourcing green coffee beans has almost been part of my routine in those different places where I’ve been living since 2002: Moncton, Montreal, Fredericton, South Bend, Northampton, Brockton, Cambridge, and Austin. Off the top of my head, I’ve sourced beans from:

  1. Down East
  2. Toi, moi & café
  3. Brûlerie Saint-Denis
  4. Brûlerie des quatre vents
  5. Terra
  6. Theta Ridge
  7. Dean’s Beans
  8. Green Beanery
  9. Cuvée
  10. Fair Bean
  11. Sweet Maria’s
  12. Evergreen Coffee
  13. Mon café vert
  14. Café-Vrac
  15. Roastmasters
  16. Santropol

And probably a few other places, including this one place in Ethiopia where my friend Erin bought some.

So, over the years, I got beans from a rather large array of places and from a wide range of regional varietals.

I rapidly started blending freshly-roasted beans. Typically, I would start a blend by roasting three batches in a row. I would taste some as “single origin” (coffee made from a single bean varietal, usually from the same farm or estate), shortly after roasting. But, typically, I would mix my batches of freshly roasted coffee to produce a main blend. I would then add fresh batches after a few days to fine-tune the blend to satisfy my needs and enhance my “palate” (my ability to pick up different flavours and aromas).

Once the quantity of green beans in a particular bag would fall below an amount I can reasonably roast as a full batch (minimum around 100g), I would put those green beans in a pre-roast blend, typically in a specially-marked ziplock bag. Roasting this blend would usually be a way for me to add some complexity to my roasted blends.

And complexity I got. Lots of diverse flavours and aromas. Different things to “write home about.”

But I was obviously limited in what I could do with my poppers. The only real controls that I had in homeroasting, apart from blending, consisted in the bean quantity and roasting time. Ambient temperature was clearly a factor, but not one over which I was able to exercise much control. Especially since I frequently ended up roasting outside, so as not to inconvenience people with fumes, noise, and chaff. The few homeroast batches which didn’t work probably failed because of low ambient temperature.

One reason I stuck with poppers for so long was that I had heard that dedicated roasters weren’t that durable. I’ve probably used three or four different hot air popcorn poppers, over the years. Eventually, they just stop working, when you use them for coffee beans. As I’d buy them at garage sales and Salvation Army stores for 3-4$, replacing them didn’t feel like such a financially difficult thing to do, though finding them could occasionally be a challenge. Money was also an issue. Though homeroasting was important for me, I wasn’t ready to pay around 200$ for an entry-level dedicated roaster. I was thinking about saving money for a Behmor 1600, which offers several advantages over other roasters. But I finally gave in and bought my i-Roast as a kind of holiday gift to myself.

One broad reason is that my financial situation has improved since I started a kind of partial professional reorientation (PPR). I have a blogpost in mind about this PPR, and I’ll probably write it soon. But this post isn’t about my PPR.

Although, the series of events which led to my purchase does relate to my PPR, somehow.

See, the beans I (indirectly) got from Roastmasters came from a friend who bought a Behmor to roast cocoa beans. The green coffee beans came with the roaster but my friend didn’t want to roast coffee in his brand new Behmor, to avoid the risk of coffee oils and flavours getting into his chocolate. My friend asked me to roast some of these beans for his housemates (he’s not that intensely into coffee, himself). When I went to drop some homeroasted coffee by the Station C co-working space where he spends some of his time, my friend was discussing a project with Duncan Moore, whom I had met a few times but with whom I had had few interactions. The three of us had what we considered a very fruitful yet very short conversation. Later on, I got to do a small but fun project with Duncan. And I decided to invest that money into coffee.

A homeroaster seemed like the most appropriate investment. The Behmor was still out of reach but the i-Roast seemed like a reasonable purchase. Especially if I could buy it used.

But I was also thinking about buying it new, as long as I could get it quickly. It took me several years to make a decision about this purchase but, once I made it, I wanted something as close to “instant gratification” as possible. In some ways, the i-Roast was my equivalent to Little Mrs Sommers‘s “pair of silk stockings.”

At the time, Mon café vert seemed like the only place where I could buy a new i-Roast. I tried several times to reach them, to no avail. As I was in the Mile-End when I decided to make that purchase, I went to Caffè in Gamba, both to use the WiFi signal and to check if, by any chance, they might not have started selling roasters. They didn’t, of course; homeroasting isn’t mainstream enough. But, as I was there, I saw the Hario Ceramic Coffee Mill Skerton, a “hand-cranked” coffee grinder about which I had read some rather positive reviews.

For the past few years, I had been using a Bodum Antigua conical burr electric coffee grinder. This grinder was doing the job, but maybe because of “wear and tear,” it started taking a lot longer to grind a small amount of coffee. The grind took so long, at some points, that the grounds were warm to the touch and it seemed like the grinder’s motor was itself heating.

So I started dreaming about the Baratza Vario, a kind of prosumer electric grinder which seemed like the ideal machine for someone who uses diverse coffee making methods. The Vario is rather expensive and seemed like overkill, for my current coffee setup. But I was lusting over it and, yes, dreaming about it.

One day, maybe, I’ll be able to afford a Vario.

In the meantime, and more reasonably, I had been thinking about “Turkish-style mills.” A friend lent me a box-type manual mill at some point and I did find it produced a nice grind, but it wasn’t that convenient for me, partly because the coffee drops into a small drawer which rapidly gets full. A handmill seemed somehow more convenient and there are some generic models which are sold in different parts of the World, especially in the Arab World. So I got the impression that I might be able to find handmills locally and started looking for them all over the place, enquiring at diverse stores and asking friends who have used those mills in the past. Of course, they can be purchased online. But they end up being relatively expensive and my experience with the box-type mill wasn’t so positive as to convince me to spend so much money on one.

The Skerton was another story. It was much more convenient than a box-type manual mill. And, at Gamba, it was inexpensive enough for me to purchase it on the spot. I don’t tend to do this very often so I did feel strange about such an impulse purchase. But I certainly don’t regret it.

Especially since it complements my other purchases.

So, going to the i-Roast.

Over the years, I had been looking for the i-Roast and Behmor at most of the obvious sites where one might buy used devices like these. eBay, Craig’s List, Kijiji… As a matter of fact, I had seen an i-Roast on one of these, but I was still hesitating. Not exactly sure why, but it probably had to do with the fact that these homeroasters aren’t necessarily that durable and I couldn’t see how old this particular i-Roast was.

I eventually called to find out, after making my decision to get an i-Roast. Turns out that it’s still under warranty, is in great condition, and was being sold by a very interesting (and clearly trustworthy) alto singer who happens to sing with a friend of mine who is also a local beer homebrewer. The same day I bought the roaster, I went to the cocoa-roasting friend’s place and saw a Behmor for the first time. And I tasted some really nice homemade chocolate. And met other interesting people, including a couple that I saw again while taking the bus after purchasing the roaster.

The series of coincidences in that whole situation struck me with a sense of awe. Not out of some strange superstition or other folk belief. But different things all fit neatly together in a way that most of my life doesn’t. Nothing weird about this. The packaging is easy to explain and mostly comes from my own perception. Still, the effect is there: it all fits.

And the i-Roast 2 itself fits, too.

It’s clearly not the ultimate coffee geek’s ideal roaster. But I get the impression it could become so. In fact, one reason I hesitated to buy the i-Roast 2 is that I was wondering if Hearthware might be coming out with the i-Roast 3, in the not-so-distant future.

I’m guessing that Hearthware might be getting ready to release a new roaster. I’m going on unreliable information, but it’s still an educated guess. So, apparently…

I could just imagine what the i-Roast 3 might be. As I’m wont to do, I have a number of crazy ideas.

One “killer feature” actually relates both to the differences between the i-Roast and i-Roast 2 as well as to the geek factor behind homeroasting: roast profiles as computer files. Yes, I know, it sounds crazy. And, somehow, it’s quite unlikely that Hearthware would add such a feature on an entry-level machine. But I seriously think it’d make the roaster much closer to a roasting geek’s ultimate machine.

For one thing, programming a roast profile on the i-Roast is notoriously awkward. Sure, you get used to it. But it’s clearly suboptimal. And one major improvement of the i-Roast 2 over the original i-Roast is that the original version didn’t maintain profiles if you unplugged it. The next step, in my mind, would be to have some way to transfer a profile from a computer to the roaster, say via a slot for SD cards or even a USB port.

What this would open isn’t only the convenience of saving profiles, but actually a way to share them with fellow homeroasters. Since a lot in geek culture has to do with sharing information, a neat effect could come out of shareable roast profiles. In fact, when I looked for example roast profiles, I found forum threads, guides, and incredibly elaborate experiments. Eventually, it might be possible to exchange roasting profiles relating to coffee beans from the same shipment and compare roasting. Given the well-known effects of getting a group of people using online tools to share information, this could greatly improve the state of homeroasting and even make it break out of the very small niche in which it currently sits.
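Purely hypothetical, but a shareable roast profile could be as simple as a small JSON file. The stage structure in this sketch mirrors the temperature/time stages the i-Roast uses for its on-board programming; the field names and values are my own invention, not anything Hearthware supports:

```python
import json

# Hypothetical file format for sharing roast profiles between homeroasters.
profile = {
    "roaster": "i-Roast 2",
    "batch_grams": 130,
    "stages": [
        # Each stage: a target temperature (°F) held for a duration (minutes),
        # echoing how the i-Roast's on-board programming works.
        {"temp_f": 385, "minutes": 3.0},
        {"temp_f": 420, "minutes": 4.0},
        {"temp_f": 450, "minutes": 2.5},
    ],
}

encoded = json.dumps(profile, indent=2)  # what would travel over SD card or USB
decoded = json.loads(encoded)            # what the roaster (or a friend) would read back
total_minutes = sum(s["minutes"] for s in decoded["stages"])
```

A plain-text format like this is exactly what would make profiles easy to post on forums and compare across shipments of the same beans.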

Of course, there are many problems with that approach, including things as trivial as voltage differences as well as bigger issues such as noise levels.

But I’m still dreaming about such things.

In fact, I go a few steps further. A roaster which could somehow connect to a computer might also be used to track data about temperature and voltage. In my own experiments with the i-Roast 2, I’ve been logging temperatures at 15 second intervals along with information about roast profile, quantity of beans, etc. It may sound extreme but it already helped me achieve a result I wanted to achieve. And it’d be precisely the kind of information I would like to share with other homeroasters, eventually building a community of practice.
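As a minimal sketch of what can be done with those 15-second readings (the numbers below are made up, not an actual roast log): consecutive temperatures yield the “rate of rise,” a figure roasting geeks like to track.

```python
def rate_of_rise(readings, interval_s=15):
    """Degrees per minute between consecutive readings taken interval_s seconds apart."""
    per_minute = 60 / interval_s
    return [(b - a) * per_minute for a, b in zip(readings, readings[1:])]

# Hypothetical log: temperature (°C) every 15 seconds over the first minute of a roast.
log = [75, 93, 110, 126, 140]
ror = rate_of_rise(log)  # °C per minute between successive readings
```

Shared alongside a roast profile, this kind of derived data would give homeroasters a common vocabulary for comparing results.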

Nothing but geekness, of course. Shall the geek inherit the Earth?


Development and Quality: Reply to Agile Diary

Former WiZiQ product manager Vikrama Dhiman responded to one of my tweets with a full-blown blogpost, thereby giving support to Matt Mullenweg‘s point that microblogging goes hand-in-hand with “macroblogging.”

My tweet:

enjoys draft æsthetics yet wishes more developers would release stable products. / adopte certains produits trop rapidement. [“adopts some products too quickly.”]

Vikrama’s post:

Good Enough Software Does Not Mean Bad Software « Agile Diary, Agile Introduction, Agile Implementation.

My reply:

“To an engineer, good enough means perfect. With an artist, there’s no such thing as perfect.” (Alexander Calder)

Thanks a lot for your kind comments. I’m very happy that my tweet (and status update) triggered this.

A bit of context for my tweet (actually, a post from Ping.fm, meant as a status update, thereby giving support in favour of conscious duplication, “no offence to those campaigning against duplication”).

I’ve been thinking about what I call the “draft æsthetics.” In fact, I did a podcast episode about it. My description of that episode was:

Sometimes, there is such a thing as “Good Enough.”

Though I didn’t emphasize the “sometimes” part in that podcast episode, it was an important part of what I wanted to say. In fact, my intention wasn’t to defend draft æsthetics but to note that there seems to be a tendency toward this æsthetic mode. I do situate myself within that mode in many things I do, but it really doesn’t mean that this mode should be the exclusive one used in any context.

That aforequoted tweet was thus a response to my podcast episode on draft æsthetics. “Yes, ‘good enough’ may work, sometimes. But it needs not be applied in all cases.”

As I often get into convoluted discussions with people who seem to think that I condone or defend a position because I take it for myself, the main thing I’d say there is that I’m not only a relativist but I cherish nuance. In other words, my tweet was a way to qualify the core statement I was talking about in my podcast episode (that “good enough” exists, at times). And that statement isn’t necessarily my own. I notice a pattern by which this statement seems to be held as accurate by people. I share that opinion, but it’s not a strongly held belief of mine.

Of course, I digress…

So, the tweet which motivated Vikrama had to do with my approach to “good enough.” In this case, I tend to think about writing but in view of Eric S. Raymond’s approach to “Release Early, Release Often” (RERO). So there is a connection to software development and geek culture. But I think of “good enough” in a broader sense.

Disclaimer: I am not a coder.

The Calder quote remained in my head, after it was mentioned by a colleague who had read it in a local newspaper. One reason it struck me is that I spend some time thinking about artists and engineers, especially in social terms. I spend some time hanging out with engineers but I tend to be more on the “artist” side of what I perceive to be an axis of attitudes found in some social contexts. I do get a fair deal of flack for some of my comments on this characterization and it should be clear that it isn’t meant to imply any evaluation of individuals. But, as a model, the artist and engineer distinction seems to work, for me. In a way, it seems more useful than the distinction between science and art.

An engineer friend with whom I discussed this kind of distinction was quick to point out that, to him, there’s no such thing as “good enough.” He was also quick to point out that engineers can be creative and so on. But the point isn’t to exclude engineers from artistic endeavours. It’s to describe differences in modes of thought, ways of knowing, approaches to reality. And the way these are perceived socially. We could do a simple exercise with terms like “troubleshooting” and “emotional” to be assigned to the two broad categories of “engineer” and “artist.” Chances are that clear patterns would emerge. Of course, many concepts are as important to both sides (“intelligence,” “innovation”…) and they may also be telling. But dichotomies have heuristic value.

Now, to go back to software development, the focus in Vikrama’s Agile Diary post…

What pushed me to post my status update and tweet is in fact related to software development. Contrary to what Vikrama presumes, it wasn’t about a Web application. And it wasn’t even about a single thing. But it did have to do with firmware development and with software documentation.

The first case is that of my Fonera 2.0n router. Bought it in early November and I wasn’t able to connect to its private signal using my iPod touch. I could connect to the router using the public signal, but that required frequent authentication, as annoying as with ISF. Since my iPod touch is my main WiFi device, this issue made my Fonera 2.0n experience rather frustrating.

Of course, I’ve been contacting Fon’s tech support. As is often the case, that experience was itself quite frustrating. I was told to reset my touch’s network settings, which forced me to reauthenticate my touch on a number of networks I access regularly and only solved the problem temporarily. The same tech support person (or, at least, somebody using the same name) had me repeat the same description several times in the same email message. Perhaps unsurprisingly, I was also told to use third-party software which had nothing to do with my issue. All in all, your typical tech support experience.

But my tweet wasn’t really about tech support. It was about the product. Though I find the overall concept behind the Fonera 2.0n router very interesting, its implementation seems to me to be lacking. In fact, it reminds me of several FLOSS development projects that I’ve been observing and, to an extent, benefitting from.

This is rapidly transforming into a rant I’ve had in my “to blog” list for a while about “thinking outside the geek box.” I’ll try to resist the temptation, for now. But I can mention a blog thread which has been on my mind, in terms of this issue.

Firefox 3 is Still a Memory Hog — The NeoSmart Files.

The blogpost refers to a situation in which, according to at least some users (including the blogpost’s author), Firefox uses up more memory than it should and becomes difficult to use. The thread has several comments supporting statements about the relatively poor performance of Firefox on people’s systems, but it also has “contributions” from an obvious troll, who keeps pinning the problem on the users.

The thing about this is that it’s representative of a tricky issue in the geek world, whereby developers and users are perceived as belonging to two sides of a type of “class struggle.” Within the geek niche, users are often dismissed as “lusers.” Tech support humour includes condescending jokes about “code 6”: “the problem is 6″ from the screen.” The aforementioned Eric S. Raymond wrote a rather popular guide to asking questions in geek circles which seems surprisingly unaware of social and cultural issues, especially from someone with an anthropological background. Following that guide, one should switch their mind to that of a very effective problem-solver (i.e., the engineer frame) to ask questions “the smart way.” Not only is the onus on users, but any failure to comply with these rules may be met with this air of intellectual superiority encoded in that guide. IOW, “Troubleshoot now, ask questions later.”

Of course, many users are “guilty” of all sorts of “crimes” having to do with not reading the documentation which comes with the product or with simply not thinking about the issue with sufficient depth before contacting tech support. And as the majority of the population is on the “user” side, the situation can be described as both a form of marginalization (geek culture comes from “nerd” labels) and a matter of elitism (geek culture as self-absorbed).

This does have something to do with my Fonera 2.0n. With it, I was caught in this dynamic whereby I had to switch to the “engineer frame” in order to solve my problem. I eventually did solve my Fonera authentication problem, using a workaround mentioned in a forum post about another issue (free registration required). Turns out, the “release candidate” version of my Fonera’s firmware does solve the issue. Of course, this new firmware may cause other forms of instability and installing it required a bit of digging. But it eventually worked.

The point is that, as released, the Fonera 2.0n router is a geek toy. It’s unpolished in many ways. It’s full of promise in terms of what it may make possible, but it failed to deliver in terms of what a router should do (route a signal). In this case, I don’t consider it to be a finished product. It’s not necessarily “unstable” in the strict sense that a software engineer might use the term. In fact, I hesitated between different terms to use instead of “stable,” in that tweet, and I’m not that happy with my final choice. The Fonera 2.0n isn’t unstable. But it’s akin to an alpha version released as a finished product. That’s something we see a lot of, these days.

The main other case which prompted me to send that tweet is “CivRev for iPhone,” a game that I’ve been playing on my iPod touch.

I’ve played with different games in the Civ franchise and I even used the FLOSS version on occasion. Not only is “Civilization” a geek classic, but it does connect with some anthropological issues (usually in a problematic view: Civ’s worldview lacks anthro’s insight). And it’s the kind of game that I can easily play while listening to podcasts (I subscribe to a number of those).

What’s wrong with that game? Actually, not much. I can’t even say that it’s unstable, unlike some other items in the App Store. But there are a few things which aren’t optimal in terms of documentation. Not that it’s difficult to figure out how the game works. But the game is complex enough that some documentation is quite useful. Especially since it does change between one version of the game and another. Unfortunately, the online manual isn’t particularly helpful. Oh, sure, it probably contains all the information required. But it’s not available offline, isn’t optimized for the device it’s supposed to be used with, doesn’t contain proper links between sections, isn’t directly searchable, and isn’t particularly well-written. Not to mention that it seems to only be available in English even though the game itself is available in multiple languages (I play it in French).

Nothing tragic, of course. But coupled with my Fonera experience, it contributed to both a slight sense of frustration and this whole reflection about unfinished products.

Sure, it’s not much. But it’s “good enough” to get me started.


Groupthink in Action

An interesting situation which, I would argue, is representative of Groupthink.

As a brief summary of the situation: a subgroup within a larger group is discussing the possibility of changing the larger group’s structure. In that larger group, similar discussions have been quite frequent, in the past. In effect, the smaller group is moving toward enacting a decision based on perceived consensus as to “the way to go.”

No bad intention on anyone’s part and the situation is far from tragic. But my clear impression is that groupthink is involved. I belong to the larger group but I feel little vested interest in what might happen with it.

An important point about this situation is that the smaller group seems to be acting as if the decision had already been made, after careful consideration. Through the history of the larger group, prior discussions on the same topic have been frequent. Through these discussions, clear consensus has never been reached. At the same time, some options have been gaining some momentum in the recent past, mostly based (in my observation) on accumulated frustration with the status quo and some reflection on the effectiveness of activities done by subgroups within the larger group. Members of that larger group (including participants in the smaller group) are quite weary of rehashing the same issues and the “rallying cry” within the subgroup has to do with “moving on.” Within the smaller group, prior discussions are described as if they had been enough to explore all the options. Weariness throughout the group as a whole seems to create a sense of urgency even though the group as a whole could hardly be described as being involved in time-critical activities.

Nothing personal about anyone involved and it’s possible that I’m off on this one. Where some of those involved would probably disagree is in terms of the current stage in the decision making process (i.e., they may see themselves as having gone through the process of making the primary decision, the rest is a matter of detail). I actually feel strange talking about this situation because it may seem like I’m doing the group a disservice. The reason I think it isn’t the case is that I have already voiced my concerns about groupthink to those who are involved in the smaller group. The reason I feel the urge to blog about this situation is that, as a social scientist, I take it as my duty to look at issues such as group dynamics. Simply put, I started thinking about it as a kind of “case study.”

Yes, I’m a social science geek. And proud of it, too!

Thing is, I have a hard time not noticing a rather clear groupthink pattern. Especially when I think about a few points in Janis’s description of groupthink.

Antecedent Conditions

  • Insulation of the group
  • High group cohesiveness
  • Directive leadership
  • Lack of norms requiring methodical procedures
  • Homogeneity of members’ social background and ideology
  • High stress from external threats with low hope of a better solution than the one offered by the leader(s)

Symptoms

  • Illusion of invulnerability
  • Unquestioned belief in the inherent morality of the group
  • Collective rationalization of group’s decisions
  • Shared stereotypes of outgroup, particularly opponents
  • Self-censorship; members withhold criticisms
  • Illusion of unanimity (see false consensus effect)
  • Direct pressure on dissenters to conform
  • Self-appointed “mindguards” protect the group from negative information

Decisions Affected

  • Incomplete survey of alternatives
  • Incomplete survey of objectives
  • Failure to examine risks of preferred choice
  • Failure to re-appraise initially rejected alternatives
  • Poor information search
  • Selective bias in processing information at hand (see also confirmation bias)
  • Failure to work out contingency plans

A PDF version, with some key issues highlighted.

Point by point…

Observable

Antecedent Conditions of Groupthink

Insulation of the group

A small subgroup was created based on (relatively informal) prior expression of opinion in favour of some broad changes in the structure of the larger group.

Lack of norms requiring methodical procedures

Methodical procedures about assessing the situation are either put aside or explicitly rejected.
Those methodical procedures which are accepted have to do with implementing the group’s primary decision, not with the decision making process.

Symptoms Indicative of Groupthink

Illusion of unanimity (see false consensus effect)

Agreement is stated as a fact, possibly based on private conversations outside of the small group.

Direct pressure on dissenters to conform

A call to look at alternatives is constructed as a dissenting voice.
Pressure to conform is couched in terms of “moving on.”

Symptoms of Decisions Affected by Groupthink

Incomplete survey of alternatives

Apart from the status quo, no alternative has been discussed.
When one alternative model is proposed, it’s reduced to a “side” in opposition to the assessed consensus.

Incomplete survey of objectives

Broad objectives are assumed to be common, left undiscussed.
Discussion of objectives is pushed back as being irrelevant at this stage.

Failure to examine risks of preferred choice

Comments about possible risks (including the danger of affecting the dynamics of the existing broader group) are left undiscussed or dismissed as “par for the course.”

Failure to re-appraise initially rejected alternatives

Any alternative is conceived as having been tried in the past with the strong implication that it isn’t worth revisiting.

Poor information search

Information collected concerns ways to make sure that the primary option considered will work.

Failure to work out contingency plans

Comments about the possible failure of the plan, and its effects on the wider group, are met with “so be it.”

Less Obvious

Antecedent Conditions of Groupthink

High group cohesiveness

The smaller group is highly cohesive but so is the broader group.

Directive leadership

Several members of the smaller group are taking positions of leadership, but there’s no direct coercion from that leadership.

Positions of authority are asserted, in a subtle way, but this authority is somewhat indirect.

Homogeneity of members’ social background and ideology

As with cohesiveness, homogeneity of social background can be used to describe the broader group as well as the smaller one.

High stress from external threats with low hope of a better solution than the one offered by the leader(s)

External “threats” are mostly subtle but there’s a clear notion that the primary option considered may be met with some opposition by a proportion of the larger group.

Symptoms Indicative of Groupthink

Illusion of invulnerability

While “invulnerability” would be an exaggeration, there’s a clear sense that members of the smaller group have a strong position within the larger group.

Unquestioned belief in the inherent morality of the group

Discussions don’t necessarily have a moral undertone, but the smaller group’s goals seem self-evident in the context or, at least, not really worth careful discussion.

Collective rationalization of group’s decisions

Since attempts to discuss the group’s assumed consensus are labelled as coming from a dissenting voice, the group’s primary decision is reified through countering individual points made about this decision.

Shared stereotypes of outgroup, particularly opponents

The smaller group’s primary “outgroup” is in fact the broader group, described in rather simple terms, not a distinct group of people.
The assumption is that, within the larger group, positions about the core issue are already set.

Self-censorship; members withhold criticisms

Self-censorship is particularly hard to observe or assess, but the group’s dynamics tend to construct criticism as “nitpicking,” making it difficult to share comments.

Self-appointed “mindguards” protect the group from negative information

As with leadership, the process of shielding the smaller group from negative information is mostly organic, not located in a single individual.
Because the smaller group is already set apart from the larger group, protection from external information is built into the system, to an extent.

Symptoms of Decisions Affected by Groupthink

Selective bias in processing information at hand (see also confirmation bias)

Information brought into the discussion is treated as either reinforcing the group’s alleged consensus or taken to be easy to counter.
Examples from cases showing clear similarities are dismissed (“we have no interest in knowing what others have done”) and distant cases are used to demonstrate that the approach is sound (“there are groups in other contexts which work, so we can use the same approach”).


Profiles and the Social Web

I was writing this message to a friend, about my experience on the xkcd.com site.

 

What? Oh, no, the 'Enchanted' soundtrack was just playing because Pandora's algorithms are terrible. [silence] ... (quietly) That's how you knooooooow ...

xkcd comic

It’s on xkcd, but it could be anywhere. Nothing really special, but it gets me thinking about what the social web really is, these days. Especially once you step outside the geek niche.

So…

  • I see the latest xkcd.
  • It gets a reaction out of me.
  • I want to reply.
  • I know there are forums accompanying these comics.
  • I go to the forum thread for this one (already a few clicks, and I had to know all of this existed in the first place).
  • I click Post Reply.
  • It asks me to log in.
  • Since I think I’ve already posted something there, I log in with my usual username.
  • Oops, wrong password.
  • I click “forget pw.”
  • Whoops! I had no account under my Gmail address (it has to be the right combination, so if I can’t remember my username, it doesn’t work).
  • I create a new profile.
  • The captcha is illegible; it takes me several tries.
  • I have to go to my Gmail account to activate my account on the xkcd forums.
  • Once that’s done, I land on the forums’ home page (not the page where I was trying to post my reply).
  • I find the page I wanted again.
  • I click Post Reply.
  • I write my reply and send it.
  • Of course, my profile is blank.
  • I go to fix that.
  • It starts with my ICQ number?? Well, then!
  • Further down, I see fields for Website and Interests. I fill those in quickly, keeping things generic.
  • There’s also my birthday. No way to control who sees it, etc. I leave it out.
  • I save the other changes.
  • Then I try to change my avatar.
  • There’s no upload button.
  • It goes through a Gallery, but there’s nothing in it.
  • I give up, even though I know full well that xkcd geeks are the type to mock you for a generic profile.
  • I leave the site a bit frustrated, without really feeling like I’ll get a conversation started about this.

Second scenario.

I land on a site that supports Disqus (Mashable, for example).

  • I can post a comment as a guest.

You are commenting as a Guest. Optional: Login below.

So, if I just want to leave an anonymous comment, that’s all I have to do. “Thanks, good night!”

Even without logging in, I can do things with the comments already there (Like, Reply).

But I can also log in with my Disqus, Facebook (via Facebook Connect), or Twitter (via OAuth) profiles. In each case, if I’m already logged into that account in my browser, I just have to click to authorize access. Even if I’m not already logged in, I can sign in directly on each site.
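Since I’m not a coder, take this only as a rough sketch: the “click to authorize” round trip these logins rely on can be approximated as follows. Everything here, the provider URL, the client ID, and the callback address, is invented for illustration; each provider (Twitter’s OAuth, Facebook Connect, Disqus) has its own specifics.

```python
# Illustrative sketch of an OAuth-style redirect login. All URLs and
# identifiers below are made up; no real provider endpoints are used.
import secrets
from urllib.parse import urlencode, urlparse, parse_qs

def build_authorize_url(client_id: str, redirect_uri: str) -> tuple[str, str]:
    """Step 1: send the visitor to the provider, with a random `state` token."""
    state = secrets.token_urlsafe(16)  # guards against forged callbacks (CSRF)
    params = urlencode({
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "state": state,
    })
    return f"https://provider.example/oauth/authorize?{params}", state

def handle_callback(callback_url: str, expected_state: str) -> str:
    """Step 2: the provider redirects back; check `state`, keep the code."""
    query = parse_qs(urlparse(callback_url).query)
    if query.get("state", [None])[0] != expected_state:
        raise ValueError("state mismatch: possible CSRF")
    # In a real flow, this code would next be exchanged for an access token.
    return query["code"][0]

url, state = build_authorize_url("my-app-id", "https://myblog.example/callback")
# Simulate the provider sending the visitor back with an authorization code:
code = handle_callback(f"https://myblog.example/callback?code=abc123&state={state}", state)
print(code)
```

The point of the `state` token is exactly the “just click to authorize” convenience: since the visitor never types a password on the commenting site, the site needs some way to confirm that the browser coming back is the one it sent out.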

After logging in, I’m taken right back to the page I was on. My avatar shows up, but I can change it. I can also change my username, but it’s already filled in. My avatar and my name are tied to a fairly complete profile, which includes my latest comments on Disqus-enabled sites.

On the site where I’m commenting, there’s a little box with a summary of my profile, which includes a comment count, the number of comments I’ve marked as “likes,” and points I’ve earned.

I can send my comment to Twitter and Facebook at the same time. I can choose to receive email notifications or subscribe to the RSS feed. I can see right away which account I’m using (Post as…) and I can switch accounts if I want (personal and professional, for example). Once I post my comment, other visitors to the site can see more info about me by hovering over my avatar, and they can click to get a modal dialog with a summary of my account. That summary obviously leads to the full profile. From the full profile, people can follow my comments or explore various aspects of my online life.

Following my comment, people can also reply to me directly, either anonymously or under their own identities.

So I get a rich profile in two clicks, with a lot of flexibility. There’s a personal context to my comment.

The social aspect is interesting. My comment is identified by my profile and I’m identified by my comments. Besides, most avatars on Mashable are real photos (or generic avatars), whereas on the xkcd forum, it’s mostly “conceptual” avatars.

What xkcd offers is closer to an “in-group.” The initiated already have their accounts. They’re “in the know.” They have certain habits. Their signatures are recognizable. The comic’s author probably knows the profiles of his “true fans.” These people can quote just about anything that’s ever been posted on the site. Besides, they get all the jokes in the comic; they have the references needed to know what the author is talking about, whether it’s mathematics or science fiction. They’re the first to post comments because they know exactly when a new comic goes up. In fact, checking out an xkcd comic is part of their routine. They crack up at the idea that some people still don’t know that the real xkcd jokes are in the alt-text. They make all sorts of inside jokes and know one another.

In that sense, they form a “community.” It’s an open group, but several processes of exclusion are at work at any given moment. To be accepted in this kind of group, you have to earn your place.

 

Sites that use Disqus have a completely different structure. Anyone can comment on anything, even anonymously. Those who aren’t anonymous use a consolidated profile which says “here’s my social web persona” (if they have several, they present whichever mask they want to present). By posting a comment on Mashable, for example, they don’t really commit themselves. Mostly, they build their identities and gather their ideas on various topics. It still comes close to the notion of self-branding which preoccupies people like Isabelle Lopez, even if reactions against the idea of “branding” run strong in Montreal’s social web sphere (the YulMob). Conversations between users can take place across various sites. “Oh yes, I remember her from that other blog, I already follow her on Twitter…” There’s no specific allegiance to the site.

Of course, there may well be insiders on a particular site. Especially if people start to know one another and reply to each other’s comments. In fact, there may even be a little “cabal” that decides to take over the comments on certain sites. But, unlike on xkcd (or 4chan!), it all happens in broad daylight, out in the open. It’s more “mainstream.”

OK, maybe I’m rambling a bit. But it gets me back in the swing of things before my Yul– and IdentityCamp presentations.


Actively Reading: OLPC Critique

Critical thinking has been on my mind, recently. For one thing, I oriented an  “intro. to sociology” course I teach toward critical skills and methods. To me, it’s a very important part of university education, going much beyond media literacy.
And media literacy is something about which I care a great deal. Seems to me that several journalists have been giving up on trying to help the general population increase and enhance their own media literacy skills. It’s almost as if they were claiming they’re the only ones who can reach a significant level of media literacy. Of course, many of them seem unable to have a critical approach to their own work. I’m with Bourdieu on this one. And I make my problem with journalism known.
As a simple example, I couldn’t help but notice a number of problems with this CBC coverage of a new citizenship guidebook. My approach to this coverage is partly visible in short discussions I’ve had on Aardvark about bylines.
A bit over a week ago, I heard about something interesting related to “making technology work,” on WTP (a technology podcast for PRI/BBC/Discovery The World, a bit like Search Engine from bigger media outlets). It was a special forum discussion related to issues broader than simply finding the right tool for the right task. In fact, it sounded like it could become a broad discussion of issues and challenges going way beyond the troubleshooting/problem-solving approach favoured by some technology enthusiasts. Given my ethnographic background, my interest in geek culture, and my passion for social media, I thought I’d give it a try.
The first thing I noticed was a link to a critique of the OLPC project. I’ve personally been quite critical of that project, writing several blogposts about it. So I had to take a look.
And although I find the critical stance of this piece relatively useful (there was way too much groupthink with the original coverage of the OLPC), I couldn’t help but use my critical sense as I was reading this piece.
Which motivated me to do some Diigo annotations on it. For some reason, there are things that I wanted to highlight which aren’t working and I think I may have lost some annotations in the process. But the following is the result of a relatively simple reading of this piece. True to the draft æsthetics, I made no attempt to be thorough, clean, precise, or clear.
  • appealing
  • World Economic Forum
  • 50 percent of staff were being laid off and a major restructuring was under way
    • The dramatic version which sends the message: OLPC Inc. was in big trouble. (The fact that it’s allegedly a non-profit is relatively irrelevant.)
  • the project seems nearly dead in the water
    • A strong statement. Stronger than all those “beleaguered company” ones made about Apple in the mid-’90s before Jobs went back.
  • And that may be great news for children in the developing world.
    • Tadaa! Here’s the twist! The OLPC is dead, long live the Child!
  • lobbied national governments and international agencies
    • Right. The target was institutional. Kind of strange for a project which was billed as a way to get tools in the hands of individual children. And possibly one of the biggest downfalls of the project.
  • Negroponte and other techno-luminati
    • Oh, snap!
      It could sound relatively harmless an appellation. But the context and the piece’s tone make it sound like a rather deep insult.
  • Innovate
    • Ah, nice! Not “create” or “build.” But “innovate.” Which is something the project has been remarkably good at. It was able to achieve a number of engineering feats. Despite Negroponte’s repeated claims to the contrary, the OLPC project can be conceived as an engineering project. In fact, it’s probably the most efficient way to shed the most positive light on it. As an engineering project, it was rather successful. As an “education project” (as Negroponte kept calling it), it wasn’t that successful. In fact, it may have delayed a number of things which matter in terms of education.
  • take control of their education
    • Self-empowerment, at the individual level. In many ways, it sounds like a very Protestant ideal. And it’s clearly part of the neoliberal agenda (or the neoconservative one, actually). Yet it doesn’t sound strange at all. It sounds naturally good and pure.
  • technology optimists
    • Could be neutral in denotation but does connote a form of idealistic technological determinism.
  • Child
  • school attendance
    • “Children who aren’t in school can’t be learning anything, right?”
  • trending dramatically upward
    • Fascinating choice of words.
  • tens of millions of dollars
  • highly respected center
    • Formulas such as these are often a way to prevent any form of source criticism. Not sure Wikipedians would consider these “peacock terms,” but they don’t clearly represent a “neutral point of view.”
  • they don’t seem to be learning much
    • Nothing which can be measured with our tools, at least. Of course, nothing else matters. But still…
  • international science exam
    • Of course, these tend to be ideally suited for most learning contexts…
  • There’s no question that improving education in the developing world is necessary.
    • Although, there could be a question or two about this. Not politically expedient, perhaps. But still…
  • powerful argument
    • Tools in a rhetorical process.
  • instinctive appeal
    • Even the denotative sense is polarized.
  • precious little evidence
    • Switching to the “studies have shown” mode. In this mode, lack of proof is proof of lack, critical thinking is somewhat discouraged, and figures are significant by themselves.
  • circumstantial evidence
    • The jury isn’t out, on this one.
  • co-founder of J-PA
    • Did Esther co-write the article? Honest question.
  • the technology didn’t work any better than a normal classroom teacher
    • A very specific point. If the goal of tool use is to improve performance over “regular teaching,” it’s a particular view of technology. One which, itself, is going by the wayside. And which has been a large part of the OLPC worldview.
  • the goal is improving education for children in the developing world, there are plenty of better, and cheaper, alternatives.
    • A core belief, orienting the piece. Cost is central. The logic is one of “bang for the buck.”
  • the teachers simply weren’t using the computers
    • We’re touching on something, here. People have to actually use the computers for the “concept” to work. Funny that there’s rarely a lot of discussion on how that works. A specific version of “throwing money at a problem” is to “throw technology at” people.
  • few experimental studies to show a positive impact from the use of computers
    • Is the number of studies going one way or another the main issue, here? Can’t diverse studies look at different things and be understood as a way to describe a more complex reality than “technology is good and/or bad?”
  • substituting computers for teachers
    • Still oriented toward the “time to task” approach. But that’s good enough for cognitive science, which tends to be favourably viewed in educational fields.
  • supplement
    • Kept thinking about the well-known Hawthorne effect. In this case, the very idea that providing students with supplementary “care” can be seen as an obvious approach which is most often discussed in the field instead of at the higher levels of decision-making.
  • The OLPC concept has been pioneered in a number of school districts in the United States over the last decade
    • From a 2005 project targeting “countries with inconsistent power grids,” we get to a relatively long series of initiatives in individual school districts in the USofA since last century. Telescoping geographical and temporal scales. And, more importantly, assigning the exact same “concept” to diverse projects.
  • Negroponte has explicitly derided
    • Not the only thing Negroponte derides. He’s been a professional derider for a while, now.
      Negroponte’s personality is part of the subtext of any OLPC-related piece. It’d be interesting to analyse him in view of the “mercurial CEO” type which fascinates a number of people.
  • It must be said
    • Acknowledging the fact that there is more to the situation than what this piece is pushing.
  • academic
    • In this context, “academic” can have a variety of connotations, many of which are relatively negative.
  • teachers limited access to the computers
    • Typically, teachers have relatively little control in terms of students’ access to computers so it sounds likely that the phrase should have read “had limited access.” But, then again, maybe teachers in Hollow’s research were in fact limiting access to computers, which would be a very interesting point to bring and discuss. In fact, part of what is missing in many of those pieces about technology and learning is what access really implies. Typically, most discussions on the subject have to do with time spent alone with such a tool, hence the “one…per child” part of the OLPC approach. But it’s hard to tell if there has been any thought about the benefits of group access to tools or limited access to such tools.
      To go even further, there’s a broad critique of the OLPC approach, left unaddressed in this piece, about the emphasis on individual ownership of tools. In the US, it’s usually not ok for neighbours to ask to borrow others’ lawnmowers and ladders. It’s unsurprising that pushing individual ownership would seem logical to those who design projects from the US.
  • had not been adequately trained
    • In the OLPC context, this has been made into a case study of the dark side of constructionism. The OLPC project might have been a learning project, but it wasn’t a teaching one. Explicit comments from project members did little to dispel the notion that constructionism is about getting rid of teachers. Even documentation for the OLPC XO contained precious little which could help teachers. Teachers weren’t the target audience. Children and governments were.
  • not silver bullets
    • Acknowledging, in an oblique way, that the situation is more complex.
  • surveys of students
    • With a clear Hawthorne effect.
  • parents rolling their eyes
    • Interesting appeal to parenting experience. Even more than teachers, parents are absent from many of these projects. Not a new pattern. Literacy projects often forget parents and the implications in terms of a generation gap. But what is perhaps more striking is that parents are also invisible in coverage of many of these issues. Unlike “our” children, children in “those poor countries over there” are “ours to care for,” through development projects, adoptions, future immigration, etc.
  • evaluation of an OLPC project in Haiti
    • Sounds more like a pilot project than field research. But maybe it’s more insightful.
  • Repeated calls and e-mails to OLPC and Negroponte seeking comment on OLPC did not receive a response
    • Such statements are “standard procedure” for journalists. But what is striking about this one is where it’s placed in the piece. Not only is it near the end of the argument, but it’s in a series of comments about alternative views on the OLPC project. Whether or not it was done on purpose, the effect we get is that there are two main voices, pro and con. Those on the con side can only have arguments in the same line of thought (about the project’s cost and “efficacy,” with possible comments about management). Those on the pro side are put in a defensive position.
      In such cases, responsiveness is often key. Though Negroponte has been an effective marketer of his pet project, the fact that he explicitly refuses to respond to criticisms and critiques makes for an even more constrained offense/defense game.
  • ironic
    • Strong words, in such a context. Because it’s not the situation which is ironic. It’s a lack of action in a very specific domain.
  • the Third World
    • Interesting that the antiquated “Third World” expression comes up in two contexts: as the alleged target of the OLPC project (with little discussion as to what was meant by that relationship) and as J-PAL’s field of expertise.
  • a leader in
    • Peacock terms? Or is J-PAL on the Miller-McCune lovelist?
  • There are
    • This is where the piece switches. We’re not talking about the OLPC, anymore. We reduce OLPC to a single goal, which has allegedly not been met, and propose that there are better ways to achieve this goal. Easy and efficient technique, but there still seems to be something missing.
  • [G]etting children in developing countries into school and helping them learn more while they are there
    • A more specific goal than it might seem, at first blush.
      For a very simple example: how about homeschooling?
  • proven successful
    • “We have proof!”
  • cheap
    • One might have expected “inexpensive,” here, instead of “cheap.” But, still, the emphasis is on cost.
  • deworming
    • Sounds like a somewhat surprising switch from computer tech to public health.
  • 50 cents per child per year
  • $4 per student per year
  • 30 percent increase in lifetime earnings
  • technology-based approaches to improving student learning in the developing world
    • Coming back to technology, to an extent, but almost in passing. Technology, here, can still be a saviour. The issue would be to find the key technology to solve that one problem (student learning in the developing world, which allegedly calls for improvement). Rather limited in scope, depth, and insight.
  • show more promise than one laptop per child
    • Perhaps the comment most directly related to opinions. “Showing promise” is closer to “instinctive appeal” but, in this case, it’s a positive. We don’t need to apply critical thinking to something which shows promise. It’s undeniably good. Right?
  • the J-PAL co-founder
    • There we are!
  • $2.20
  • Remedial education
  • A study in Kenya
    • Reference needed.
  • it didn’t matter
    • Sounds like a bold statement, as it’s not expressly linked to the scope of the study. It probably did matter. Just not in terms of what was measured. Mattering has to do with significance in general, not just with statistical significance.
  • expensive
    • Cost/benefits are apparently the only two “factors” to consider.
  • quarter of the cost
  • cheaper
  • $2 per month
  • $3 per month