Monthly Archives: March 2009

Lauren and her Laptop

For the most part I’m not that interested in the ad war between mac and PC. I think the mac ads are cute, mostly because John Hodgman is adorable. There’s lots of talk online right now about this ad, saying that “Lauren” is an actor, she never went into the mac store as she said she did, and the PC she got is a piece of crap, etc. Dishonest marketing? Of course! What marketing isn’t dishonest?

When I first saw the ad I went to see what computer she got, and I saw that it was 8lbs and laughed.

I personally don’t care about the mac/pc war because I think mac will continue to produce good products regardless; they’re making plenty of money to stay in business, and they’re still producing macbooks, which will be my computer of choice for the foreseeable future. I like to love my laptops, and I love using macs. I generally think that mac is good as a niche: they aren’t going to produce crap computers for the cheap audience, because they don’t cater to the cheap audience. I don’t really want to see them change that priority just to grab a greater market share. So as a mac user, I like them having a healthy share of the niche market. Seems perfect to me. So if PC wants to create a persona who “isn’t cool enough to be a mac person”, that’s cool. I mean, if “Lauren” wants to spend 25K on her car but won’t spend more than 1K on a computer, well, maybe she’s really not a mac person.

But in musing about the “regular person” technique, a few things jump out at me. She wants a cheap, 17-inch laptop. Why 17-inch? Clearly not for professional reasons; the 17-inch computer she got doesn’t have the juice to do any video editing or whatnot. For watching movies? It’s funny, because things are getting smaller these days. Most of the students at my campus have laptops, but the ones who got the bigger ones generally don’t want to lug them around. (And Lauren’s laptop is 8lbs…she might as well have gotten a desktop, really, for the amount she’ll be willing to drag it around.) The smaller laptops are getting more popular because of their sheer usability as portable machines. Netbooks are all the rage because of their incredible portability; we’re entering an era where we’re finally savvy enough about our needs to not always get the biggest and best “just in case”.

Maybe that’s why this ad makes me laugh. Lauren wasn’t trying to get the biggest and best, like we used to, trying to make the most of her investment. She just wanted the biggest, for the least amount of money. Why? This request just doesn’t resonate, particularly not in our current computing climate. Big laptops are increasingly a pain in the ass for everyone who owns one. Currently, the only people who appear to really want a big laptop are professionals who have particular kinds of work to do that requires a big screen and a modicum of portability for presentations. I’m a professional who wants lots of screen real estate; I have an external monitor at work on which I extend my desktop. I wouldn’t want a 17-inch laptop. It’s just not practical.

The only laptop I regularly move around these days is my beloved netbook, which gets online and plays all my favourite tv programs for me while I’m on planes, trains and automobiles. I can sit at the bar and check my email on my netbook, and still have room for my dinner and my beer. I get more comments on that netbook than I’ve ever gotten on all of my macs put together. People love the idea of a usable, small, cheap laptop. If you’re a coolhunter, you’re probably looking at small, fast and cheap. You can buy gigs of space on USB drives for peanuts these days; why spend hundreds for a big internal hard drive? Small hard drive, small physical computer, big RAM, bloody great OS (Ubuntu, anyone?) No one’s that excited about a big laptop running Vista, no matter how cheap it is.

Apple is often a bit ahead of its time, sometimes painfully. They got rid of floppy drives well before it was a good idea (even I had to buy an external in the 90s). They took out the phone jack in the last few years too; that’s what pushed me to give my dad my old wireless router so I could still get online when I was visiting. They’re usually on the right track, but they pull the plug on things a tad too early. They keep you slightly uncomfortable with the things they declare as dead. But why is it that microsoft always seems to be, just as painfully, a step behind? Everyone else is talking about cheap, fast and small, and they give us an ad about cheap, slow and huge?

Ada Lovelace Day: Catspaw

It’s Ada Lovelace day, which is the day when we celebrate women in tech! This is an easy one for me: Michelle Levesque, otherwise known as Catspaw.

I first met Catspaw just before she started her first year at the University of Toronto. She was a scrappy teenager, with a history of sneaking into servers and testing their security without actually causing any damage. She learned code by writing it out by hand offline, and then testing it when she could get back online. She “hacked” her way into a private building in order to post up a giant-sized drawing of a stick-figure cat, just to show that hacking isn’t just a skill with code, it’s the skill of quietly finding and exploiting weak spots in security. She is scrappy, strong, intelligent, and incredibly gentle, thoughtful and considerate.

The most wonderful thing about Catsy is the way she thinks about technology. On the side of skills: she’s gifted. But that’s not what makes her special. It’s how she understands the role of technology in light of everything else. It’s the medium, not the message. Her interpersonal skills are excellent; she respects non-tech people, non-programmers, for the skills they bring to the table. She really listens to what people say. She absorbs ideas and really turns them around in her head. She sees all ideas as part and parcel of the project of changing the world. She will never, ever say to anyone: “You just don’t understand the tech.” She will instead improve the way she communicates so that you do. And she will work to make sure the tech understands you.

When she was offered a job at Google before she even finished her undergraduate degree, she wondered if she should even take it. In the same way she thinks about projects and code, she didn’t want to take the best-looking road first, in case there was another, better, more clean and sophisticated route to get there. She was still hacking her way through things, just like always.

What I admire most about Catsy is the way all the parts of her are merged into her work. She is not just a geekgirl. She is not just a programmer, just an engineer. She refuses to put her ideas or herself in a box; there is code in everything, and she won’t ignore the softer side of things because they don’t fit into the strict definition. This is why she’s able to be so much more than the sum of her parts; she merges them, she doesn’t deny them. She won’t fit the stereotype.

Catspaw is at the beginning of her career. I can’t wait to see what more she’s going to do. It will be amazing.

Emerging

So: new job title (“Emerging Technologies Librarian”). Definitely something that I wanted to see happen. I feel like it reflects what I actually do a lot better. I have pangs of regret when I think about instructional technology, but the lines are still blurry. Now I deliberately look at emerging technologies in teaching and learning, or maybe ones that haven’t quite emerged yet. Also emerging technologies as they apply to libraries in general, and our library in particular.

It’s exciting to have a job title that reflects what I’m already doing anyway, but it’s also kind of intimidating. I mean, keeping up with the trends was something I did as a bonus. Suddenly it’s in my job title.

So I was thinking about what trends I’m currently tracking, and I wonder how they fit into the whole “emerging” thing.

Second Life/Virtual Worlds. I’ve been on this one for a while, but I still think it’s emerging. Mostly because I think no one’s popularized the one true way to use virtual worlds in teaching and learning yet. In fact, there are so many wrong ways in practice currently that many people are getting turned off using Second Life in teaching. I’m still interested in it. I’m a builder; I’m interested in what you could build in the environment, and in having students build things too. A giant collaborative place filled with student-created expression of course content would be awesome. So I’m holding on to this one.

Twitter. I can’t believe I’m putting it on the list, but I am. Mostly because I’ve been talking about how great it is at a conference for some time now and I’m starting to see the argument come back to me from much larger places. People complain about what people twitter during events (“Too critical! Too snarky! The audience is the new keynote!”), but that’s pretty much exactly what would make things interesting in a classroom. I want to install the open source version and try it out with a willing instructor. I’m also interested in it for easy website updates, but most people would tell me that that’s a total misuse of the application. (Too bad!)

Ubiquitous Computing. I’ll say that instead of mobile devices. The hardware will come and go, but the concept of ubiquity for computing is fascinating. It’s coming in fits and starts; I want to see how I can push this one in small ways in the library. Computing without the computer. Ideally without a cell phone either. This is something I’m going to track for a good long while. I have this ubiquitous future in my head that seems like a perfect setting for a cyberpunk novel. (I might get around to writing it one of these days.)

Cheap Storage. As a rule hardware isn’t my area, but I’m interested to see what it means that storage capacity is getting so crazily cheap. If I can carry 120 gb in my pocket without even noticing it, what does that mean for computing in general?

Cloud Computing. This goes along with the cheap storage. Jeremy tells me we will never be affected by the cloud because we are a locked down environment for the most part, but I think he might be wrong. Even if we can’t fully employ the cloud because of security and legal limitations, I think the concept of cloud computing will sink into the consciousnesses of our users. We will need to be prepared to offer services as easily as the cloud can.

Netbooks. This fits in with cloud computing and cheap storage; if we can have tiny little computers with us at all times, massive amounts of physical storage and powerful applications coming down from the cloud, what does the world end up looking like?

Social Networks. Embracing the networks you have, on facebook, on IRC, on Twitter, on IM, wherever. Accepting that we are no longer a culture that uses its brain for information storage; we are processors, connectors. We store our knowledge in machines and in our networks. While social software may look like too much fun to be productive, those social networks are what’s going to scaffold us through most of the rest of our lives. Learning how to respectfully and usefully employ our networks as part of our learning (and teaching, for that matter) is an important skill.

There are some other pieces that are just never going to go away: blogging (for librarians!), wikis for everyone, IM: I think we’ve finally reached a point where we can intelligently choose the best tool for the task at hand from an incredible range of options. So I think part of the emerging trend is to use what’s best, not necessarily what’s most powerful, most expensive, or most popular. Things like twitter and netbooks are evidence of that: sometimes you don’t need all the bells and whistles.

So that’s my emerging update of the moment.

Best. Era. Ever.

I was thinking, while reading various articles about twitter, and interactive learning, and participatory culture, and fandoms, that I’m so glad I live when I do. I’m glad I was able to be around to see the birth of things like blogs and virtual worlds and all kinds of interactive applications of the internet. So much is still unformed, undefined; the blessing and curse of the early days of the social internet is that we get to do the defining. We don’t have to buck a trend; we get to try out the new stuff and give it meaning for the wider culture. We get to be as imaginative as we can.

That’s so cool.

Wireless in the Classroom

My campus is planning the construction of a building dedicated to instruction: state-of-the-art classroom technology, lots of computers, a space where a large class can take a monitored online test. There is, I’m told, a debate about whether or not to put wireless access into the building. Many instructors dislike the idea of students being online while they teach; “being online” means “not paying attention”, after all. The internet is fun and games, and learning is meant to be work.

No, that’s harsh, isn’t it?

Being online means chatting with your friends and goofing off. You shouldn’t be chatting with your friends and goofing off while you’re sitting in a lecture. It’s not respectful.

Except: what about people like me, who get so tied up in knots about the subject at hand that I need to spill my ideas out to SOMEone, SOMEwhere, and often use IM to channel my over-enthusiasm? (I think Jason took all my library school classes with me, virtually, through my constant stream of IMs.) What if that “chatting with friends” prevents someone like me from interrupting and turning your lecture into a one-on-one discussion? Or what if the “chatting with friends” helps a student refine her critique? Or keeps her engaged, because otherwise her mind wanders, and reporting what she’s hearing in the classroom to a trusted and interested friend helps her retain the knowledge better?

What if that trip to wikipedia, or google, helps clarify something? What if that internet activity is related to the process of learning the material?

Why does the instructor get to make the decisions about how students are going to learn?

Why are we more interested in optics than in allowing students to be adults and choose their own learning methods?

Why don’t we trust students?

Why do we not make use of the amazing resources available online while we’re teaching? Why not allow students to use virtual reference desks worldwide to get questions answered for the class, or check UN stats, or otherwise contribute meaningfully to the lecture? Why not harness the fact that students like to do something other than sit still in a room for three hours and ask students to go forage for elements that can enrich everyone’s learning experience? Why not be more interactive? Why not share not just expertise but a true love of seeking out information and turning it into knowledge? Why not expect the learning to happen not just after class, but in class as well?

Why not allow students to take notes collaboratively, on a wiki, or with Google notebook, or other, multi-cursor collaborative software?

Why not allow students to twitter their questions and ideas (you can easily monitor that)?

Why not give students a chance to react?

I’d like to throw together a video about why wifi in the classroom is a good thing. If you’ve got an opinion, go below and record me a little video describing your ideas, experience, anything. It doesn’t need to be long. I’ll mash them together into a video and upload them to YouTube. Please help!

Twitter and the Library

My latest all-consuming project is working to redesign/rework/completely renew our library’s website. It’s still early days, but there are certain lessons I’ve learned from my last all-consuming project (introducing courseware to the campus): you can never communicate too much. Even when you think you’re communicating enough, you probably aren’t.

From the worst days to the best days of rolling out software to faculty and students, no one ever accused me of giving them too much information. While the internet is a very social medium, it can also be a very isolating one at the same time. When people are trying to get from point A to point B using some software that you answer for (even if you don’t control it), there’s really no way you can get too far into their personal space. They want to know that you’re there, that you’re anticipating their questions, that you’re aware of the problems they’re encountering. I never, ever want to go into downtime or unexpected system backfires without the ability to send out a message saying, “I feel your pain; here’s what I’m doing to help solve the problem. I’ll keep you in the loop.” It’s a lot easier to cope with problems online when you know someone somewhere is working on it.

And this is primarily where I have a problem with the static library website. The first page always stays the same; it’s generally got all the same information on it. This is good when you’re trying to teach people where to find stuff, if you think of your website as a static structure that should be learned. But it’s terrible if you consider your website your library’s (non-expressive) face.

I think there are two ways to think about a library website: it’s either a published document (heavily planned and edited before it’s published, published, then referred to), or it’s your communication tool. As a communication tool, it’s not published in the same way that books are published. It’s available, it’s public, it’s indexable, but it’s not static, it’s not finished. I kind of wonder if we should get rid of the term “publish” from these kinds of online tools. Sure, you put stuff online and it’s in wet cement (as Larry put it best), ie, likely to be around forever, but our concept of publishing suggests a kind of frozen quality, a finished quality. To me one of the best things about the web is our ability to leave nothing untouched. A communication tool, rather than a published document, should never look the same twice. It should always be telling you something new, informing you, reflecting the real people behind it.

So as we start laying down the foundations for a new library website, I keep thinking of ways to pierce it through with holes through which the real workings of the library, the real voices of the people who work there, can come through. I want students to get a sense that the library isn’t a solid object; it’s a place filled with people, people who work very hard to make things better for them, at that. People working to make sure the collections match the needs of their instructors and their course expectations, helping them with assignments, helping them find the resources they need, helping them use the software they need to use to succeed. I’d like to see if we can use social software to help make that work more transparent to students and faculty alike. Librarians do good work; everyone should see that work.

The first and most obvious way I thought about making sure this transparency and easy communication was possible was through blogs. In my dreamworld, these long thought-pieces about technology and libraries would go on a library blog, not my personal one. But I’m not the only one thinking about things like collections blogs with discipline-specific categories, or reference blogs. Once this information is shared and online in an RSS-able format, we can shoot it in all kinds of useful directions. And then I started thinking about the things students would like to know right now: busted printers, software problems, unavailable computer labs, courseware downtime. How busy the library is. (Ours is more often packed to the gills than not.) The obvious things. We know about them before the students do: isn’t there some quick way we can tell them?
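
To make the “shoot it in all kinds of useful directions” idea concrete, here’s a minimal sketch of re-using a library blog’s RSS feed on another page, written with nothing but Python’s standard library. The feed content, URLs, and function name are invented for illustration; a real library feed would obviously carry its own items.

```python
# Minimal sketch: once library news is published as RSS, any other page
# can pull in the latest items and re-display them.
# SAMPLE_FEED and its URLs are hypothetical.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Library News</title>
    <item><title>Printer on the 2nd floor is down</title>
          <link>http://example.edu/news/1</link></item>
    <item><title>Courseware maintenance tonight, 10pm to midnight</title>
          <link>http://example.edu/news/2</link></item>
  </channel>
</rss>"""

def latest_items(feed_xml, limit=5):
    """Return (title, link) pairs for the most recent feed items."""
    root = ET.fromstring(feed_xml)
    items = root.findall("./channel/item")[:limit]
    return [(i.findtext("title"), i.findtext("link")) for i in items]

for title, link in latest_items(SAMPLE_FEED):
    print(title, "->", link)
```

The same handful of lines could feed a sidebar widget, a digital sign at the reference desk, or an email digest; that’s the point of getting the information into an RSS-able format first.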

So then I got to thinking about twitter. Twitter for immediate messages. It doesn’t take up that much space, embedded on a page. And it keeps everyone to 140 characters. Like facebook status messages, but about the systems you’re trying to use. You can find out if they’re having a bad day or not before even trying to wrestle with them. I like it. Transparency, a little personality, a little humanness, and lots of communication.
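
The 140-character constraint is easy to enforce on our end before a status update ever goes out. A hedged sketch, assuming a hypothetical `[Library]` prefix and invented messages:

```python
# Sketch: composing a library status update that fits in one tweet.
# The prefix and messages are made up for illustration.
MAX_LEN = 140

def compose_status(message, prefix="[Library] "):
    """Prefix a status message and truncate it to tweet length."""
    status = prefix + message
    if len(status) <= MAX_LEN:
        return status
    # Leave one character of room for the ellipsis.
    return status[:MAX_LEN - 1] + "…"

print(compose_status("Courseware is back up. Thanks for your patience!"))
```

The truncation is crude, but that’s in the spirit of the thing: a status message should be short enough that cutting it off mid-sentence almost never happens.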

We’ll see how it goes.

The Value of Networks

To say that networks are important is to state the blindingly obvious. Networks have always been important, from the medieval European confraternity to 20th century golf courses. Now, most people I know go to conferences partly because of the conference program, but mostly because of the extra-conference networking. The conversation that you have between speakers is often more valuable than whatever the speaker is saying. The best thing the speaker can do for you, as a conference attendee, is to provide the common ground, topic, and language to allow a networking conversation to open up around you; even a terrible speaker, one who says things with which everyone in the room vehemently disagrees, can have this effect.

Is this a radical statement? Not to say that there isn’t value in hearing about the status of someone’s research, but that’s what journal articles and even blog posts are for. I don’t go to a conference specifically to hear about that sort of content; there are cheaper means to do so. I go to meet you, to engage with you, and to hear about what others think about what you’re doing while you talk about it. I’m there to meet with others over the common ground of our interest in what you have to say. A gathering of like minds: I’m there to get the whole collection of ideas. This may be why unconferences and camps are gaining popularity; they, at least, are upfront about where the value in a conference lies. Sure, the speakers are important, but so are the conference-goers. Everyone has something to contribute, and there are many, many means of doing so.

I feel like we acknowledge the importance of networking, but do our best to pretend it’s not true at the same time. A very dichotomous relationship to the concept of the social network: networking is everything, but to speak its name is anathema. A colleague of mine at the library tells me that as a Computer Science undergraduate student, the word they used for cheating was “collaborating”. There’s lots being said about networked intelligence, but if someone is looking at facebook or using MSN or AIM at work, they’re being unproductive. (Not to say they’re de facto being productive just because they’re using a social networking tool; that’s unclear without more information.) Networking is supposed to be a quiet activity that you do on your own time. It’s too fun to be work.

I got to thinking about networks a lot lately when I quizzed a bunch of friends on a favourite IRC channel about the differences between various CMS platforms, and then arranged to bring in an old friend to consult with us via AIM. While many people feel they need to hide their networking efforts professionally, I’ve always opted to embrace mine, and I intend to do so even more in future. My network, constructed out of the people I know whose knowledge and experience I trust, is smarter than I am. As with all information, I must evaluate what I get from my network, but I have the context available to do so; my friend at Google knows lots about web search, but not so much about Google docs, and while one of my friends in California is always on to the new thing, his quick dismissal of popular applications means his predictions aren’t necessarily on target. This is a new angle on an old concept. Social networking applications give us the ability to dig for context on our contacts when using our networks to help us form opinions and make decisions: how we know these people, what we know about them, where their experience lies, and who they know can all have an impact on the way we interpret information gleaned from them. You can actually be on facebook and not be wasting your employer’s time! (Who knew!)

It’s a give and take, of course. I’m not just talking about quizzing my networks when I have a question (though I mean that as well). My networks give me things to think about all the time; they shape my thinking, point me in new directions, give me a sense of where things are moving. They show me where the trends are, what I should be paying attention to. The network imparts knowledge not only in the direct sense, but also through ongoing participation. We are a participatory culture online: web 2.0 is pretty well ingrained into us at this point. We talk back. It’s the talkback that turns around and alters my brain chemistry.

I’ve been cultivating my networks for years. Because my personal life and my professional life cross at so many points, it’s serendipity that my social network can be so valuable to me in a professional capacity. One of the most exciting things to discover is that an old connection from another community is bringing a new vision and new interpretation to my wider network.

In short: I’m starting to think seriously that it’s part of my professional responsibility to read twitter, my feed reader, Facebook, etc. in order to be shaped by my network.

Meanwhile, the professional speaker circuit doesn’t like this. Attending a talk, as I’ve outlined, has a two-fold impact: the obvious one, gaining insight from the speaker, and the hidden one, where I am further inspired, provoked, and shaped by my network in light of what the speaker is saying, and my interpretation of it. That’s my true professional development, in the crossroads of all those things.

“In probably 80-90% of most business and conference settings speakers have a message to give – at keynote speeches and large company events – the large audience venues. It is not a groupthink or collaboration.”

If this is what the speakers of the world think, that most of the time we are there only to absorb their message without interpreting it and reinterpreting it on our own, that the bigger the event the more we should shut up and absorb, I’m afraid they’re talking out of both sides of their mouths and supporting an educational system that just doesn’t work.

In this post, Bert Decker suggests that it’s rude to IM during a conference, but presumably it’s not rude to jot down notes. In fact, isn’t jotting down notes the best sign in our culture that you’re paying attention? If you walk into a meeting with no paper to jot down notes on, people tend to presume that you don’t think anything of value is going on. This is considered unprofessional, and you will be pulled aside and given a talking-to about it. Always come prepared; always carry something to write down notes on. That’s how you demonstrate respect! Rudeness is in the eye of the beholder on this one; if people are tweeting during your talk, perhaps you should take it as a compliment. They feel there is something in the talk to record and share with their network.

The way we create and feed our networks is to participate in them. We share our thoughts on the ideas that come to us; we build systems of thought and method based on the interplay between primary, secondary, and tertiary information. The tweets of the guy next to me during a big keynote are my secondary source, the thing that provides more voices and opinions to the information I’m gleaning.

Constant networking is impossible, and it’s important to know when it will help you and when it will distract you. But while most traditional folks like to take notes when they attend keynotes, I like my notes to talk back to me at the same time. If you’re not ready for the rich dialogue this allows me to enter into with you, based not only on my own experience and ideas but also on those of my network, I’m not sure you’re the right person to be giving that keynote in the first place.

My network is valuable; I bet yours is too.

Thick Tweets

Another follow-up to a tweet, posted in response to David Silver’s attempt to use a Geertzian theory on twitter:

http://tinyurl.com/bwxrac bizarre categorization of tweets. With a link, this is “thick”
2:45 PM Feb 25th

I appreciate someone trying to apply thick description to tweets, but I’m not certain David Silver hasn’t missed the mark a bit here.

First: isn’t it frustrating that every time we experiment with web applications, there’s someone somewhere trying to tell us how to do it right? Case in point, back from 2005: “I just spent fifteen minutes clicking through about 20 Xanga sites and I CAN’T FIND ANY BLOGGING GOING ON! Is it me?” (my response). We like these applications to fulfill a pedagogical role, often to improve the profile of the use of the application to other academics and departmental chairs. Current case in point: some researchers/educators using Second Life don’t want to be associated with the “debauchery” of the Second Life Community Conference, and want to break out on their own in order to set the “right” tone.

So now we get to the “right” and “wrong” kinds of tweets. This is a challenging thing, since a tweet is only 140 characters long. Silver encourages students to “pack multiple layers of information within 140 characters or less,” and those layers are defined by links, primarily. Also by shout outs. And mentioning names.

I don’t think thick description is a good way to evaluate a tweet. A tweet’s value isn’t in how much information it’s conveying, it’s in the basic value of the information itself. Personally I quite like funny tweets, regardless of whether they’ve got links in them or not. The context of tweets doesn’t come from the tweet itself; it comes from the environment around the tweet, the news of the day, the location of the user, the user’s other tweets, the user’s blog, flickr stream, employment history, and the live events the user is attending. Tweets are ultimately snippets that don’t necessarily make sense in isolation. I’d suggest that to evaluate them individually is to miss a great deal of their “thickness”.

Some of my favourite tweets:

“Great design comes from an artistic or cultural impulse, not a focus group.”
11:06 PM Jan 24th from web cloudforest

Is there anything more newbish than using Internet Explorer? Question still valid after 12+ years of asking it.
2:31 PM Feb 27th from TweetDeck, BeCircle

Overheard this week: “Lent? That’s the musical with all the AIDS, right?”
3:58 PM Feb 27th from TweetDeck, RJToronto

Still ignoring Facebook 25 things requests, but copying my wife’s idea: I’ll gladly go for coffee/beer and answer them in person instead.
4:03 AM Feb 27th from web, oninformation

These tweets don’t really fulfill Silver’s “thick” requirements, but I find them interesting and valuable. They give me things to think about. How do you quantify the pithiness of 140 characters?