Dispatch from the True North, Strong and Free

OTTAWA – A few words on my travels so far in Canada.

Last week I met some of the players in Montréal’s thriving libre scene: LUGs, commercial free software developers and consultants, non-profits (they call them NGOs), and ad hoc collectives. For example, the city owes much of its wireless Internet to Île Sans Fil (literally, “Island Without Wires”), which works with small businesses and other venues to deploy Linux-equipped Linksys routers.

A few people mentioned to me that in the past few years, these various groups have come together with greater cohesion, working more closely on their common interests. That makes me happy – and it’s needed, if we’re to withstand external threats and advance our common cause. (Of course, that’s part of what Free Culture groups aim to do, on a campus level.)

On Sunday, I attended a presentation by Richard Stallman, president of the Free Software Foundation and founder of the GNU Project. From what I understood of the event (my one-semester background in French means I’m at a bit of a disadvantage), it was very interesting. RMS proposed a 10-year copyright term – certainly a radical change from our current state (and not the only one he proposed, I might add). A part of me wants to cheer at the idea: boy, that’d sure solve a lot of problems! But I also wonder whether it might cause as many, or more, problems than it’d solve.

Copyright, of course, is a slippery beast. In the U.S. tradition, we have the Constitution to guide us: “to promote the progress of science and useful arts”. The trick comes in defining what, exactly, progress looks like. Certainly, open access seems like progress – but what if open access reduces economic incentives to create, or to make art and information widely available? I’m not saying that it does, but it might – some people certainly worry so. How do we know if it does? And if so, to what extent? Should we care?

So we’re faced with a challenge – one that might be as great as the fight to be heard over the clamor of the megacorporations, or to organize and mobilize our allies. That challenge is two-fold: to prioritize the rights that we want to have (for consumers, for technology companies, for students and educators and libraries, for subsequent creators), and to examine the effects of granting those rights – in not just anecdotal or speculative terms, but through empirical historical, economic, and sociological study.

In short: We know that there’s something wrong with copyright. We need to know how to fix it. Proposals like Stallman’s are great for revealing the realm of possibilities, for making us think outside the box, for prompting us to wonder, “What if?” But we should not throw our support behind an idea simply because RMS posits it, any more than we should flatly accept anything Lessig says, or anything Hilary Rosen says.

We can see some threats, and some failings of the current system: thus the concern with Grokster and with orphan works. But we’re making a mistake, and misplacing our priorities, if we let those issues consume all our time without moving into a deeper consideration of copyright. There are certainly no-brainers in the quest to fix copyright, but many of us – probably most of us at FreeCulture.org – feel there’s something more fundamental that’s broken in the current system. This has something to do with the fact that in the history of copyright, rights have grown damn near exclusively in one direction – and their overall growth, over the course of their existence in America, is rather stunning.

At its heart, copyright is a contract between the creator and the user; writ large, between the creative production machine and everybody else, including those who want to build upon past works. The bargaining position of the “everybody else” in that equation has grown in leaps and bounds, in the forms of electricity, mimeographing, home computing – you get the idea – but the social contract of copyright has asked them to give up more and more of the abilities they would otherwise have. We smell a rat, and rightly so.

But smelling a rat and knowing how to get rid of it are two different things entirely – especially when the people doing the rat-smelling care about ideals like democracy, openness, the public interest, participation, and minimizing harm. The process is made even more deliberate when the rat-smellers are often professors, lawyers, students, tinkerers, thinkers: folks who value academic inquiry and the scientific process over knee-jerk reactions and power politics.

If the first stage of the free culture movement (lowercase) was its birth and rapid growth, and the second stage is its networking among various interests and outside the traditional technology sphere, maybe the third will be – or should be – taking a cold, hard look at the subjects we care about, and putting forward real alternatives, based in real study. Take Internet governance: as the ‘Net becomes less dominated by the U.S., and as every doohickey and thingamajig goes online, and as companies want to give their own data priority on their networks, and as hacks and vulnerabilities become multi-million dollar events, we’re looking at some serious questions about the basis of the Internet. Trusted computing and DRM are marching boldly (some might say cowardly) forward. The decision-makers need to have evidence that goes beyond anecdotes and gut reactions. That’s not to say our gut reactions are necessarily wrong; they may be completely right. But if we’re telling the truth when we talk about participation, deliberation, and overcoming the politics of fearmongering and FUD, then it’s time to start number crunching. It’s time to start digging up the research that’s been done and analyzing it, breaking it down, seeing what it really means, how it applies, what it implies, what it assumes, what it leaves unanswered, what needs research, what needs corroboration, what needs clarification. And then it’s time to do it.

Only then can we count the cost. But even knowing the cost, we still have to know our priorities. Okay, so a 10-year copyright term would (probably) have these effects: is it worth it? It may be that we value wildly different things; or consensus may come easily, who knows. But without a bed of empirical research, we cannot expect the American people, let alone the world, to even sit down and chat about what they dream, prize, and fear.

And when we’ve counted the cost, and explored what we value, then we can make some informed decision about whether we want the big car that drives slow or the small car that drives fast. Real change does not happen without something changing: if we patch all the minor bugs in copyright, it will still be an ugly mess, with plenty of questionable “features”. If we’re serious about moving forward to the next version – and I think those who care about the future of expression, technology, information, and media should be – then we need to consider well what the next version should look like.

That means we’ve got to grapple with many different definitions of things like “progress” and “freedom”, and do and read all sorts of research we’d probably prefer not to. But we’ve got to – or else things stay the way they are. If the problems run as deep as many of us suspect, that shouldn’t be an option.


2 Comments

  1. Nelson

    Careful what words you use… there is a group called the “progress and freedom foundation”, but their definition of freedom is control, as far as I can tell. They’re not really our friends, although they refer to “the free culture movement” frequently in order to bash us… so you’re probably going to want to avoid implying any connection between us and them.

    “I told you we should have trademarked the word ‘freedom’!”

  2. The event where Stallman spoke was called Copyright 2005: Copyright and you, and was organized by FACIL, pour l’appropriation collective de l’informatique libre, Koumbit, and the LabCMO. I think they deserve a little credit here ;)
