OVER MANY YEARS I have built up a practice of taking audio recordings of talks and discussions at meetings, then editing and processing them into MP3 files that can be placed on the Web for streaming or downloading. It has always seemed a shame to me that so many organisations attract excellent speakers, whose lectures are heard by a few score people at most, when it is so easy to use a little technology and skill to keep their words ‘in orbit round the globe’ for years and years after.

As a member of the Specialist Groups community of the British Computer Society, much of my recording work has been done for BCS groups and branches. I also regularly record at meetings of the International Society for Knowledge Organization (UK chapter), and in 2009 we captured in this way more than half the presentations at the first two-day ISKO-UK international conference.

One of my most recent audio-recording projects, which came out to the great satisfaction of all concerned, was a highly dynamic presentation on ‘Conquering the Imposter Syndrome’, given by executive coach Deena Gornick to a joint meeting of BCSWomen and Women in Technology. I edited together the choice bits of presentation with some interview segments between Deena and Women in Technology’s Maggie Berry, and you can now listen to the result from the WiT Web site. Incidentally, my ability to make this recording relied heavily on the technical capabilities of my new Marantz digital recorder, which is what this blog post is really about!

I also create explanatory and documentary videos, and a good quality audio recorder is very useful in preparing the narration track. Somewhat similar is a ‘slidecast’, Slideshare.net’s term for an audio narrative synchronised to a set of slides on their online hosting service. I wrote about my experience of making one of these a while back on this blog.

From tape and disk to solid-state

Olympus DS-50 audio recorder

I’ve used a variety of tape- and disk-based audio recording devices in the last 15 years, including four MiniDisc machines, but in the last three years I have switched to devices with solid-state memory for storage. To transfer a 45-minute recording from MiniDisc or DAT tape to a computer for editing takes 45 minutes; but with these new-style devices, the recording is laid down in memory as a WAV or WMA or MP3 file, and is transferred to the computer in seconds, which is already an advantage.

I wrote about my ‘slidecasting’ experiment at a time when my main system was a well-designed but bulky and increasingly unreliable HHB MiniDisc recorder; my solid-state device, which I actually used for the slidecast, was a little handheld Olympus DS-50 voice recorder (a superior dictaphone, if you like).

HHB MiniDisc recorder with Røde studio mic in foreground and Sennheiser shotgun mic below.

The advantage of the HHB (when it worked) was its ability to take input from two independent external microphones, providing them with ‘phantom power’ if the mic technology needed it. As for the Olympus DS-50, it is a remarkably effective device for its size, and I always carry it about with me; but it can use only its own microphones, and because its internal memory is fixed at one gigabyte, it has to make use of a compressed audio data format (WMA) internally.

How things change! Recently, flash memory has become much cheaper, more capacious, and capable of faster data transfers. Compact digital cameras and camcorders now typically use one or other flavour of the Secure Digital memory card (‘SD card’) for storage, and today there is a crop of audio recorders which take the high-capacity version (SDHC). That is also true of the Olympus successors to my DS-50, such as the LS-10 which has a full-size SDHC slot. (If you fill up a card while on an assignment, you can always pop in another.)

Because SDHC storage is so capacious and speedy, many digital audio recorders now record uncompressed audio with a fidelity which in the past was associated only with DAT tape or hard disk recorders. The Olympus LS-10, for example, can record 24-bit WAV stereo files with a sampling frequency of 96 kHz.
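
It is worth pausing on what those figures mean for storage. Here is a quick back-of-envelope calculation of my own (not a manufacturer’s figure):

```python
# Uncompressed WAV data rate at the LS-10's top quality setting
sample_rate = 96_000   # samples per second
bit_depth = 24         # bits per sample
channels = 2           # stereo

bytes_per_second = sample_rate * bit_depth * channels / 8
print(f"{bytes_per_second / 1e6:.2f} MB/s")                # ~0.58 MB/s
print(f"{bytes_per_second * 3600 / 1e9:.2f} GB per hour")  # ~2.07 GB per hour
```

So a two-hour meeting at full quality fills roughly four gigabytes: exactly the territory where capacious SDHC cards earn their keep.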

A year with Zoom

The Zoom H4 is a popular digital recorder amongst musicians

In January 2009, I was on the look-out for a solid-state digital recorder which would be able to ‘inherit’ the full set of microphones from my now-abandoned HHB MiniDisc machine. I have two Røde large-diaphragm studio condenser microphones, which deliver outstanding clarity when properly mounted and positioned, plus a Sennheiser ‘short shotgun’ microphone which is extremely directional, and is my typical choice for video work. I also have a Sennheiser Evolution e85 handheld dynamic mic which is more suited to reportage situations.

I took advice from a Soho shop for musicians, which in retrospect was a mistake. They convinced me that the Zoom H4 recorder was a world-beating favourite. I bought one (but not from them; Studiospares is always cheaper!) and started to experiment.

During 2009 I made many fine recordings with the Zoom H4: for example, the ISKO-UK conference series referred to above. I have never used the Zoom’s own built-in microphones, always plugging my own mics in via XLR shielded cabling. (The XLR cable format has a third pin to carry phantom power out to the mic, and the shielding prevents picking up radio interference from the local taxi company or, as is more likely these days, someone’s mobile phone hunting for a signal.)

Zoom H4 shortcomings

It wasn’t long before I discovered four annoying shortcomings of the Zoom H4:

  • Feeble pre-amps. This is why asking a music shop for advice was such a bad idea. The Zoom H4 is a great favourite with bands who want a convenient small device with which to record their practice sessions, and those guys are LOUD. They don’t need the microphone signal to be boosted much, whereas someone recording meetings really needs substantially greater pre-amplification of the incoming signal.
  • Greedy for power. The Zoom H4 can be powered from a pair of AA batteries, but it drains them very quickly, especially if you use NiMH rechargeables. That might be OK for recording four-minute songs, but not for two-hour meetings.
  • Dreadful interface. I have used the Zoom H4 in lectures as an example of the difference between ‘functionality’ and ‘usability’. Its fiddly switches and jog dial, and the tiny LCD display with anaemic backlighting, make it a real pain for an older and somewhat long-sighted person like me to work with.
  • Not built for ‘the field’. There’s no way to attach a carrying strap to the Zoom H4, it doesn’t sit securely on a table, and the build is rather fragile.

Using an Audio Buddy pre-amplifier to compensate for the poor pre-amps in the Zoom H4 itself

I was able to solve the first two problems (for example when recording for ISKO) by plugging my microphones into a separate portable ‘Audio Buddy’ pre-amplifier, and powering each of these devices from their power packs, plugged into mains power. Then I would take line-level input from the Audio Buddy into the audio recorder.

This set-up has the added benefit that the Audio Buddy has a separate, easily adjusted gain control knob for each input channel. My ‘ideal’ mic set-up for a lecture is to put a quality Røde condenser mic near the speaker, on the ‘main’ channel, and keep the short shotgun mic beside me (suspended in a Rycote hand-held shockmount) on the second channel. I keep the gain on the secondary channel turned down until I need the shotgun mic to pick up questions from the floor, or to reinforce the primary audio if the speaker wanders away from the microphone.

So, I had a workable solution. But it was annoying to find that the Zoom H4 was, in effect, only usable when I had access to mains power, and it also necessitated dragging extra gear along to each meeting where I was invited to record.

Twice or thrice as nice: Marantz PMD661

These experiences prompted me to do my research more carefully this time, and to give particular credence to the opinions of field recordists and radio journalists. My choice at the beginning of this year was a Marantz PMD661 recorder, and my experiences with this machine have all been positive so far.

Marantz PMD661, suspended from my belt with an Arno strap. The cable from my dynamic interview mic is plugged into the ‘bottom’ of the device, and I am adjusting the gain knob.

Compared to the Zoom, the Marantz is more than twice as expensive, twice the volume, and perhaps three times the weight; but even so, that’s quite compact, smaller and much lighter than a digital SLR camera. The Marantz is substantially built, and the control surfaces are laid out ergonomically for two main modes of use: horizontally, with the device set on a table, or suspended vertically on a shoulder-strap through the strong metal connection loops. Latterly I’ve been keeping an Arno buckle-strap through one of the loops, and clipping that to a carabiner on my belt.

On the broader control surface (the ‘top’ in horizontal use mode), there is a large legible OLED multifunction display, and coloured LEDs signal the recording status and the sound levels of the two recording channels: all very useful when recording in low light levels. The Record, Record-Pause and Stop buttons are all comfortably thumb-sized. The narrower control surface, which is the ‘top’ when you have the Marantz at your side, carries the quarter-inch jack socket for headphone monitoring, the headphone volume control, and concentric recording-level adjustment dials for the two channels. You can operate these by feel alone.

Sound in and out

Whichever way you use the Marantz PMD661, when its interface is oriented towards you, the XLR sockets are at the side facing away from you. I can understand why the Zoom is configured the other way round, but it doesn’t make for convenience in the field.

In addition to the XLR inputs, which can be set as ‘line’ or ‘mic’ or ‘mic with phantom power’, there is also a competent pair of inbuilt mics, a ‘digital in’ port (electrical S/PDIF), and a secondary ‘line-in’ port for a 3.5 mm stereo microjack. I use this last when recording from my DAB radio.

And here is the all-important news: the microphone pre-amps in the Marantz are excellent! They are so powerful that I have been able to bring my voice-coil dynamic interview mic back into use. I’m also able to use my favourite Beyer DT-100 monitoring headphones: the Marantz headphone amp is strong enough to drive their 100-ohm impedance.

Unlike the Zoom H4, the Marantz PMD661 can output through an inbuilt stereo pair of speakers. They’re not wonderful, but if you want to review an interview with your interviewee, they’re dead handy.

Power with intelligence

As for the power situation, the Marantz takes four AA cells, either alkaline or NiMH (nickel-metal hydride). I use the new-style low self-discharge NiMH type, rated at 2300 mAh, and I can get more than two hours’ continuous recording (with phantom power) out of them. With alkalines, I’m good to go for over five hours free of mains power.

In the settings menus, you can tell the Marantz which of these power options you are using – a feature I’ve not seen on any other electronic device. The importance of this is that NiMH cells start delivering power at 1.4V, but rapidly drop to 1.25V at 10% discharge, then maintain that voltage for most of the discharge period. Many devices (e.g. my DAB radio) interpret this low-voltage reading from NiMH cells as a ‘these batteries are flat’ signal, which ain’t true. Good thinking, Marantz!
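
A sketch of the idea, with illustrative per-cell voltages (typical NiMH and alkaline figures, not Marantz specifications):

```python
# Why telling the device its battery chemistry matters: the sensible
# "flat battery" threshold per cell differs between the two types.
CUTOFF_VOLTS = {
    "alkaline": 1.0,  # alkalines decline steadily from ~1.5 V
    "nimh": 1.1,      # NiMH holds ~1.25 V for most of its discharge
}

def battery_flat(volts_per_cell: float, chemistry: str) -> bool:
    """Treat the cell as exhausted only below its chemistry's cutoff."""
    return volts_per_cell < CUTOFF_VOLTS[chemistry]

print(battery_flat(1.25, "nimh"))  # False: a healthy NiMH cell mid-discharge,
                                   # which a meter calibrated for alkaline's
                                   # steady decline would read as nearly spent
```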

To record a conference ‘properly’, I’d still lug my Røde mic and mic stands, use mains power and even include the Audio Buddy in the loop; but now with the Marantz I have the additional capability (a) to ‘go walkabout’ at an event, which proved necessary for the Deena Gornick recording and (b) to keep a lightweight recording kit on me at all times, such as illustrated in the photos, in case something comes my way ‘serendipitously’.

Two takes on time

Finally, I must touch on one very peculiar ‘feature’ of the Zoom H4. On one or two occasions I have had a need to record the video of an event, but also capture the audio on a separate audio recorder (for example, one occasion when my filming position was at the very back of a lecture theatre and I wasn’t able to run a long cable from there to where I wanted the microphone to be).

When I tried that with the Zoom H4 as my audio recorder, I found that its ‘sense of time’ drifted away from that of the video camera, making it impossible to synchronise the two recordings. Checking on the Web, I find that this is a well-known and much criticised failing of the Zoom H4. Incidentally there is now a new Zoom recorder, the H4n, which has a vastly improved interface and control surface; but whether this particular glitch is fixed or not, I don’t know.
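
To see why even a tiny clock error matters, here is a rough illustration with purely hypothetical figures of my own:

```python
# Clock drift between two recorders, and the usual remedy: resampling.
ppm_error = 500                 # hypothetical 0.05% sample-clock inaccuracy
seconds = 60 * 60               # a one-hour lecture
drift = seconds * ppm_error / 1_000_000
print(f"Drift after one hour: {drift:.1f} s")   # 1.8 s: hopelessly out of lip-sync

# Fix: measure both clip lengths, then resample the audio by their ratio
video_len, audio_len = 3600.0, 3601.8           # measured durations in seconds
print(f"Resample audio by {video_len / audio_len:.6f}")
```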

Three days ago I was filming at a meeting in a London hotel, and for a slightly different reason I was again recording the audio separately, this time to the Marantz. I’m pleased to say that I was able to synchronise the audio to the video perfectly, using the Marantz audio and throwing away the camera audio. Another thumbs-up for the PMD661!

At a recent social media networking meeting I did hear accusations of unreliability directed against the Marantz. Maybe I’ve just been lucky. Or maybe the critic was referring to the predecessor, the PMD660, which did draw a lot of flak, for example for the ‘noisiness’ of its pre-amp circuits. The PMD661 is a substantially redesigned machine drawing on the lessons learned from its predecessor, which I guess reinforces the message that you are generally better off not buying Version One of anything!

The graphic element of the BarcampAfrica UK logo

BarcampAfrica UK is a gathering of technologically inspired Africans and their friends, to be held in London on 7th November. Barcamps are a form of ‘unconference’, events in which the overall focus is declared, but a detailed agenda is only worked out on the day, and people are encouraged to come along prepared to take part in running a workshop. I’m lending my support to the organising team (thus far my main contribution has been the logo design), and I plan to facilitate a workshop on the day.

The first Barcamp Africa was held last year in northern California, the brainchild of Ellen Petry Leanse and Kaushal Jhalla. There have been several since, one recently in Cameroon. Much of the impetus for the UK barcamp has come from young Africans in the BCS. The overall focus of the event will be on what technology (especially computing and communications technologies) can contribute to positive human, economic and environmental development in the African continent.

My Barcamp focus will be on information sharing/publishing. Information design, information sharing and electronic publishing are my personal areas of interest and knowledge. The workshop I intend to run at BarcampAfrica UK will focus on how to make it easier for African institutions to publish and disseminate useful information.

The high cost of publishing software

Ghanaian high school students in the National Museum, Accra. Photo Conrad Taylor

A few years ago, I spent a couple of weeks in Ghana. I went over with Ghanaian friends, but spent time wandering around Accra on my own. One of my visits was to Ghana’s National Museum, a small circular building that is not well known even to Ghanaians. If you want to get there by taxi, ask to be taken to ‘Edvy’s restaurant’: located within the museum grounds, it is better known than the museum itself!

While browsing the exhibitions, I got talking to the museum’s curator, Joseph Gazari Seini, about methods for captioning exhibits; most of theirs were either typed or hand-written. I explained the kind of work I do, and he invited me to his office.

Pulling out a folder, he showed me the text for a book, a report of some important archaeological work. It had been written on a computer, and the print-out had been made on an inkjet printer. The folder also contained a collection of colour photographic prints showing aspects of the excavation and the objects found. Joe Gazari explained to me that they would love to get materials like this published, but they didn’t know how to go about it, and they hadn’t the means.

I wondered if there was some way I could help, and later that week I had a meeting with Joe’s boss. I’m afraid it ended in misunderstanding; in retrospect I think they thought I was some rich oburoni offering to finance the whole venture. But what I did do when I got back to England was contact Adobe Systems Inc. Their software — FrameMaker or InDesign, Photoshop and Illustrator — would be excellent tools for book-publishing.

I understood that Adobe had some history of charitable giving; would they donate software to this cause? No. They do support projects, but only in their back yard, for example in San Jose, California. And the amount of money it would cost the museum in Accra to buy legitimate copies of professional publishing software would be way beyond their budgets.

Is free, open-source software the answer?

So this is one of the barriers: the cost of decent software. What about free, open source software? That is certainly a route worth exploring. There is OpenOffice Writer of course, but a word-processing program doesn’t have all the features you would expect of proper publishing software, such as flexible layouts, extensive support for graphics formats, and CMYK colour for commercial printing.

There is also the Scribus desktop publishing program. Last time I tried it, it was awful, but that was two years ago and maybe version 1.3.3.x is OK. I have to say that the page about installing on Mac OSX doesn’t inspire much confidence…

Another possibility would be to use the venerable typesetting language TeX, invented by Donald Knuth at the end of the 1970s — free implementations are available for a variety of computer operating systems. On the down side, typesetting in TeX is not a WYSIWYG experience: you prepare a file using a text editor, embedding formatting codes, and feed the result to the TeX engine, which does a batch process on the file. The results can be of very good quality and the method has great promise for textbooks, but it is difficult, if not impossible, to lay out publications in which there is an intimate relationship between text layout and graphics.

Fonts for African languages

Another issue that is a barrier for publishing in some of the African vernacular languages is that they require characters that cannot be found in the standard fonts. In early 2000, I conducted a personal investigation of this issue, writing a 55-page illustrated report called Typesetting African Languages, which I have now parked at The Internet Archive as a PDF. (Warning – it is now out of date, and may contain errors.)

In my report, I divided the languages into five grades of difficulty in respect of typesetting them:

  1. Languages which use the same characters as English, without any diacritics — examples include Swahili, Somali, Zulu, Xhosa. These languages can even be processed using 7-bit ASCII or sent as telex without orthographic compromise!
  2. Languages which use accents but in a way that is similar to Western European languages like French or Portuguese, so can be typeset using standard computer fonts. Examples include Kikuyu and Tswana.
  3. Languages which use ‘ordinary’ letterforms but in unusual combinations, such as a dot under a vowel, or an accent over a consonant. These could be typeset with systems like TeX that build accented characters from components. Examples include the Nigerian languages Igbo, Edo and Yorùbá, the Zambian language Nyanja, and Wolof (Senegal).
  4. Languages which have additional letterforms that are not found in standard fonts — for example Fulfulde, Krio, Twi and Hausa.
  5. Languages which are written with non-Latin script systems. Apart from the Arabic script, there is also the Tifinagh script of the Berber people, and the Ge’ez script of Ethiopia and Eritrea.
A passage from a text in Yorùbá, typeset using MacTeX and Computer Modern fonts. Click to enlarge.

At the time that I wrote my study, the options for Level Three and Level Four languages looked quite problematic. I did have a go at typesetting a passage of Yorùbá using the MacTeX system and the Computer Modern type family. In this system, the accents and letters are composed by TeX macros, rather than stored as precomposed letterforms, so with a bit of hacking I was able to produce the result illustrated here.
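
For readers curious what ‘composing from components’ looks like in practice, here is a minimal sketch in LaTeX, which inherits TeX’s accent primitives (an illustration of the general approach, not the exact macros behind the sample shown):

```latex
% \d{} places a dot below its argument; \' and \` place tone accents above.
% Nesting them builds the sub-dotted, tone-marked vowels Yorùbá needs,
% though the spacing often wants manual adjustment.
\documentclass{article}
\begin{document}
\d{e} \d{o}          % e and o with sub-dots
\'{\d{e}} \`{\d{o}}  % the same vowels carrying high and low tone marks
\end{document}
```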

But a lot has happened since. For a start, more publishing software now encodes text in the Unicode system, which allocates a unique code point to every character in the world’s writing systems. There are now fonts available in the extended TrueType and OpenType formats – fonts like Arial Unicode, Lucida Sans Unicode and Adobe’s ‘Pro’ font families – which contain hundreds or even thousands of characters, though usually not the ones which are needed for the African languages I have mentioned.
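
A quick way to see what Unicode provides is to look up a few of the ‘extra’ letterforms mentioned above (these are the standard Unicode assignments):

```python
import unicodedata

# Open vowels used in Twi, sub-dotted vowels used in Yorùbá,
# and hooked consonants used in Hausa and Fulfulde
for ch in "ɛɔẹọɓɗƙ":
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}")
```

A font may or may not contain a glyph for each of these, but at least the identity of the character is no longer in doubt.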

The growth in the number of ‘large glyph set’ fonts has in turn led to the development of mechanisms which make it easier to insert characters for which there is no keyboard keystroke. Good examples are the Glyph Palette in Adobe InDesign and the Character Palette system utility in Mac OSX.

A Twi (Akan) proverb typeset in two free-of-charge fonts, Gentium and SIL Charis

What Africa really needs is fonts which contain the glyphs for African languages – and which are free of charge! Fortunately this is now beginning to happen. Gentium, a beautiful font, was designed by Victor Gaultney, originally I believe as part of a research project at the University of Reading. Victor started the drawings for Gentium (the name means ‘of the nations’ in Latin) in the same year I was conducting my informal research. It now encompasses full extended Latin, Greek and Cyrillic alphabets, is hosted by SIL International, and is available at no charge under the SIL Open Font License, which allows modification and redistribution.

SIL International is a Christian organisation with a 75-year history of documenting and supporting the world’s less well known languages. Originally a summer school (the Summer Institute of Linguistics), SIL has grown into an organisation with knowledge of 2,550 languages. It publishes The Ethnologue and has also created a number of fonts that support African languages well. Charis is a family of four fonts – roman, italic, bold and bold italic. It is based on Bitstream Charter, originally designed by Matthew Carter, which in 1992 was donated by the Bitstream foundry to the X Consortium.

It’s a good start, and there are other initiatives along these lines of which I am aware. A friend of mine, Dave Crossland, recently obtained his Masters at the University of Reading, in the course of which he created a font which he intends to release using an Open Font license. The other day, Dave showed me the font running on his Google Android ’phone, and very crisp and legible it looked too. Although Dave’s font doesn’t have the character support for, say, Twi and Yorùbá, the licensing provisions he has chosen make it possible for other type-wranglers to take the baton and run the next mile.

Even TeX doesn’t stand still. In the course of preparing this blog post I found a reference to a new TeX engine – XeTeX – which supports Unicode, OpenType and Apple Advanced Typography. It too is being looked after by SIL – see here!
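
As a taste of the XeTeX route, here is a minimal sketch of my own (it assumes the Charis SIL font is installed, and is compiled with xelatex):

```latex
% fontspec lets XeTeX load any installed OpenType font by name,
% and the source file is plain Unicode text -- no macro gymnastics.
\documentclass{article}
\usepackage{fontspec}
\setmainfont{Charis SIL}
\begin{document}
ɛ ɔ ẹ ọ ɓ ɗ ƙ  % African-language letterforms typed directly into the source
\end{document}
```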

THIS WEEK, the British Computer Society launched its new identity, as part of what it calls the ‘transformation’ programme. In fact it now wants to be called only ‘the BCS’, along the lines of BAA and HSBC. But I predict that those of us who have been BCS members for some time will find it hard to drop the habitual mention of British, or Computer, or for that matter Society (BCS now wants to be called a ‘Chartered Institute’ instead).

On Wednesday, 23 September, the BCS held a Member Groups Convention at the Royal Society’s building in Carlton Terrace. The BCS is home to scores of local branches, some International Sections, plus more than fifty Specialist Groups. The ‘SGs’ are volunteer-run societies, embedded within the BCS, which cater to special interests such as Fortran, computer history and conservation, project management, health informatics, security, interaction design, information retrieval, artificial intelligence… the list is a long one. In addition, there is a very active Young Professionals Group.

The activities of the member groups form a vital part of the life of the BCS, and the Convention was held so that the elected and employed officers of the Society, I mean the Institute, could tell a collection of the organisers and activists from those member groups about the BCS’s new orientation and aspirations for the future. It also gave an opportunity for those of us who work as volunteers within the BCS member groups to think about how we can contribute to these goals.

Adrian Walmsley, BCS vice-president for member services, stands beside one of the two London cabs sporting BCS marketing messages.

The new visual identity of the BCS is a strong green colour, suffused with yellow glowing bits, and a half-shield logo device — as illustrated on one of two taxis currently carrying the BCS’s advertising messages on the streets of London. (Appropriately, the taxi shown has WiFi inside and a satellite data uplink!) There is also a new slogan to accompany the logo — Enabling the information society — and it is this new mission statement that I want to comment on here.

Of information and computers

As Alan Turing predicted, the conversion of data and information into numbers allowed glorified electronic adding machines to evolve into general-purpose information appliances. The Lyons catering company took a revolutionary step when it commissioned the creators of the EDSAC machine at Cambridge University to design a general-purpose business computer, which made its début in 1951 as the Lyons Electronic Office, or LEO computer. This was pressed into service collating the requirements of Lyons’ nationwide chain of tea-rooms, managing the inventory, and controlling the production and dispatch of baked products to the tea-shops.

Today, only the smallest enterprises are run without computers, and the management of data and information using computers is central to the operations of government and public services too. But computing is also essential to scientific research and technology development — and has itself become the object of scientific research and technology development.

I have no idea just when the term ‘Information Technology’ or ‘IT’ started to be used as a synonym for computing, but I suspect the reason was to ‘big up’ the image of computing and its practitioners to business bosses and politicians. Personally I do not like the term. Firstly, because there are many other uses for computers apart from in managing information, and calling all computing ‘IT’ ignores that. Secondly, many non-computing technologies have been employed through the centuries in connection with information, for example in the writing and printing of books.

Hence my reservations about the BCS’s move to suppress pronunciation of the C-word, while describing itself as the Chartered Institute for IT.

The information society

On the other hand, I feel very positive about the adoption by the BCS of ‘Enabling the information society’ as a statement of its mission, which I find at once humble and ambitious.

Why humble? Because the slogan does not claim that computer people are the only heroes of the information society. There are, after all, other people who know a thing or two about how to deal with information: researchers, teachers, writers and editors, publishers, library and information science people.

Consider my craft practice as a writer, information designer, illustrator, photographer, typographer and media producer. I used to practise all these crafts without a computer, but without a doubt the Apple Macintosh, the PostScript language, digital cameras, all that lovely software, and of course the Internet, have been very enabling indeed.

The computer and communications industries, now thoroughly converged, are providing the world with the infrastructure and tools which information and media workers like me rely on to create, manage and disseminate information products, and various cultural creations too. And it’s not only about production: in the last two decades, what I prefer to call CCTs — computing and communications technologies — have also become the means by which ordinary people access information, enjoy cultural media creations like films and music, and communicate with each other. Thus CCTs, and the people who know how to create them and make them work, are essential enablers to all these practices.

It’s true, ‘information society’ is more of a sloganeering term than a clear definition. There is some reasonably authoritative precedent for it, however. In 2003 in Geneva and 2005 in Tunis, the United Nations held a two-part conference called WSIS, the World Summit on the Information Society. It was managed by UNESCO and the ITU, and I’m afraid the British Government didn’t take it very seriously, but at least I can say that the BCS Specialist Group community did.

Led by John Lindsay, then chair of the Developing Countries Specialist Group, a group of us held a workshop in January 2003 to consider the WSIS challenge and themes, and what might be the response of professional societies working within the information sphere. (Full account of the meeting as PDF, available at the Internet Archive.)

Certainly on the UNESCO side of the WSIS organisation, the concept of the Information Society carried with it a commitment to promoting wider access to the technical means of accessing and sharing information. In preparatory papers for WSIS, UNESCO also identified the need to promote what they called ‘information literacy’, the skills people need to be able to find the information they need, evaluate its worth and make critical use of it. And information literacy was the topic we focussed on in the January 2003 BCS-DCSG workshop.

The BCS as enabler

In explaining the importance of the new mission statement, BCS President Elisabeth Sparrow spoke to the Member Groups Convention of several ways in which the BCS can work towards enabling the information society.

For a start, society needs information-oriented computer systems that are fit for purpose, and one of the BCS’s principal aims is to promote professionalism among computing practitioners. The BCS has also committed itself to bridging the gap between education, practice and research. I would expect the member groups to contribute a lot to this bridging process, because they already provide a focus for collaboration between academics and practitioners.

The BCS is a charity and has a Royal Charter which commits it to working for the public good. One of the commitments described in the transformation launch document (also entitled Enabling the information society) is to ‘Informing public policy on how IT can contribute to society’ and another is to ‘Ensuring everyone benefits from IT’.

To be honest, in recent years the BCS and its member groups have had a difficult relationship; the groups have felt under-appreciated and over-controlled, and there is suspicion that there are some BCS staff who wish the groups would just go away and let the staff get on with running the whole show. But the mood at this week’s Member Groups Convention was much more positive, and with this new expression of the BCS’s role as helping to enable the information society, we have an agenda with which the member groups can engage with enthusiasm, and to which we have a lot to contribute.

This I think is particularly true for the Specialist Groups, many of which are engaged with external ‘communities of use’ for computer and communications technologies and information-handling systems (health informatics is a good example, and so is electronic publishing). In a way we act as a natural interface between the BCS and wider communities of information users.

One informal project that I have been involved in running is the discussion community KIDMM (on Knowledge, Information, Data and Metadata Management), which started as a collaboration between members of a dozen or so BCS SGs, held a founding workshop in March 2006 and now, three and a half years on, has held three other events and has eighty members on its discussion list. Most are BCS members, but many have joined us from other communities equally committed to enabling the information society.

Dave Snowden at the Gurteen Knowledge Cafe at the BT Tower in London: 17 February 2009. Photo Conrad Taylor.

33 YEARS AGO, I lived in a commune at Wick Court, a Jacobean manor house situated between Bath and Bristol, which was then the head office and conference centre of the Student Christian Movement. Another of the 17 or so communards was the SCM’s Chairman, Dave Snowden, a spare, bearded Welsh Marxist Catholic philosophy and physics graduate. Taking no prisoners in argument, and delighting in winding people up, Dave gave the impression of having slipped from another historical epoch, though whether it was from some future time, 1920s Russia or perhaps the Counter-Reformation was less easy to judge.

Our paths separated after Wick Court. Last summer I was again in the Bristol region, staying with my friend Bob Bater, who works in the area of knowledge management — a field I have strayed into from my lifelong interest in information design and communication. Bob told me about the work of a company called Cognitive Edge and its founder Dave Snowden, a description of whom sounded remarkably familiar. So when I had the opportunity to hear a lecture by Dave Snowden at a Gurteen Knowledge Cafe event at the BT Tower in London, I made an effort to go along.

Quite apart from the joy of reunion (and some grand views of London from the revolving restaurant at the top of the tower), it was also fascinating to discover how Dave’s thinking and mine have been oscillating around some of the same attractors over the years. Mind you, he has made a great deal more use of them than I have, through his many consultancies with major enterprises, and even state security organisations.

The topic for the day’s Cafe session was How can we best keep employees engaged in their work, in the current economic climate? But Dave’s talk addressed a much wider set of issues around what organisations should do to adapt to a complex and crisis-ridden world — one in which the ‘lessons’ of the past are no longer a guide for future action, and ‘management by objectives’ doesn’t equip us with the agility and intelligence to respond to emergent threats and opportunities.

Paradigm shifts in management thinking

The word paradigm has many sloppy uses, but Dave deploys it strictly to mean a theoretical framework: very close to the use established by Thomas Kuhn in his book on the sociology of scientific knowledge, The Structure of Scientific Revolutions. For a period of time, a paradigm dominates because it seems best to organise our thinking about the world; but then, a new approach that flies in the face of received wisdom and which is initially ridiculed gets taken up; and at a certain point, a ‘paradigm shift’ occurs and the former heresy becomes the dominant idea. The revolutions associated with Copernicus and Darwin are two such instances.

In the last 120 years, said Dave, two paradigms in succession have dominated management thinking. The first of these, dubbed Scientific Management (or ‘Taylorism’ after Frederick Winslow Taylor, its progenitor), focuses on the function of the business organisation and takes a command and control approach to the organisation of work. In this period, the emphases are on mass production, time and motion, automation and efficiency. The Machine is the model, and people are viewed as components of the system.

The second paradigm identified by Dave Snowden is Business Process Re-engineering, which took off in the 1990s. The belief in BPR is that one should look anew at the company’s objectives and define the outcomes or goals you want to achieve, the values or behaviours which you want your staff to adopt. Dave notes that this makes several assumptions, among them that you can pre-define your goals, and that one can rely on a form of causality in which results are repeatable, that the past is a guide to the future. The approach also relies on scalable and reliable technology (particularly information and communication technology) through which metrics can be gathered and control can be driven.

And now, said Dave, we are entering a third paradigm, which it was his job to explain. Rather than projecting an idealised future, this approach seeks to understand and manage the evolutionary potential of the present. It is people-focused, exploits mass collaboration and pervasive social computing, and relies on distributed rather than centralised cognition.

Dave also displayed for us a quote from Seneca:

The greatest loss of time is delay and expectation, which depend upon the future. We let go the present, which we have in our power, and look forward to that which depends upon chance, and so relinquish a certainty for an uncertainty.

Catastrophes, niches and predators

Dave argues that times of economic crisis also tend to coincide with paradigm shifts, the times when the previously dominant set of ideas appear weak and the opportunity arises for the new model to take hold. Focusing on evolutionary potential and emergent trends in the present is particularly beneficial in opening our attention to novel possibilities for ways of doing business.

An idea he borrows from evolutionary biology is ‘Dominant Predator Theory’. This holds that after a major disturbance of the ecology, there will be one predator species that will first make the move into the new niche, occupy it and dominate it. The speed with which the predator can move into this niche is key to its dominance; and agile companies will be the ones that emerge from the current crisis in best shape.

Systems and agents: three configurations

To understand how an organisation might exploit the third paradigm, Dave presented three system models: ordered, chaotic and complex. By ‘system’ he means networks that have coherence, though that need not imply sharp boundaries. ‘Agents’ are defined as anything which acts within a system. An agent could be an individual person, or a grouping; an idea can also be an agent, for example the myth-structures which largely determine how we make decisions within the communities and societies within which we live.

  • Ordered systems are ones in which the actions of agents are constrained by the system, making the behaviour of the agents predictable. Most management theory is predicated on this view of the organisation.
  • Chaotic systems are ones in which the agents are unconstrained and independent of each other. This is the domain of statistical analysis and probability. We have tended to assume that markets are chaotic; but this has been a simplistic view.
  • Complex systems are ones in which the agents are lightly constrained by the system, and through their mutual interactions with each other and with the system environment, the agents also modify the system. As a result, the system and its agents ‘co-evolve’. This, in fact, is a better model for understanding markets, and organisations.

Most people understand how to think about order and chaos, but understanding complexity requires a new mind-set. One property of complex dynamic systems is that patterns may take shape within them that can rapidly escalate if they find reinforcement. Many people know of the phrase ‘the butterfly effect’ to express how interactions between agents can sometimes build rapidly into something that would have been hard to predict. The key to managing complex systems is therefore to attune to spotting emerging trends as early as you can, sometimes referred to as weak signal detection. If you have the means of detecting signals early, you can move to boost those that you view as positive, and dampen down those which are negative.

The children’s party metaphor

I don’t remember the younger Dave as a raconteur, but he’s certainly an accomplished one now. To explain the difference between the above three systems and how to manage them, he told a story about the rules around a birthday party for his 12-year-old son and his 16-year-old daughter.

What would happen if you were prepared to model a 12-year-old’s party as a chaotic system? Anything could happen, including the house being reduced to ashes. What would an ‘ordered system’, ‘management by objectives’ approach look like? Dave spun out a fantasy process starting with declaring the objectives for the party, pinning up motivational posters, writing a project plan with clear milestones at which progress would be measured against declared outcomes, starting the party with motivational video and PowerPoint, and after the party, reviewing the outcomes and updating the best-practice database. (That raised a laugh and ripple of applause.)

Far better to model the kids’ party as a complex system. You set boundaries (‘cross that and you’re dead!’) — but you keep them flexible, you see if you can stimulate the formation of ‘attractor mechanisms’ that will set up some good patterns, and you keep a weather-eye open for bad patterns that you will want to disrupt. Your aim is: to manage the emergence of beneficial coherence. And the benefits in business of adopting this process, Dave asserted, are that you get as dramatic an improvement as business process re-engineering ever got you, and at a low cost if you spot the patterns early and move appropriately.

Dave’s 16-year-old daughter’s party raised the stakes to a new level, as a previous incident involving a couple of friends and a bottle of vodka had given them cause to know. Here, the rules were more carefully set: who could be invited, what kind of alcohol etc. Shortly before the impending event, Dave was working with a well known client organisation in Langley, Va. USA, and sympathetic friends there loaned him an extensive array of surveillance equipment. With the house fully bugged, Dave and his wife stayed in their bedroom, tuned in, and only twice had to make forays so as to be on hand to disrupt possibly negative emergent patterns. ‘In a complex system, what counts is your weak signal detection,’ Dave explained.

Distributed cognition

Swindon’s Magic Roundabout

Distributed Cognition is a key concept in Dave Snowden’s vision of the new way to manage radically networked organisations, and he explained it with reference to the so-called ‘Magic Roundabout’ in Swindon. I’ve been trying to get my head round it (according to Dave, American visitors can’t get cars round it!), and my best way of conceptualising the roundabout is as a very compact ring road, with traffic in both directions, linked to five radial roads by a mini-roundabout at each junction.

The Magic Roundabout offloads onto drivers the decisions about how to interpret the traffic flows, when to go and to stop, which way to turn and so on. The roundabout has no traffic lights. That’s what distributed cognition is about: you rely on the intelligence of the agents in the system. Since it was introduced it has never jammed up.

When the Magic Roundabout was set up, the partitions within the roadways were temporary, movable in the light of observations by policemen and road traffic engineers about how people reacted to them. And the moral that Dave drew from that was: ‘Move away from attempting failsafe design towards setting up safe-fail experimentation’. You cannot analyse the problem space fully in advance, so you have to be prepared to adjust systems interactively until you find that they work.

Putting distributed cognition to work

Within complex systems, for the organisation to become alerted to early weak signals of pattern formation, Dave argues that it is necessary to work with objects of fine granularity — be these objects organisational ones or informational ones.

As for the granularity of levels of human organisation at which distributed cognition works, we should also be aware of certain numerical thresholds: ‘Organisations work differently as 5 or less individuals, 15 or less, and 150 or less’. 150 is Dunbar’s number [Wikipedia ref], the limit on the number of ‘acquaints’ a normal person can maintain; it is also a typical kind of size for a military fighting unit such as a company, and W.L. Gore & Associates divide business units when they exceed this size. Dave described fifteen as a ‘limit of trust’ related to the size of the typical extended family; and five as Miller’s Number [Wikipedia ref], the limit of short-term memory. (Note: George Miller actually defined this limit as ‘the magical number seven, plus or minus two.’)

Spotting the ball, missing the beast

Dave then got us to take part in an experiment which demonstrated both the strengths of distributed cognition, and one of the failings of all human cognition about which we should be on our guard. He prepared us for this by getting us to think about the fairground competition that invites us to guess how many jelly beans are in a large jar. If the first guess is visibly posted up, subsequent guesses tend to bracket this; whereas if each person remains unaware of the other guesses, cognition remains distributed and the average of the guesses is more accurate.
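
The effect is easy to demonstrate with a toy simulation (my own, with made-up numbers; anchoring is modelled crudely as a pull towards the first posted guess):

```python
import random

random.seed(1)
truth = 700                     # actual number of jelly beans
n = 100                         # number of guessers

# Independent guesses: noisy, but centred on the truth
independent = [random.gauss(truth, 150) for _ in range(n)]

# Anchored guesses: each one dragged towards a badly wrong first guess
first_guess = 400
anchored = [random.gauss((first_guess + truth) / 2, 75) for _ in range(n)]

print(f"Independent average: {sum(independent) / n:.0f}")  # close to 700
print(f"Anchored average:    {sum(anchored) / n:.0f}")     # dragged towards 400
```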

Our task was to watch a video in which three students dressed in white T-shirts walked within a circle passing a basketball between them, while another three dressed in black T-shirts also circled between them, likewise passing a ball. How many times was the ball passed between the students in the white shirts?

This visual attention test wasn’t new to me, as I had read about Daniel Simons’ experiment in Scientific American a couple of years ago. Most people concentrating on this task fail completely to notice that half way through the clip (see it here), someone in a gorilla suit shambles in from the right, grins at the camera and beats his chest, then leaves. Many in this audience likewise missed the gorilla — but as a thumbs-up to distributed cognition, the reported numbers of passes neatly bracketed the correct answer, which is 14. (I anticipated the gorilla, but failed to count three of the passes as a result!)

Given how we were prepped for the task, the gorilla was a typical ‘weak signal’ that slipped off most people’s radar. There is a moral to this: we don’t tend to notice what we are not looking for. According to Dave, people on the mild side of the autistic spectrum do tend to notice the gorilla despite concentrating on the task.

Stories and filters

With both positive and negative effects, human beings and societies are closely attuned to stories. Indeed a book by John D Niles characterises our species as Homo narrans. Families, workgroups, companies and whole cultures have stories, and dominant narratives act as primary filters through which we see (or fail to see) data. (Indeed, I see this as related to the idea of a ‘paradigm’ anyway.)

But what if we were able to capture stories and make use of them to build business intelligence? Dave’s clients are coming to value the freeform comment over conventional forms of survey, and also in preference to the output of focus groups (in which the danger of bias from the facilitator is high). At Cognitive Edge they have been developing methods and software tools which harness the illuminating power of stories alongside distributed cognition, and make the raw data available directly to decision makers together with useful quantitative summaries and visualisations — what Dave calls disintermediation.

Dave gave several examples of this in practice, but one will suffice for here. In the My Visit project, National Museums Liverpool wanted to collect and analyse feedback from the hundreds of thousands of school children who visit and interact with its staff. The children leave small text comments, and are asked to ‘self-signify’ the comment fragments by placing them on a sliding scale between two negative polar opposites. Examples of scales:

  • From ‘staff patronise the children’ to ‘staff are too childlike and pathetic’
  • From ‘too much to see and it’s overwhelming’ to ‘not enough to keep me interested’
  • From ‘rushed from place to place; missed things’ to ‘too much time in one place’

The software provided by Cognitive Edge (SenseMaker) presents the spectra of results as a histogram. Obviously, what the museum management would like to see would be positive stories that sit comfortably in the middle of these negative extremes. But to detect problems and fix them early, what they keep their eye on is the histogram columns at the negative edges. The long-term pattern displays as blue bars, the last 48 hours’ results show in red. Behind the simple display is a database of all the comment fragments, and the management can drill down quickly to read the individual, unmediated complaints.
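
The display logic, as I understood it from the talk, can be sketched like this (a toy reconstruction of my own, not Cognitive Edge’s actual code):

```python
from collections import Counter

def histogram(scores, bins=10):
    """Bin self-signified scores (0.0 = one negative pole, 1.0 = the other)."""
    counts = Counter(min(int(s * bins), bins - 1) for s in scores)
    return [counts.get(i, 0) for i in range(bins)]

long_term = histogram([0.05, 0.4, 0.5, 0.5, 0.55, 0.6, 0.5, 0.95])  # blue bars
last_48h  = histogram([0.05, 0.10, 0.08, 0.5])                      # red bars

print(long_term)
print(last_48h)   # a sudden cluster in an edge column flags an emerging problem
```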

The SenseMaker display lets museum staff monitor the emergence of trends and drill down to read comments left by children. Image by permission of Dave Snowden.

Feedback from the system has helped the museum to refine and develop over twenty learning activities for children, and has quadrupled the figures for learning visits.

Dave showed other examples covering e.g. attitudes of employees to their leadership, scenario elicitation within a Canadian forestry service, and so on. One of the latest uses of SenseMaker sounds fascinating and I look forward to learning more: the Children of the World project aims to create a cultural map of the world, getting children to gather stories from their families that are reflections about past and present life, future hopes and aspirations. Starting in Liverpool, the project will branch out soon to Australia and Canada, Bangladesh and Africa.

Shifting paradigms

Summing up, Dave argued that while there is nothing wrong with the BPR paradigm for the things it’s good at, it doesn’t deal well with complexity. So what are the practical implications for organisations prepared to make the switch?

  • The demands of weak signal detection imply setting aside periodical analyses based on surveys and focus groups, in favour of narrative-based research that continuously captures and displays a disintermediated evidence base, as described above.
  • Rather than determining outcomes and measuring performance against targets (what’s ruining UK health and education), shift to measuring the impacts of activities and allow for emergence and adjustment.
  • Rather than using centralised scenario planning, set up systems which allow for scenarios to be generated by employees.
  • Break people out of their boxes. If you can assemble ‘crews’ with a membership that spans functional job boundaries and put the crews ‘on watch’, you will have at your service a team whose skills exceed those of any one of its members.
  • Best practice databases are all very well, but they lead you to rely on what you ‘learned’ in the past. Dave recommends knowledge management practices that collect narrative fragments.
  • ‘Practice-Informed Theory’ similarly assumes stability and doesn’t deal well with emergent behaviours and situations. Indeed Dave felt that more ‘Theory-Informed Practice’ is desirable. For example, if there were more professional ethical principles applied in banking and accounting, would we be in the mess we’re in now?

ONE PLANET — the environmental documentary radio programme of BBC World Service — last month broadcast (and podcast) a documentary which explored whether a future solution to Europe’s electricity supply problems might come from giant solar-thermal generating stations in the Sahara Desert, feeding hundreds of megawatts into the European grid through undersea cables while also bringing ‘an industrial revolution to the Southern Mediterranean,’ to quote one of the scheme’s supporters, Prince Hassan of Jordan.

What the programme didn’t mention is that the idea is at least five years old and is championed by TREC, the Trans-Mediterranean Renewable Energy Co-operation project, an initiative of the German association for the Club of Rome, together with the Hamburg Climate Protection Foundation. But what Miriam O’Reilly’s interview with Professor Galal Osman revealed is that there are people in Egypt who see the construction of the pioneering solar-thermal power plant at El Kureimat on the Nile as possibly the first step towards realising that dream.

El Kureimat, a few kilometres south of Minya, is the site of a power generation centre that already has two gas-powered generators installed. Now under construction is a ‘Concentrated Solar Power’ (CSP) solar-thermal plant which will generate 20 megawatts of power, rising to 50 MW. Similar in principle to solar-thermal plants operating in the deserts of south west USA, the plant will feature an array of mirrored fifty-metre parabolic reflectors covering 100,000 m². These will focus the sun’s rays onto a vacuum-insulated heat collector tube running down the centre of each reflector, through which water pumped under pressure will carry the heat to a central facility where steam is generated and electrical power is produced using turbines.
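
Those figures are easy to sanity-check, assuming round numbers for desert sunshine and plant efficiency (my assumptions, not the programme’s):

```python
# Rough plausibility check of 20 MW from 100,000 m2 of reflectors
area_m2 = 100_000
irradiance_kw_m2 = 1.0   # assumed peak direct normal irradiance in the desert
efficiency = 0.20        # assumed overall solar-to-electric conversion

power_mw = area_m2 * irradiance_kw_m2 * efficiency / 1000
print(f"~{power_mw:.0f} MW at peak")   # ~20 MW: consistent with the quoted figure
```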

Sun, water and salt

Photovoltaics, using for example silicon cells, is a completely ‘dry’ process, witness its efficacy in outer space and on the surface of Mars. But CSP solar-thermal plants are the affordable technology when scaling production up to tens and hundreds of megawatts. One factor that might be seen as a hitch to implementing ‘desert power’, then, could be shortage of water; but there are plenty of zones around the desert edge where water is to hand, though it may be sea-water.

One vision for an integrated system to harvest the desert and make it blossom involves using CSP plants not only to make electricity, but also to desalinate marine and brackish water, for human consumption and for agriculture. It has also been pointed out that if the solar arrays are constructed so that the ground beneath them is accessible, those shaded spaces could be ideal for agricultural and horticultural use.

Apart from cost, solar-thermal has another advantage over photovoltaics, in situations where a reliable constant energy supply is desired. Electricity is expensive to store — but harvested heat can be stored cheaply for use in overnight power generation. The 50 MW solar-thermal plant which Spain commissioned at Solucar near Seville a couple of years ago uses underground vats of molten salt to store up to seven hours’ worth of power-generating heat.

High tension

In the programme, Professor Osman, fancifully invoking a memory of Ancient Egyptian worship of the sun, also imaginatively sketched a vision of ten thousand square kilometres of desert covered with CSP mirrors, generating potentially much of Europe’s electricity needs. How would this be brought across the Mediterranean? The TREC plan calls for three High Voltage Direct Current (HVDC) cables under the sea: one from Libya to Sicily, a second from Tunisia to Sardinia, and a third across the Straits of Gibraltar from Morocco to Spain.

In the early 20th century, in the competition between rival electricity transmission techniques, it was Alternating Current that won out. AC, which reverses polarity fifty or sixty times a second depending on the design of the system, can be stepped up to very high voltages with cheap transformer technology — and high voltage is important, as this vastly reduces the amount of power lost to electrical resistance in the transmission cables.
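
The arithmetic behind that claim is simple: for a fixed power P, the current is I = P/V, and resistive loss is I²R, so loss falls with the square of the transmission voltage. Round illustrative numbers below, not real line data:

```python
P = 500e6   # 500 MW to be transmitted
R = 5.0     # assumed total conductor resistance, ohms

for V in (100e3, 400e3, 800e3):   # 100 kV, 400 kV, 800 kV
    I = P / V                     # current drawn at this voltage
    loss = I**2 * R               # power dissipated in the conductors
    print(f"{V/1e3:4.0f} kV: {loss/1e6:6.1f} MW lost ({100 * loss / P:.1f}%)")
```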

However, developments in semiconductor technology never stand still. In recent decades the advent of high-power solid-state static inverter circuits has made it practical to ramp voltages up and down for direct current, too. And this has real advantages: less electrical power is lost per thousand kilometres in DC cables, and the cables are cheaper to make and lay. The catch is that DC static inverter stations are still much more expensive to build than AC transformer stations; but where energy must be moved long distances, especially underwater, HVDC is the sensible choice. (The longest HVDC power circuit at present is overland, however: the 1,700 km line that runs from Congo’s Inga Dam hydropower scheme to the copper mines of Shaba.)
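The resulting economics can be caricatured with a toy break-even model in Python. Every cost figure below is an invented placeholder, purely to show the shape of the trade-off:

```python
# Crude break-even model: HVDC pays for its expensive converter stations
# through cheaper cable and lower losses per kilometre.
# All figures are invented placeholders, for illustration only.

AC_TERMINAL_COST = 100    # transformer stations etc., arbitrary money units
DC_TERMINAL_COST = 400    # static inverter stations: much dearer
AC_COST_PER_KM = 1.0      # cable plus capitalised losses per km, AC
DC_COST_PER_KM = 0.6      # cable plus capitalised losses per km, DC

def total_cost(terminal, per_km, distance_km):
    """Total cost of a link: fixed terminal cost plus per-kilometre cost."""
    return terminal + per_km * distance_km

# Break-even: the distance at which DC's per-km savings repay its dearer terminals.
break_even = (DC_TERMINAL_COST - AC_TERMINAL_COST) / (AC_COST_PER_KM - DC_COST_PER_KM)
print(f"HVDC wins beyond about {break_even:.0f} km")

for d in (200, 800, 1700):
    ac = total_cost(AC_TERMINAL_COST, AC_COST_PER_KM, d)
    dc = total_cost(DC_TERMINAL_COST, DC_COST_PER_KM, d)
    print(f"{d:>5} km: AC={ac:.0f}  DC={dc:.0f}  ->  {'DC' if dc < ac else 'AC'} cheaper")
```

For submarine cables the AC per-kilometre figure rises steeply (the cable’s capacitance draws a charging current that eats into its capacity), dragging the break-even down to a few tens of kilometres; hence DC for the proposed Mediterranean links.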

HVDC has another benefit where international sales of electricity are contemplated. AC grids can exchange power only if synchronised in frequency and in phase. When AC circuits go out of step, they can bring the system crashing down, with wide-area power outages. DC power flows constantly in one direction and can easily be distributed to AC local circuits through an inverter regardless of local phase and frequency. Indeed, sometimes HVDC links are incorporated into power grids just for the stability they bring.

Energy and security

In the BBC programme, Prince Hassan spoke about the benefits which widespread adoption of solar-thermal power could bring to the Middle East and North Africa, especially in terms of prosperity and development, industry and agriculture — and flowing from this, greater social and political stability.

However, a gloomy note was struck by the Open University’s Professor Dave Elliot, co-Director of its Energy and Environment Research Unit. He pointed out that almost insurmountable obstacles have arisen in negotiations between European nations about cross-border sharing of power; how much more difficult would it be to negotiate prices and access for power from another continent?

European policymakers are also understandably nervous about energy security; the way Russia plays politics with its gas pipelines illustrates the dangers of dependence. It is said that one of former Congolese president Mobutu Sese Seko’s motivations for supplying the Shaba copper mines with electricity from 1,700 km away, rather than developing allegedly cheaper local hydropower sources, was the advantage of having his hand on a big switch that could render Katanga province powerless should it rebel. How stable is North Africa?

‘We’ve got to get the balance right in the future,’ concluded UK Energy Minister Malcolm Wicks, ‘between the energy we’ll have to import — mainly oil and gas and coal at the moment, maybe one day solar — and the energy we can produce here in Britain and just offshore. Energy security will become an increasingly important component of a nation’s security, given the huge global demand, the global grab for energy in the 21st century, and all the difficult geopolitics around that.’

TANTALUM is probably not one of the elements on the Periodic Table that you’ll remember from school chemistry. Dull grey and chemically inert, its principal use is for making electronic components, specifically tantalum capacitors which are capable of achieving high capacitance in a very small volume. Therefore tantalum, despite its scarcity, is in demand for use in mobile phones, PDAs and laptop computers.
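Why is tantalum so suited to compact capacitors? The parallel-plate relation

$$C = \frac{\varepsilon_0 \varepsilon_r A}{d}$$

tells the story: the anode of a tantalum capacitor is a sintered pellet of tantalum powder presenting an enormous effective plate area $A$; the dielectric is a film of tantalum pentoxide grown on it electrolytically, which is both extremely thin (small $d$) and of high relative permittivity $\varepsilon_r$. All three factors push the capacitance up for a given volume.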

Map of militias and minerals in eastern DRC

Tragically, the value of coltan is linked to the current round of bloody wars in the eastern provinces of the Democratic Republic of Congo. The columbite-tantalite ore from which both tantalum and niobium are extracted, colloquially known as coltan, is found in relatively easy-to-work surface deposits in the provinces of South Kivu (at Mwenga, Kamituga, Shabunda, Kalehe and Kabare) and North Kivu (at Masisi, Walikale and Lubero). There it is mined by ‘artisanal’ methods, by forced and extorted labour under the control of the various militias who are terrorising the area, and exported to the world market to fund their acquisition of weapons.

Coltan is not the only one of Congo’s rich resources being pillaged in this way. An estimated US $70m worth of cassiterite (tin ore) is also reaching the world market from war-torn Kivu. Indeed, from late 2002 the price of coltan took a tumble after the US Defense Logistics Agency unloaded its tantalum stockpile onto the open market, while the price of tin went up after Japan and Europe enacted safety legislation to have lead-based solder replaced with tin solder in electronics fabrication.

In this morning’s (11 Nov) edition of Business Daily on BBC World Service radio, the focus was on cassiterite. Global Witness, through its researcher Carina Tertsakian, has been keeping an eye on the exploitation of Congo’s resources for several years. In February 2007, Global Witness called on the UK Government to hold to account a Wembley-based business, Afrimex, run by Ketan Kotecha, which since 1996 has been buying coltan and cassiterite in the region of Goma, to the profit of the Goma faction of the RCD, the Rally for Congolese Democracy. The cassiterite is taken out through Uganda and Rwanda to the Indian Ocean coast, from which it is shipped to smelters in India, China, Thailand and Malaysia. There is also reportedly a cassiterite and tantalum smelting plant in the Rwandan town of Gisenyi, just across the border from Goma, operated by a South African firm (MPA) whose founders have links to the Rwandan Patriotic Front party (see the 2007 article by David Barouski).

Congo has many other minerals such as uranium, cobalt, diamonds and gold. The rival militias of the FNI and the UPC in the Ituri region across the border from Uganda are involved in organising the artisanal mining and smuggling of gold, and there have been documented dealings between the FNI and the multinational company AngloGold Ashanti around the operation of a mine at Mongbwalu.


THE MURDEROUS FACTIONALISM of eastern Congo, and the motivations of Congo’s neighbours in meddling in her misery, are difficult to understand: these are convoluted ethnic tensions with their roots in the colonial past, combined with opportunistic looting. Let me attempt an explanation, as I understand it.

How ethnic tensions crossed borders

The current fighting, in which the rebel general Laurent Nkunda is currently threatening to topple the Congolese government of Joseph Kabila, is just the latest round in a chain of Congo wars from 1996 that have cost the lives of millions. However, the roots of the Kivu and Ituri conflicts can be traced back even further, and over the border into Rwanda and Uganda.

After the First World War, the Belgians who controlled the Congo also took over the territories of Rwanda and Burundi from Germany under a League of Nations mandate. Belgian policy was to reinforce aristocratic Tutsi dominance over the Hutu, hardening the ethnic distinction between them, just as they promoted the ethnic Hema over the Lendu in the Ituri region of north-eastern Congo.

(Note: the Tutsi and Hema do seem to share some genetic traits that suggest a different, perhaps more northerly, origin compared to the Hutu and Lendu — notably a lack of sickle-cell trait and high lactose tolerance. But with intermarriage and social mixing, the Tutsi commoners and Hutu seem to have been on a converging trend that Belgian policy artificially discouraged.)

In response to Tutsi dominance, the Hutu in Rwanda organised for their ‘emancipation’ around Grégoire Kayibanda and his PARMEHUTU party. Fighting broke out in November 1959, in which thousands of Tutsis were killed and many more fled into southern Uganda and into the South Kivu province of the Congo, where they became known as the Banyamulenge.

The Ugandan governments of Idi Amin, and then Milton Obote, refused to allow the Tutsi refugee community to integrate into Ugandan society. However, when Yoweri Museveni launched his civil war against Obote from the south-west of Uganda, the so-called ‘war in the bush’ (1981–86), he recruited a large number of Tutsi refugees into his National Resistance Army. Thus when Museveni came to power and the NRA was rebadged as the Uganda People’s Defence Force (UPDF), it contained a large contingent of battle-trained Tutsi soldiers and officers, among them the current Rwandan president, Paul Kagame, who at that time was head of military intelligence in Museveni’s army.

The native Ugandan officers may have resented the Rwandan Tutsi role in their army; in any case, the Tutsis’ aspirations lay elsewhere. They formed the Rwandan Patriotic Front in 1986, trained their army, and in 1990 launched an attack on Rwanda with Ugandan support. (Note: the French provided support for the Hutu régime during this fighting.) After 3 years of warfare, the Rwandan military dictatorship under the Hutu president Juvénal Habyarimana signed a cease-fire agreement with the RPF, the Arusha Accords, and the United Nations sent a security force, UNAMIR.

Assassination, genocide and aftermath

What happened next is a matter of much conjecture. Did Habyarimana intend to abide by the Accords, which would have ceded much power to the Tutsis? In any case, six months later his jet was shot down by missile fire as it approached Kigali airport, killing all aboard. Controversy is still rife as to whether the assassination was the work of the RPF, the Ugandans, or an extreme faction among the Hutu leadership seeking to wreck the Accords.

Almost immediately, the Hutu-led army and the Interahamwe militia groups embarked on a well-organised campaign of genocide within Rwanda against the Tutsi and moderate Hutu: some 800,000 deaths in about one hundred days between April and July 1994. It is notable that the UN forces did not intervene to stop the massacre of Tutsis.

The RPF responded with an attack which occupied the north, east and south of Rwanda by June, and by July they had reached Kigali. Now France obtained UN Security Council support for a 3000-man military intervention, Opération Turquoise, which set up a protection zone in the west of the country through which two million Hutus, including the military and the Interahamwe killers, fled into the Congo (then known as Zaïre), principally to North Kivu.

Remnants of those Hutus who escaped to the Congo are now organised under the control of ex-army and ex-Interahamwe militia, as the ‘Democratic Forces for the Liberation of Rwanda’ or FDLR. Thus Rwanda, Uganda and Congo had fallen into a pattern of cross-border strife.

Rwanda and Uganda intervene in Congo

From 1994 to 1996, Paul Kagame’s new Tutsi-dominated Rwanda government took vengeance against the Hutus in Rwanda: for example, refugee camps were shelled with heavy artillery. In 1996, Kagame’s RPF forces invaded Zaïre, in part to seek out and destroy the Hutu forces there, but also to join an alliance with the Congolese rebel leader Laurent Kabila. Yoweri Museveni also threw the Ugandan UPDF forces into the civil war (the First Congo War) on the side of Kabila and Kagame.

The Rwandan army, the RPF, also armed and organised their Tutsi brothers who had been resident in Kivu since the ’sixties: the Banyamulenge. Acting together as the Alliance of Democratic Forces for the Liberation of Congo-Zaïre (ADFL), these forces first destroyed a series of Hutu refugee camps, then turned towards the capital, Kinshasa. Lacking popular support, the old and corrupt régime of Mobutu Sese Seko collapsed — and on 17 May 1997, Kabila declared himself president of the ‘Democratic Republic of Congo’.

‘Africa’s World War’

THE SECOND CONGO WAR broke out in August 1998. In the following five years, eight African nations and some 25 armed groups would engage in a bloody struggle that claimed 5.4 million lives, mostly through disease and starvation. The conflict was triggered when Kabila, seeking to be rid of his erstwhile allies, ordered all Rwandan and Ugandan forces out of the country.

The Tutsi Banyamulenge in the town of Goma, feeling threatened, revolted; Rwanda responded with immediate military assistance, and a well-armed Tutsi rebel group was set up: the Rally for Congolese Democracy (RCD). Rwandan and Burundian forces entered the war, as did the Ugandan UPDF. Within a couple of weeks, these anti-Kabila forces had control of the diamond centre of Kisangani and the Inga hydropower station, and Kabila’s régime seemed doomed.

But Kabila was rescued because Namibia, Zimbabwe and Angola entered the war on the government’s side. (Both Zimbabwe’s Robert Mugabe and, to a lesser extent, Namibia’s Sam Nujoma had significant mineral exploitation ventures in Congo.) Chad, Libya and Sudan also joined in on Kabila’s side, and financial support was forthcoming from US, Canadian, Australian and Japanese mining and diamond companies in exchange for concessions. Thus the Ugandan and Rwandan forces were held off.

One of the consequences for Kivu was that Joseph Kabila, who succeeded his assassinated father as DR Congo President, encouraged the unification of the ex-Interahamwe and other Hutu movements into the ‘Democratic Forces for the Liberation of Rwanda’ (FDLR), and set them to attack the RCD and the Rwandan Army (RPA) — a dangerous game. The FDLR has continued its war against Rwanda and the Tutsis of the RCD ever since the ceasefire, though with much reduced strength; this provides the pretext both for Laurent Nkunda’s refusal to stand down — he claims to be protecting Kivu’s Tutsis — and for Rwanda’s intransigence. Many people wish that the perpetrators of Rwanda’s 1994 genocide could simply be magicked away somewhere.

How Uganda, Rwanda & Zimbabwe looted the Congo

In 2005, Human Rights Watch published a report entitled The Curse of Gold. It documents how the Uganda People’s Defence Force, having taken over the north of the Congo in 1998, immediately took direct control of gold-rich areas in Haut Uélé and coerced local people into mining gold (an estimated one ton, worth some $9 million) for its benefit. A Ugandan officer, Lt. Okumu, initiated artisanal mining in the Durba and Gorumbwa mines, then accelerated it dangerously with explosives; a hundred miners died when the Gorumbwa mine collapsed.

The Ugandans were not alone in taking what they could from Congolese territory they occupied. In April 2001 a UN panel of experts investigated the illegal exploitation of diamonds, cobalt, coltan, gold and other resources by Rwandan, Ugandan and Zimbabwean forces and recommended that the Security Council impose sanctions. Their final report (Final report of the Panel of Experts on the Illegal Exploitation of Natural Resources and Other Forms of Wealth of the Democratic Republic of the Congo) contains many detailed allegations.

Proxies, militias & those who trade with them

Although the Rwandan and Ugandan armed forces withdrew from Congolese territory in 2002, they left behind their proxies. As a result of this, but also because of other disagreements and clashes, including open conflict that broke out in 2002 between Uganda and Rwanda, the situation became much more complicated.

  • In Ituri, Uganda backed an ethnic Lendu force, the Nationalist and Integrationist Front (FNI), which waged war with a Rwanda-backed ethnic Hema force, the Union of Congolese Patriots (UPC). In 2002–3 they were openly at war for control of the Mongbwalu goldfields. The UPC were assisted with air-drops of weapons from Rwanda; in 2003, some FNI attacks were directly assisted by Ugandan troops.
  • In Kivu, the RCD had split into a Kisangani-based faction (RCD-K, later RCD-ML) backed by Uganda, and a Goma-based faction backed by Rwanda (RCD-Goma). The latter appears to have control of part of the coltan and cassiterite production.
  • Though much of the RCD-Goma faction joined the newly integrated national army in 2003, Laurent Nkunda pulled his followers out and moved to the forests of North Kivu, forming the CNDP (National Congress for the Defence of the People), which is now at war with the DRC and United Nations troops.
  • As already noted, the extremist Hutu forces of the FDLR are also still in play. For a while they were aligned to the DRC forces, but now act independently. They appear to control the bulk of the cassiterite mining around Walikale and Bisie in North Kivu.

A brutal symbiosis

All these militia armies maintain their power and re-stock their arsenals by exploiting the population and the environment. Given Congo’s mineral wealth, one of their most effective methods is organising forced artisanal mining, or controlling and exploiting it, usually through an informal tax. This makes economic sense only because the people who organise the trade locally have links to, and protection from, powerful people in Rwanda’s and Uganda’s military and political élites — people in a position to provide the means to transport and trade the ores out to the world market.

However, there is another tier of responsibility: the unscrupulous dealers, and the multinational companies, who are prepared to buy gold and diamonds, cassiterite and coltan without asking too many questions. And what responsibility are we then to ascribe, say, to the mobile phone manufacturers whose handsets may be built using ‘blood tantalum’?


Closing thoughts on commodity prices

Coltan prices on the world market have ratcheted up and down over time. One oft-quoted suggestion is that Sony’s desperation to acquire enough tantalum capacitors for the PlayStation 2 launch in 2000, together with a spike in demand for mobile phones and DVD players, drove the price of coltan from $49 to $275 a pound that year. The Rev. Professor Ferdinand Muhigirwa SJ of CEPAS, the Centre of Studies for Social Action in Kinshasa, cites prices of $400 per kilo of coltan in 1999 and $320 in 2005. Thus high prices were the norm when Rwandan forces and their proxies were extracting coltan from the Congo. But now? Muhigirwa quotes a mere $32 per kilo for 2008. Even so, he believes that artisanal production in Congo will continue ‘as a matter of survival’.

What are the lessons to be learned from the Obama campaign’s use of social networking software to mobilise supporters and raise $650 million?


THE SUCCESS OF BARACK OBAMA has been attracting world attention because he will be the first African-American, indeed the first non-white President of the United States of America. Some attribute his success to his opposition to the war in Iraq; others to the dreadful meltdown of US financial institutions that galvanised the American electorate in September. And many people to whom I have spoken since 4th November, and many public commentators besides, paradoxically regard Obama as a doomed man for those very reasons. It may be that his hands have been tied, his chalice poisoned. He will inherit two problematic and unpopular wars, a ten trillion dollar national debt, a growing energy crisis, an economy sliding into recession and, of course, his own commitments to cut taxes.

Many who voted for the largely unspecified ‘change’ promised by Obama may come to be disappointed. If he is forced to govern in financially conservative ways, which seems likely, the honeymoon may be a short one. Relations with the rest of the world may be just as fraught with dilemmas. If he can’t cut taxes and he can’t cut spending, he will have to resort to the third instrument: the printing press. Paying off your foreign creditors by printing dollars, provoking inflation and a drop in the value of the dollar so that those creditors lose half the value in real terms of what they loaned to you, is not a good way to make friends abroad. A cheaper dollar also hurts those countries which rely on the American market.

Obama’s amazing money machine

The my.barackobama.com Website

Probably the most amazing feature of the Barack Obama campaign is the way he was able to start from a position right outside the Establishment, yet build a huge political machine and raise a phenomenal war chest: an estimated total of $650 million, according to Richard Lister of BBC News. This was a necessity, because Obama had to fight not one expensive campaign but two: the Clinton camp in the Democratic Party machine had a virtual monopoly on the party’s big-money donors and fundraisers, and during 2007 had amassed about $100 million.

According to a fascinating article by Joshua Green in The Atlantic magazine dated June 2008, Obama’s initial financial support came from a hitherto untapped source: hi-tech Northern Californian Democrats who were software entrepreneurs and venture capitalists, adept at networking, and at ease with the Internet technologies that support it.

Mark Gorenberg, a partner in a San Francisco venture capital firm, got into political fundraising in 2003, to support John Kerry’s bid for the Democratic party nomination. He persuaded friends and colleagues to commit not just to making a personal donation but, more importantly, to raising a certain amount from others. In the same year, Howard Dean’s supporters pioneered the use of the Internet to raise large numbers of small donations, such that before long he had out-fundraised both Kerry and Edwards. Dean’s team also pioneered the use of social networking sites for political organising, using sites like Meetup.com to bring local activists together.

A few days before Obama declared his candidacy, there was a fund-raising dinner held for him in Northern California, hosted by John Roos and attended by Mark Gorenberg and Steve Spinner. It seems that the Clintons had overlooked the potential for Democrat financial support from the rich young Silicon Valley entrepreneurs, who in any case were readier to click with Obama. Gorenberg joined Obama’s national finance committee and was delighted to find that the campaign was ready to embrace new ideas about how to build networks and communities with online tools. Spinner also joined, and took the initiative to found an online affinity group, ‘Entrepreneurs for Obama’. Obama spoke to the network by videoconference, and soon it was raising big bucks.

My.BarackObama.com

But central to the success of the Obama campaign has been the site My.BarackObama.com, created along social networking lines. Indeed, Facebook co-founder Chris Hughes took a sabbatical from his company and came to Chicago to work on the campaign full time. Joe Rospars, a veteran of the Dean campaign who had in the meantime set up an Internet fundraising company, joined as head of new media for the Obama campaign.

The Atlantic quotes Rospars as explaining the rationale behind My.BarackObama thus:

We’ve tried to bring two principles to this campaign. One is lowering the barriers to entry and making it as easy as possible for folks who come to our Web site. The other is raising the expectation of what it means to be a supporter. It’s not enough to have a bumper sticker. We want you to give five dollars, make some calls, host an event.

People who signed up at My.BarackObama.com could register to vote through the site, and could set up their own personal affinity group with a listserv, to lobby their friends and associates. They could hit a ‘Make Calls’ button and get lists of phone numbers to call. They could set up their own personal fundraiser page, set a target with a ‘thermometer’ to display progress, and set to work raising money for the campaign. It was the Howard Dean approach, but with new tools and a tech-savvy team; the site attracted more than three million donors and fundraisers.
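Being a software person, I can’t resist sketching how simple the heart of such a personal-fundraiser feature could be. This little Python toy is entirely my own illustration, with made-up names, and bears no relation to the campaign’s actual code:

```python
# Toy model of a My.BarackObama-style personal fundraising page:
# each supporter gets a page with a target and a text 'thermometer'.
# Purely illustrative; no relation to the campaign's real implementation.

from dataclasses import dataclass, field

@dataclass
class FundraiserPage:
    owner: str
    target: float                          # dollars the supporter pledges to raise
    donations: list = field(default_factory=list)

    def donate(self, donor: str, amount: float) -> None:
        """Record a donation made through this supporter's page."""
        self.donations.append((donor, amount))

    @property
    def raised(self) -> float:
        return sum(amount for _, amount in self.donations)

    def thermometer(self, width: int = 20) -> str:
        """Render progress towards the target as a text 'thermometer'."""
        filled = min(width, int(width * self.raised / self.target))
        bar = "#" * filled + "-" * (width - filled)
        return f"[{bar}] ${self.raised:.0f} of ${self.target:.0f}"

page = FundraiserPage(owner="jane", target=1000)
page.donate("alice", 20)
page.donate("bob", 10)
print(page.thermometer())   # [--------------------] $30 of $1000
```

The point is that the mechanics are trivial; the campaign’s achievement was deploying them at the scale of millions of supporters.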

Spending their way across the nation

Accumulating a huge amount of money, much of it in ten- and twenty-dollar donations, proved capable of outweighing the old model of political fundraising. The first sign of this came after Super Tuesday, when the Clinton campaign ran out of money while Obama’s was rolling in cash. He kept spending on advertising, and won the next 11 primary contests.

When it came to the presidential election, McCain was hampered by having chosen to accept federal campaign funding: as a condition, a cap was imposed on his spending. Traditionally there are Democrat states, Republican states and relatively few swing states, and the parties focus their campaigning on the swing states and, to a lesser extent, their ‘safe’ states, ignoring the rest. But the Obama-Biden campaign had the money to put paid campaign workers into every one of the 50 states. They bought television advertising nationwide too, culminating in the 30-minute advert that aired at eight in the evening on 29 October on NBC, CBS, Fox, Univision, MSNBC, BET and TV One, gaining an estimated 30.1 million viewers.

This, then, is to my mind the most interesting part of the Obama campaign: politics meets Web 2.0, achieving unprecedented engagement and participation, unprecedented fundraising, and an unprecedented nationwide media campaign, substantially extending the Democrats’ demographic and carrying states such as Virginia that had not voted Democrat for over 40 years.

So, what next?

I started this post by noting what a sticky situation Obama is going to be in when he enters the White House in January. Is there anything he can do that will be different, unexpected? Might he make use of the 3-million-strong network of contacts and supporters that was built to get him to victory, or is it destined to fall away and burn up like an exhausted first-stage booster rocket, no longer needed?

I am intrigued by the text message which Obama sent out to his supporters, shortly before making his acclaimed Election Night speech:

I’m about to head to Grant Park to talk to everyone gathered there, but I wanted to write to you first.

We just made history. And I don’t want you to forget how we did it.

You made history every single day during this campaign — every day you knocked on doors, made a donation, or talked to your family, friends, and neighbors about why you believe it’s time for change.

I want to thank all of you who gave your time, talent, and passion to this campaign. We have a lot of work to do to get our country back on track, and I’ll be in touch soon about what comes next.

So — what comes next?


References

‘The Amazing Money Machine – How Silicon Valley made Barack Obama this year’s hottest start-up’ by Joshua Green
The Atlantic, June 2008.
(Online version)

‘The Howard Dean Nominee’ by Steve Kornacki
The New York Observer, 26 June 2008.
(Online version)

‘Why Barack Obama won’ by Richard Lister, BBC News, Washington
BBC News online, 5 November 2008