Fri, 17 Dec 2004

Elements or Lower

Before I forget again, I’d like to boast that Woking have leapt from position 128 to 51 in the SiteMorse rankings for December. The monthly “league table” of Local Government sites currently ranks 460 authorities against each other on a range of criteria.

Although we’ve just moved over to a faster server and a better metadata system, I doubt that either of those has made the difference this month. The new server is responsible for the average page generation time going from 0.9 seconds to around 0.6 seconds. A third off is a bargain by anyone’s standards, but we’re still failing the download speed tests for both modem and ADSL connections. When I’m finally able to do it, the long-awaited CSS makeover of the site should cure that.

No, I strongly suspect that what made the difference was a clampdown on broken links. There’s not a vast amount that a (fried not baked) CMS can do about broken external links, so Woking have invested in a link-checking service to help us keep track of those.

It’s in the realm of internal links that the CMS can help. When creating a page, publishers can link to other resources using a dialog that encourages them to enter the link destination one way for an external link, another for links within the site, plus another for email addresses and — in a novel twist — a fourth for links to pages that haven’t actually been created yet. By recording the ID of internal pages that are being linked to rather than their published URL, the CMS can make sure that links from one resource to another are kept up-to-date even if the resource being linked to is moved from one part of the site to another. It also allows the system to construct the link according to the browsing circumstances, so if you’re in one version of the site, you stay there rather than being shunted to another version.

Given that, there remain two circumstances in which an internal link could become broken:

  1. The link is to a resource which is subsequently deleted from the site altogether. Even if the CMS emails notification to all publishers who have resources linking to the newly-deleted resource, there’s still inevitably a period of time when the page has a link leading nowhere.

  2. A resource in test links to another resource in test. The first resource is then made live before the second one. A page in test — that hasn’t been published to the live site at all yet — is invisible to the live site, and will consequently 404.

We came up with a simple technique to deal with both cases: when the presentation layer constructs an internal link, it first checks that the target resource actually exists and has been published to the version of the site being browsed. If it hasn’t, the link’s text is rendered as plain text, with no anchor around it at all.
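In code terms, the filter boils down to something like this: a minimal sketch, with a hypothetical schema and names standing in for the real thing.

    use strict;
    use warnings;

    # Resolve a stored internal link at render time. Links are stored as
    # resource IDs rather than URLs, so a moved resource keeps working;
    # a deleted or not-yet-live target degrades to plain text.
    sub render_internal_link {
        my ($dbh, $resource_id, $link_text) = @_;

        my ($url, $is_live) = $dbh->selectrow_array(
            'SELECT url, is_live FROM resources WHERE id = ?',
            undef, $resource_id,
        );

        # Target deleted, or still in test: drop the anchor, keep the words.
        return $link_text unless defined $url && $is_live;

        return qq{<a href="$url">$link_text</a>};
    }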

Consequently, from the perspective of the casual visitor to the site, there are no broken internal links any more — although there might be phrases that seem out-of-context without a link adorning them. If publishers are careful not to over-use click here, even that shouldn’t be too much of an issue.

Thu, 16 Dec 2004

The IP Address for the Soul

So I’m dreaming about the death penalty, for which I blame the missus and her Year 9 class entirely. In the dream, I’m responsible for the technical implementation of a new form of capital punishment.

The technique goes like this: all the major systems within the brain have a neurological equivalent of an IP address, and one only needs to perform a Slashdotting on a specific pair of those neuro-IP addresses to cause death.

The first is the IP address for the soul. The second is responsible for timing regulation in the brain, and much like one can comprehensively stiff a Mac by using cheap memory, overloading this causes the body to twitch and spasm for a few seconds before the execution is complete. The Denial-of-Service attack on the soul, of course, actually brings on instantaneous death; but the second stage is entirely necessary to avoid the body going into a persistent vegetative state.

There are, I hardly need mention, a number of things that disturb me about this dream, not least of which is the fact that launching this most geeky of judicial murders still involved heaving down a dirty great big lever on a wall.

Mon, 13 Dec 2004

FWIW

I think Sarah from Girls Aloud looks like she might be Cylon Number Six’s younger sister.

Right then. That’s profundity sewn up for today: it’s shallow all the way from here on in.

Wed, 08 Dec 2004

Catching Up

So I’m crap and haven’t posted in a month. The troubling thing about such a scenario is that things occur to you that you’d like to quickly post about, but a quick post seems altogether inadequate after such a dry spell. Do a proper entry first, I tell myself. And, since the proper entry seems too tall an order, nothing happens.

Well, enough. There’s sure been progress on the CMS front, which calls for a big list to catch up:

  1. The Woking site has moved to a new server. Still at Rackspace, but rather beefier and now behind a firewall. While migrating the site from my server to theirs, I thought I’d run what I knew would be a slow MySQL query on each — the database itself had by this point been copied across. The log table contains a couple of million records, and conducting a simple query on an unindexed column took 1 minute, 30.56 seconds to come up with the answer on my server, and a measly 8.21 seconds on theirs. Wow.

  2. We’re now using the LGCL rather than the APLAWS category list. The transition took about a week to sort out, including the creation of a new category chooser that not only presents the collapsible hierarchy just like the site’s Site Map, but also includes an XMLHttpRequest-driven LiveSearch and a tab with suggestions based on the words in the resource title. As a result of the switch, we’re also now incorporating real GCL categories for each page rather than just using “Local Government” across the board. I also think that the new LGCL Category Chooser may well be The Best In The Land, even if I do say so myself.

  3. The main page of the administration area now also uses the LiveSearch to help publishers track down resources. This has proven immensely popular (not least with me, since I’m often phoned about problems with specific resources, and navigating the collapsible sitemap is just clumsy enough to be annoying).

  4. I’ve released a small bugfix update to XML::Generator::RSS10::gcl and XML::Generator::RSS10::lgcl.

  5. I think I may have finally cracked a problem that’s been bugging me for months and months. Despite the HTTP headers and meta tags in the page asking for data to be sent in ISO-8859-1, HTMLArea (or, rather, Internet Explorer in the appropriate mode) sometimes sends data in Windows-1252, UTF-8, or a mix of the two. This results in weird encoding conversion errors that manifest as Ās and the like scattered throughout a page. I’ve been honing a routine using Encode::Guess and, initially, GNU Recode, but recode would occasionally and mysteriously elide large portions of text. I then discovered, by chance, the //TRANSLIT option to iconv. Consequently, and after some kludging to get the pound sign to work, I’m now hopeful. We’ll see if (again) I’ve spoken too soon. A sketch of the routine follows this list.
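For the curious, the bones of the routine look something like this. It’s a minimal sketch (the real version has rather more special-casing, not least for that pound sign), and the function name is hypothetical:

    use strict;
    use warnings;
    use Encode qw(encode_utf8);
    use Encode::Guess;
    use Text::Iconv;

    # Guess which encoding the browser actually sent, then transliterate
    # down to ISO-8859-1, which is what the rest of the system expects.
    sub normalise_submission {
        my ($raw) = @_;

        # guess_encoding() returns a decoder object on success, or a
        # diagnostic string when the candidates are ambiguous.
        my $decoder = guess_encoding($raw, qw(iso-8859-1 windows-1252 utf8));
        return $raw unless ref $decoder;

        my $unicode = $decoder->decode($raw);

        # //TRANSLIT asks iconv to approximate anything with no exact
        # ISO-8859-1 equivalent, rather than failing outright.
        my $iconv = Text::Iconv->new('UTF-8', 'ISO-8859-1//TRANSLIT');
        return $iconv->convert(encode_utf8($unicode));
    }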

Tue, 09 Nov 2004

The New Article Seven Site

…I say “site”; really I mean “page”.

In a fit of insomnia last night, I decided that I was waiting too long for myself when it came to the redesign of article7.co.uk. I was hesitant over duplicating the old site structure, but indecisive over what the new structure should be.

How much to focus on the CMS? Should the CMS have a name first? How elaborate to make the case studies? Should I keep the graphic design angle?

But I’ve been unhappy with the old site for some time now. So, I decided to take the homepage layout I’d put together and simply fashion a new single-paged site out of it, merely linking to a selection of sites from the portfolio.

I may develop it from here, such that most of what you now see becomes a link to more information. But at least this represents my current direction rather better than the old site — for one thing, as promised, it’s entirely CSS-driven. Unfortunately, it breaks rather if you bump up the text size (IE won’t let you, as I’ve defined the starting size in pixels) — this represents an accessibility problem, of course, which is a touch hypocritical. I’ll work on addressing that soon enough.

But for now, at any rate, real work is calling again.

Things I Have Discovered in the Past Two Weeks

  1. If tempted to approach a speaker at a conference after their presentation, don’t. They won’t be interested in what you have to say, and you’ll feel like an idiot.

  2. In developing for the web, always assume that an event will happen often, even if you believe it’ll only happen rarely. This is especially true for systems that automatically email publishers in the event of the CMS discovering a broken link.

  3. I’m older than even I think: two puffs of an irregular cigarette at a party and it’s instant bedtime for Andrew.

  4. Blog comment spammers seem to lowercase the URL of the page they’re spamming. This has the nice side-effect of hiding most comment spams from patrons of this site, because all my category URLs officially have at least one capital letter in them.

  5. Crikey.

Wed, 03 Nov 2004

Bugger.

Like many, I’ve largely avoided writing about the US election, despite its prominence in my thoughts for the past few months. My sentiments now it’s over are largely echoes of Todd’s and Garrett’s, so I won’t dwell on them too much.

It strikes me, however, that today John Kerry has displayed characteristic intelligence, dignity and genuine concern for the welfare of his country, demonstrating just how good a president he would have made. And that four years ago, Bush’s belligerence in the face of the popular vote was ample foreshadowing of the qualities the rest of the world (if not, evidently, America itself) finds so loathsome. In defeat, Kerry has proven himself to be more presidential than Bush could ever hope to be. That, I suspect, is the great irony of this election.

I’m heartened by four things:

  1. I’m not a citizen of the USA, so I probably should care more about UK politics than US politics.

  2. Bush can’t have a third term.

  3. The other great irony of this election.

  4. Ladies and gentlemen, it looks like perhaps we have a future African-American president.

Battlestar Galactica

So I have a new favourite programme. Sky repeated the pilot episodes shortly before beginning the new series, and although I’d heard of these on Slashdot, it took a sincere and enthusiastic recommendation from a colleague to get me to watch them.

I was glad I did — whilst I have faint nostalgia for the old series, I knew it was plainly a cheesy affair.

As a child growing up in Hitchin, we were firmly in Anglia TV country. The family of my best friend Chris, who lived literally down the road, had a pair of TV aerials connected to a switcher box. One aerial picked up Anglia, the other Thames. This was a time when such things meant something: Anglia would resolutely show repeats of The Dukes of Hazzard, whilst Thames forged ahead with the geeky wonder of Battlestar Galactica, thereby rendering the latter unavailable to the majority of the fen-folk. This was only one reason why Chris and I were friends, of course, but it would certainly have featured prominently on the list.

The new Galactica pitches itself as a thoroughly grown-up interpretation of the original series’ premise, and so far has delivered admirably on this. The apocalyptic overtones are handled carefully and with genuine force, the characters are, by-and-large, flawed and believable, and the drama is predicated on human concerns. I’m also fond of the Dead Can Dance-like timbre of the opening credits. And, astonishingly, we’re getting it ahead of America.

On the downside, the “frakking” affectation is simply irritating.

Incidentally, the recommendation I received emphasised the attraction of Cylon Number Six. For what it’s worth, although I can appreciate she’s certainly striking, Crewman Specialist Cally floats my boat rather more substantially. I mention this only in the interests of full disclosure, you understand.

Sun, 24 Oct 2004

Uses for Google, #36 in a series of twelvety

I, for one, welcome our new meme-observation overlords.

Via Ben Hammersley

Fri, 22 Oct 2004

Server Woe

I don’t want to get excessively melodramatic about this, but yesterday really, badly, sucked.

About three days ago, my server suffered a kernel panic and crashed. One phone call to the managed hosting company, Rackspace, and it was promptly brought back up. At the time, it seemed like an isolated incident, and there wasn’t anything to suggest that there would be further problems down the road. We kept an eye on it anyway.

Then, the weird errors started. Bits of software that had worked unmolested for years began sporadically to refuse to compile. And another crash.

Rackspace had already replaced the memory and the processor on the machine just in case. But it was becoming pretty clear that the filesystem, or the hard disk, or both, was breaking apart before our eyes.

As the day progressed, this became more and more horrible to watch. Trying to retrieve files from the server became like fishing bits of imprudently-dunked biscuit from a cup of tea. We had to migrate to a new server, and sharpish.

Now, the server has a certain amount of custom configuration on it, and generally speaking, setting it all up in the first place was something of a headache. There’s also about half of CPAN on there. And I was growing worried that we’d never get the MySQL data out of there intact.

The server does have a tape backup, but it’s a fairly rudimentary setup whereby the drive is backed up nightly to the same tape. No rotation. So, if the filesystem was screwed, the backup might have been as well.

As it turned out, a member of the Rackspace support team in San Antonio named Rich is an absolute miracle worker. The drive was ripped out of the old server, and added to a mount point on the new. Rich then set about recreating everything for me straight from the old drive. Somehow, by midnight, he’d got it working. I’d gone from despair to jubilation in about three hours and a bunch of phone calls to the States. There are times when the relentless optimism of our American cousins is just what one needs.

Alan in the UK also deserves his dues for sticking with the support ticket and being generally very helpful. But Rich is my new hero.

I’m aware, of course, that I’m in danger of making this a giant advert for Rackspace. But I don’t care, really. I’ve had frankly appalling service from other hosting providers before, and consequently I’m now firmly of the opinion that you gets what you pays for. If singleminded attention to your case in an emergency floats your boat, I know who to recommend.

Of course, there are lessons from this. We’re investigating investment in a resilient, multi-server, load-balanced affair for Woking now; and there are better backup solutions than tape that really weren’t around when I started all this. But for now, I’m just counting my blessings that we’re still here at all.

Sat, 09 Oct 2004

Most Liberal

President Bush, in the second national debate:

First, the National Journal named Senator Kennedy the most liberal senator of all. And that’s saying something in that bunch. You might say that took a lot of hard work.

He meant Senator Kerry, of course, but the reason I quote this is simply to ask: how the hell is this a bad thing? Bush’s intent was plainly that his audience should recoil in horror at such a slur, and that all normal Americans should do the same. Liberal? Why, sir, you might as well have called me a communist.

Wed, 06 Oct 2004

TextMate vs SubEthaEdit

The widely-anticipated new text editor for Mac OS X, TextMate, has now been launched.

I’ve been using SubEthaEdit as my main editor for some months now, and so it’s hard to distinguish what’s a genuine point of comparison from what’s simply a case of having to unlearn old habits. But there are a few small features in SubEthaEdit that I know I’d sorely miss if I were to switch to TextMate wholesale.

  1. The ability to tint the background colour for text that I’ve changed. It’s so useful to know what I’ve touched in a file when looking for bugs, particularly when edits in a single session are dotted around a variety of different subroutines.

  2. SubEthaEdit creates a function popup at the top of the window that you can use to jump to a given block. In Perl, this is a list of named subroutines. It’s great to open a file and be able to jump straight to the subroutine you’re looking for.

That’s not to say that there isn’t some great stuff evident in TextMate right from half an hour’s playing with it. In particular:

  1. Being able to “fold” blocks down to their opening definition to get them out of the way. This seemed a little buggy when I tried it in one of my Perl modules, but that might simply be because there isn’t a Perl mode (or “bundle”) yet.

  2. Columnar selections.

  3. Being able to conduct a regexp-based search-and-replace across a number of files at once.

  4. Macros and snippets.

  5. Auto-completion of brackets and so forth.

Similarly, there are niggles with each.

Of course, TextMate has only just been released, so it’s hardly fair to criticise it too strongly on the basis of a couple of irritants that may well get fixed. I’m certainly looking forward to seeing any Perl bundle that gets released, as without this, I really can’t form a comprehensive judgement for my own needs.

But, ultimately, although TextMate has delivered on its hype impressively, I suspect that the things I like most about SubEthaEdit may become deal-breakers for me.

Tue, 05 Oct 2004

Seeing McEnroe

I’d like to say it was the fulfillment of a lifelong dream, but in truth I only got into tennis in 2000 thanks to the missus. My mum used to be an avid watcher of Wimbledon, which I resented horribly as a kid, as it meant I’d miss all the children’s TV for a fortnight. In truth, I probably just needed the rules explaining to me.

Anyway, on Sunday, Shelley and I got to see John McEnroe play, at the Superset tennis event in Wembley Arena. The match was frankly superb — perhaps better than the matches we saw in “People’s Sunday” at Wimbledon this year (especially given that we didn’t get to see the Henman match). At 45, McEnroe is still capable of some stunning tennis.

The Superset format is this: over the course of one day, eight contestants are whittled down to one winner in a series of single-set matches. The players can challenge line calls, which are checked instantly using HawkEye. The set can continue up to ten games, after which the regular tie-break is invoked. For my money, ten games is a touch too long — but that’s a quibble. It’s very hard to get to see real, competitive tennis played live, so any extra format that achieves this can only be a good thing.

Tue, 28 Sep 2004

The Uphill Battle

It seems that web standards evangelism still has some way to go. From a letter to CMS Watch’s Ask Tony:

I work for a company who decided to outsource their website to an external web design company. We had many problems from the start dealing with the designers since they were dead against anything flashy on the site and kept on insisting it had to be full-bore Bobby standard and compliant with certain web standards.

The correspondent provides this as an example of “bad consultancies in the UK”, and goes on to complain that the implementation of a CMS where each page is generated dynamically was directly responsible for a catastrophe in their Google ranking. Having said that, it does seem that their CMS implemented URLs based on a query string (methinks I see the hand of PHP here), which may actually be a valid point.

However:

  1. Web standards != “not much more than text on a page with links”
  2. Generate-on-demand CMS != Google-unfriendly URLs

The transformation of woking.gov.uk into a CSS-driven bonanza begins very soon now.

Thu, 16 Sep 2004

Creative Conflicts

One of the biggest advantages of generally only working for a couple of clients (apart from the fact that I’m so hopeless at the Hard Sell that if it weren’t for Woking and FISITA I’d have gone hungry years ago) is that the long-term trust that’s been built tends to mean one gets one’s way, or at least knows the grand fromage’s tastes well enough to design accordingly.

For the rare occasions that new design jobs come along, D. Keith Robinson’s latest post is well worth remembering, and strikingly familiar. I remember when I used to subscribe to Internet Magazine and, every month, all the featured sites would be Flash monsters, with 45° angles everywhere, and wee pixellated isometric figures. This was real design. This was the future. I decided to save my angst and not renew the subscription.

Just recently, I was asked to turn the pointer into a crosshair on a forthcoming site — clients still feel that the way to make their site stand out is to fill it with gimmicks. I used to get asked all the time if it was possible to automatically obtain a list of the email addresses of every individual person who visited the site. There’s a palpable difference in a lot of people’s minds between their own browsing habits and preferences and those of anyone else viewing their site. I hate spam myself, but would you mind sending this email out to everyone in the country? It isn’t advertising h0t m1jit sExxx, so I’m sure no-one will mind, and I’ll get lots of yummy punters.

Fri, 10 Sep 2004

XML::Generator::RSS10::egms

I’ve finally added my first module to the CPAN.

XML::Generator::RSS10::egms helps to incorporate the mandatory eGMS category metadata into an RSS feed. It’s in use right now for Woking’s Latest News feed.

Of course, it’s only the slightest bit useful if you’re producing an RSS feed for a UK government site (and only then if you’re doing that in Perl), so its audience is probably fairly minimal. Nonetheless, the LAWS Syndication Guidelines are hairy enough that someone might find this really useful. It can’t hurt, at any rate.

It’s accompanied by XML::Generator::RSS10::gcl and XML::Generator::RSS10::lgcl, which are rather more trivial, and should install automatically if you use CPAN.pm for the egms module.

Wed, 08 Sep 2004

Callow and the Tardis

So, it looks like he’ll be there as Dickens. I swear, though, I sincerely thought Simon Callow would have made the perfect Doctor.

Mon, 30 Aug 2004

Woking’s CMS and the Content Management Landscape

David Wheeler’s new article for perl.com is a superb explanation of how Bricolage fits into the CMS landscape. I do wish more CMS vendors would summarise their products in that way: this is what we mean by workflow, this is how we expect you to author content, this is how your site gets delivered to the visitor. Often, wading through the swamp of buzzwords on CMS vendors’ sites leaves you feeling more confused than when you began. Sometimes, the executive summary summarises nothing.

When I started developing the new CMS for Woking Borough Council, I was aware of Bricolage, and somewhat concerned that it might have been better to customise it instead of writing a new system more-or-less from scratch. I’m now reassured that, whilst Bricolage is almost certainly a better piece of programming than my efforts, it occupies a slightly different position from the one we needed.

The key element here was that the existing Woking site contained a fair sampling of CGI applications that we wanted to preserve in the new site — things like the Planning Applications search, The Woking Forum, the events guide and the Council meetings system. These systems needed, of course, to present their content dynamically, but in a way nonetheless integrated into the CMS navigation and presentation system. Consequently, it seemed clear that the presentation aspect of the site itself should be dynamic, rather than generated as static pages by “publishing” content from the CMS administration area.

This means that there are four principal resource types within the Woking CMS:

  1. Text resources
  2. PDF resources
  3. Database resources
  4. External database resources

The first of these is the traditional web page — content is authored visually, using a highly customised version of HTMLArea, which the presentation system wraps into the site templates and navigation in a predictable way. PDF resources are PDF documents uploaded to the site and integrated as resources within the site navigation. Database resources are the CGI scripts I mentioned above, converted to output their content to the presentation system in the same way that the static content of a text resource is passed through. Finally, external database resources are external systems whose output is proxied over HTTP and transformed in order to be passed through the presentation system as well. This means that, with a suitable XSLT filter, more-or-less anything can be integrated into the site in a way that’s invisible to the end user.
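A minimal sketch of that last mechanism, using XML::LibXSLT, with hypothetical parameters (in reality, the URL and the filter are taken from the resource’s configuration):

    use strict;
    use warnings;
    use LWP::UserAgent;
    use XML::LibXML;
    use XML::LibXSLT;

    # Fetch an external system’s output over HTTP and run it through a
    # per-resource XSLT filter, yielding content the presentation system
    # can wrap in the site templates like any other resource.
    sub fetch_external_resource {
        my ($url, $xslt_file) = @_;

        my $ua       = LWP::UserAgent->new(timeout => 10);
        my $response = $ua->get($url);
        die 'proxy fetch failed: ' . $response->status_line
            unless $response->is_success;

        my $doc = XML::LibXML->new->parse_html_string($response->content);

        my $stylesheet = XML::LibXSLT->new->parse_stylesheet_file($xslt_file);
        my $result     = $stylesheet->transform($doc);

        return $stylesheet->output_as_bytes($result);
    }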

When authoring content for the site, each resource is assigned content (or a content source) according to its type, along with metadata, a “nodename” and a single location within the site hierarchy. URLs are then constructed by working from the nodename upwards through the hierarchy. So, the page on “the Woking Martian” has the nodename martian and is an immediate descendant of the Places to Visit resource, which has the nodename attractions and is in turn an immediate descendant of the Leisure and Tourism resource, which has the nodename leisure and is an immediate descendant of the homepage. Thus, the Martian’s canonical URL is http://www.woking.gov.uk/leisure/attractions/martian. A URL history is kept, so that if a resource is moved around the site, its old URLs will continue to work.
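The URL construction itself is a simple walk up the tree. A sketch, against a hypothetical version of the schema (each resource records its nodename and its parent’s ID, with the homepage at the root):

    use strict;
    use warnings;

    sub canonical_url {
        my ($dbh, $id) = @_;

        # Collect nodenames from the resource up to the homepage.
        my @nodenames;
        while (defined $id) {
            my ($nodename, $parent_id) = $dbh->selectrow_array(
                'SELECT nodename, parent_id FROM resources WHERE id = ?',
                undef, $id,
            );
            unshift @nodenames, $nodename
                if defined $nodename && length $nodename;
            $id = $parent_id;
        }

        return '/' . join '/', @nodenames;
    }

    # canonical_url($dbh, $martian_id) gives '/leisure/attractions/martian'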

The administration area comprises a database of editing options that can apply to the site as a whole, to individual resources, or to any resource matching certain criteria. Options can be restricted to site administrators or to publishers according to their relationship to an individual resource in question. For example, the option to edit the content of a text resource is restricted to site administrators and to the owner of the resource — the owner being the publisher who originally authored the resource. This means that we can customise editing options in a very precise way — there’s an editing option, for example, to add a new PDF to the Forward Plan of Key Decisions that’s only available to site administrators and the owner of that resource.
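The availability test itself is straightforward. A sketch, with hypothetical structures (the real options table carries rather more criteria than this):

    use strict;
    use warnings;

    # Decide whether an editing option should be offered to a given
    # user for a given resource.
    sub option_available {
        my ($option, $user, $resource) = @_;

        # Some options are open to any site administrator.
        return 1 if $option->{for_admins} && $user->{is_admin};

        # Owner-restricted options only appear on the owner’s resources.
        return 1 if $option->{for_owner}
                 && $resource
                 && $user->{id} == $resource->{owner_id};

        return 0;
    }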

As I’ve mentioned in a previous entry, workflow is determined by the category metadata assigned to a resource. The relevant Service Head is alerted by email whenever a new resource is posted in one of their categories (or if certain kinds of changes are made to a resource already published in that category), and they’re invited — well, actually, nagged — to either approve the resource for publication to the live site or defer it. If they approve, a further message is sent to the site editor (one of the site administrators) inviting her to do the same. If she approves, the resource is made available to the public.

Aside from Dublin Core and eGMS metadata, a resource can additionally be set to be hidden from navigation altogether, assigned a colour scheme, set to persist down the hierarchy, or set to masquerade as another resource dependent on context.

As my reader will know, this is all built on mod_perl, so those CGI scripts have become Apache::Registry scripts, and the whole thing is held together by customising parts of the Apache lifecycle.

I guess in landscape terms, that positions it somewhere over there, with the camels — no, not the adults or the baby ones, but the teenagers; yes, that’s it, the “individual” one with all the makeup.

Fri, 27 Aug 2004

The Metadata Quest

Well, I can add another entry to the catalogue of trivial victories that this blog has become. As I’ve previously noted, each month, SiteMorse publish a report examining the performance of all local authority web sites across the UK, providing the obligatory competitive edge by ranking the sites against each other for overall performance against the various criteria:

  1. Accessibility (WAI-A)
  2. eGMS metadata compliance, and general metadata provision
  3. Number of errors and warnings, including HTML validation problems, server errors and broken links
  4. Acceptable download times, for 56k dialup, 512k ADSL and 1m broadband connections

The Woking site has achieved 100% accessibility compliance in the report for a few months now, although next month’s report will include accessibility scores against WAI level AA — so we’ll have to see if we’re still OK there. But we’ve persistently been in the doldrums with respect to our metadata compliance, despite a number of attempts to correct the situation.

After last month’s report, we made some enquiries to see just where we were going wrong. The main problem was that my CMS was only providing “description” metadata where the relevant (optional) field contained data. So, we changed this to duplicate the title of the page in the description metadata in the absence of anything more specific.
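The fix itself was next to nothing, code-wise. A sketch, with hypothetical field names:

    # Fall back to the resource title when the optional description
    # field is empty.
    sub description_for {
        my ($resource) = @_;
        my $description = $resource->{description};
        return $description if defined $description && length $description;
        return $resource->{title};
    }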

Voila — 100% eGMS metadata compliance, and an “OK” for general-purpose metadata provision. We’re now one of only 14 local government sites to have achieved that, and our overall ranking has improved from 148 to 85 (out of 461).

Of course, whilst we’re now obeying the letter of the law, we bend the rules to get the grades, as do many of the other “metadata compliant” sites. For starters, duplicating the title in the absence of a specified description, whilst better than nothing, isn’t ideal. I did experiment with providing an automated summary instead, but the results weren’t good.

Furthermore, one of the principal tenets of the eGMS is that each resource should include metadata indicating a category chosen from the Government Category List (GCL). A site may additionally choose a category from a separate taxonomy, as long as the taxonomy is indicated.

The Woking CMS insists on a category taken from the list designed for local authority use as part of the APLAWS project. This list (as far as I can tell) doesn’t have an explicit mapping to the main GCL, so it’s not possible to derive a GCL category automatically from the one(s) chosen for a resource from the APLAWS list.

Other problems with the original APLAWS list have resulted in the creation of a somewhat expanded category list for local government, known as the LGCL, through the LAWS project. The LGCL does have a mapping to the main GCL, so if you choose a category from the LGCL, you can easily automatically derive a category from the GCL and display that.

But upgrading to the LGCL is a problem for us, because we’ve taken the unusual step of piggybacking part of the CMS workflow onto the APLAWS list. The service heads in the Council have each signed up to be responsible for a number of categories from the APLAWS list. When a publisher makes an amendment to a resource, the relevant service head is contacted to approve that change for publication to the live site on the basis of the primary APLAWS category that was chosen.

The system works really well, but of course upgrading to the LGCL would mean that we have to go through the new list and reassign service heads to it. That would be mildly burdensome, but rumours are now circulating that a further revision is in the offing, based on an integration of LAWS with other alternative taxonomies, which would mean that we’d have to do it all again in the near future.

I’m actually inclined to believe that the near future wouldn’t really be so near, and I suspect that we’ll do the work to change over to the LGCL before very long anyway. In the meantime, we do what most other “metadata compliant” sites do, and that’s publish a metadata category from the list we’re using, and blindly stick on the “local government” category from the GCL for all pages. Consequently, even though we’re 100% compliant, there’s still room for improvement.

Thu, 19 Aug 2004

TextMate

From LoudThinking:

TextMate is the answer to all my editing prayers. There I’ve said it. TextMate has single-handedly rendered TextEdit and Xcode obsolete, contained Subethaedit to strictly collaborative tasks, and stopped me from feeling sorry about not liking BBEdit. Yes, it’s that good.

Oooh!

Thu, 12 Aug 2004

Theme Song

This is lovely and entirely appropriate to the past few weeks of my life.

(No, the Markdown thing didn’t take that long — only a day, in fact. I’ll write about the other stuff when there’s something to see on HM Internet…)

Tue, 10 Aug 2004

XHTML-to-Markdown XSLT

I’m making available this XSLT stylesheet to transform XHTML into Markdown-formatted text, under a Creative Commons license.

If you find any problems with it, or make any improvements, do please let me know.

Fri, 06 Aug 2004

Andrew’s Latest Cycling Calamity

The trouble with only having learned to ride my bike a couple of years ago is that I missed out on that period as a kid where you learn how to fall off your bike properly; where a spectacular catapult over the handlebars leaves you with nothing more than a grazed knee.

As part of our get-buff-before-the-wedding fitness regime, the missus and I are back in the habit of biking round the Thames path between here and the flood barrier. For most of the way, the path is divided between a footpath and a cycle path, but by and large, pedestrians have an annoying habit of lurking on the cycle path and leaving one with only a couple of inches to get past.

Thanks to a combination of gross inexperience and the physical dexterity of a chair, I’m shockingly bad at being able to turn corners without an ocean of room on either side. So yesterday — it had to happen sometime — squeezing past a mother and son, I found myself needing to make, oh, about a 30° turn to avoid a low-level wall, and encountered a rapid sense of panic as I realised my speed was marginally too great, the gap marginally too small, and my skills far too enfeebled, to accomplish the task.

They teach you, as a child, not to brake hard with the front brake, as the bike then has a tendency to stop dead, and the momentum to tip you forward and over the top. This was the lesson it took me until thirty years of age to learn, as I landed with a good, solid, full-bodied slap onto the unforgiving pavement, having abruptly dismounted the bike above the handlebars.

It’s a testimony to Apple’s engineering prowess that the force of a 14 st. 10 lbs gentleman landing on the iPod fazed it not one bit. A cheerful ditty by The Beloved (entitled, would you believe, Up, Up and Away) continued to play gently as I tried to recover my wind. Less can be said for my ribcage, which is as sore as a spanked schoolboy this morning, despite a conspicuous lack of bruising.

The missus enquired whether it was the bemused pedestrians’ fault, offering to “kick their arses”. This kind of loyalty in the face of evident incompetence is a beautiful, beautiful thing, although I suspect the prospect of a good arse-kicking mitigated any skepticism she may have had regarding their culpability for the whole sorry affair. Nonetheless, common-sense won, and the pedestrians continued on their way.

iMix

Apple have today redesigned the layout of the iMix opening page in the iTunes Music Store, dividing the screen into three columns: Top Rated, Most Recent and Featured.

It’s a pissy little thrill, but both my iMixes are right there on the front page in the Featured column.

Thu, 05 Aug 2004

Groklaw’s New Lexicon

I’m not sure what’s happened to this story from GrokLaw, but as viewed this morning in my copy of NetNewsWire some fantastic neologisms seem to have appeared — notably legalthreat and Linuxenemy.

It’s so tempting to register linuxenemy.com as a gag domain and point it at SCO, it really is.

Wed, 04 Aug 2004

More on “Text Only” Versions

The Register is carrying a story lifted from IT Analysis that’s essentially a plug for UsableNet’s LIFT Transcoder, which seems to act in a way not unlike the BBC’s old BETSIE filter.

The story claims that, whilst it’s only a “band-aid fix” for sites that are otherwise inaccessible, a text-only site is nonetheless an “ideal solution for a small group of users”. The author seems to feel that adding a text-only layer (using automated tools such as the LIFT transcoder) to any site, even if it’s already accessible, is a positive thing.

I guess this is another side to the debate, and I’d be genuinely interested to know the extent to which others feel the addition of a text-only layer to an otherwise WAI-kosher site is valuable. Whilst Matt and I ended up agreeing to disagree on whether the use of terms like “segregation” was appropriate, our brief email (and blog-comment) conversation made me realise that the real focus of his anger was the way some sites pay lip-service to accessibility by tacking on a badly-maintained and begrudging text-only version. It’s much easier to see someone’s point when you no longer feel they’re attacking you.

Whilst it’s not strictly a text-only version, I’ve been giving the whole issue of the value of our “Easy Access” version more thought, in lieu of being able to actually knuckle down on the CSS-only redesign of the Woking site.

I’ve come to the conclusion that the Easy Access version is worth keeping, because:

  1. It offers a genuinely useful alternative view of the site content that can strive for truly comprehensive, AAA-level accessibility, in ways beyond the design changes afforded by a simple CSS-switcher.

  2. It offers a better screen-to-printout relationship for a “printer-friendly” view of the page.

We may have to re-brand it, though, to emphasise the idea that the main site will have become (at least) AA-level accessible itself. And then, I hope, we can go for RNIB accreditation on the main site. When we first launched the Easy Access site, the RNIB refused to assess it independently of the main site, claiming that we couldn’t get the award unless the main site passed muster. We felt aggrieved by this, because it represented a profound change of policy. So, although we thought that the main site would probably pass anyway, we never pursued it further. If the Easy Access version becomes simply a “graphics-lite” version, reopening that wound may become a realistic prospect.

Fri, 16 Jul 2004

Oh, the Shame

Good Lord. It turns out that the object of my earliest “celebrity” crush was an adolescent Patsy Kensit.

In mitigation, I was nine at the time.

Fri, 09 Jul 2004

Maybe the Access Version becomes the Print Version?

I’m itching to start work on the CSS version of the Woking site. The list of things to do this month is overwhelming, however, and the CSS version frankly isn’t a priority. It’s just something I really want to do.

That isn’t to say that the site doesn’t already use CSS extensively, however. It does, in the way that most sites from the past few years do: to adjust fonts, colours, background images and so on. What the CSS doesn’t do is the layout and presentational chrome, and I want it to.

As I’ve already written, my feeling is that this will give the regular version of the site an accessibility boost, to the extent that the “Easy Access” version I developed might struggle to find its place in the world. The discussion (some of it over email) ensuing from my previous post on the topic of access versions reassured me that what we’ve done isn’t so much segregation as simply offering layout alternatives — a “design-lite” version, if you will.

Well, Cameron Adams has written on the topic of print versions of pages. His point is that, whilst a CSS-driven presentation of a site can offer a wholly different version simply and transparently through the use of a print stylesheet, people don’t expect (or like) the printed page to look wholly different from what they see on screen — it’s disconcerting.

I really think he’s got a point, and it pleases me no end. The link on each page to its “Easy Access” version is already labelled “fewer graphics, and better for printing” — the notion that the Access version doubles-up as a print version is already very much there. The CSS-based redesign of the site might simply re-prioritise the Easy Access version as a print version (from which you can continue to browse the site, if you prefer).

Itching, I tells yer, itching.

Mon, 28 Jun 2004

Access Versions

In The accessibility of segregation, Mike Davies draws a parallel between having a separate accessible version of a site and Apartheid. Bestkungfu takes up the baton, comparing accessible versions of sites to old segregation legislation in the US.

The argument is that with current web technology, making one’s main site accessible is an entirely feasible goal, and that an accessible or text-only version of an otherwise inaccessible site says “this site isn’t for the likes of you; you should go round the corner instead”.

I have a couple of problems with this line of reasoning. The first, naturally, is that Woking’s “Easy Access” version comes dangerously close to being Part Of The Problem (and nobody likes to be told that their best efforts are a bit like racism), and the second is that making the comparison totally belittles the damage and hurt that Apartheid and other segregation laws caused. The purpose of segregation laws was to keep blacks out of white-only areas. The intent (whether misguided or not) of creating a separate accessible version of a site is to overcome obstacles that are already there. Comparing the two is vastly over-aggressive and inflammatory.

The physically-disabled generally have separate toilets in public buildings. This isn’t segregation.

The physically-disabled often have separate, allocated parking spaces in car parks. This isn’t segregation.

There are sometimes push-button exits at wheelchair-height for otherwise difficult-to-manage doors. This isn’t segregation.

The natural tendency when faced with the notion that one’s site might not be fully-accessible is to try to create new features to address this problem. It just isn’t a case of “Good — keep those blind bastards out of our good, pure, 20-20 kingdom” being followed by “You mean we have to accommodate them? Well, I suppose they can have their own site over there, as far away as possible from the rest of us”. The fact that, with advanced CSS, it’s not necessary to treat accessible site design as being like accessible loo-design does not make this tendency bigoted. It means, more than anything, that advanced CSS isn’t yet in common enough use.

A fair criticism of accessible-versioning, however, is that the accessible version of a site is almost always hidden away, that switching between the two versions is almost impossible, and that content for the accessible version is always an afterthought to the main version. The Woking “Easy Access” version is very deliberately not like this. There’s a clear link on each version of a page to the alternate version, and the versioning is entirely handled as a function of the CMS’s presentation layer. Whenever a new page is added to the site, the “Easy Access” version goes right along with it.

The main site is always (at least) WAI-A compatible. The “Easy Access” version is always (at least) WAI-AA compatible. The main site is actually pretty accessible.

But it was designed using lots of presentational HTML and plenty of images. It didn’t print well. So, the function was developed for the old CMS (and, naturally, I carried it through to the new one) to allow “frameworks” — an opportunity for the CMS to completely re-version the site using different templates and content filters as part of the presentation layer. There are four frameworks in use for the site at the moment — two of them are internal, and one of them is the “Easy Access” version. Its intent is to make an already pretty accessible site even more accessible, whilst presenting a stripped-down, “graphics-lite” version for anybody who wants it. The stripped-down version is also good for printing.

Now, much of this will be comprehensively addressed by the “Web Standards” version of the site I’m hoping to be able to start on soon. This will remove much of the presentational markup and “chrome” from the HTML, and separate (separate! Gosh!) print stylesheets will address the problems with printing from the main site. We could even present a stripped-down version by incorporating a stylesheet-switcher widget.

But the content-filtering offered by the “Easy Access” framework will still probably be useful, in that as time goes on we’ll be adding more and more HTML from external sources into the site. The e-Forms tool in which Woking have invested produces even more laboriously-presentational markup than our own site. Being able to run this through some very targeted XSLT will, I’m sure, reap some accessibility benefits.

We’ll see — perhaps it would be best to run the e-Forms through the XSLT stylesheet anyway, and dispense with versioning altogether. All I know is: the “Easy Access” version has been a great success, and has proved useful to many people other than the blind or partially-sighted. None of them feel discriminated against by its presence.

Wed, 23 Jun 2004

A Sudden Rush of Comfort

Unexpectedly, my new Mirra chair arrived this morning, replacing the hard dining room chair I’d had to make do with since my previous office chair decided it wanted to start reclining enough for me to be unable to see the desk surface any more.

The chair is now pretty much the most expensive item of furniture in the studio. I didn’t get that sudden “now that’s comfortable” rush I’d hoped for, but I guess I didn’t really expect to. The more switches and levers there are to adjust the thing, the more work one has to put in.

Such a shame I’m out at a meeting all afternoon and don’t get to try it out properly.

More on the SVG::TT::Graph Problem

Not long ago, I wrote about some problems I’d encountered with integrating SVG::TT::Graph into the log analysis system within my CMS. I’d figured out that caching the generated SVG files on disk and displaying those avoided the problem of the graph never actually making it through to the client.

Well, it turns out that this wasn’t the whole story. It seems that running the module in a mod_perl environment can cause a graph to be created empty if the Apache child process running the request has already created a graph at some point in its lifetime. So, whenever I restarted the server, the graph would work — and of course it would sometimes work without restarting the server, depending on which Apache child process got picked to handle the request.

Evidently something’s setting a variable that isn’t being destroyed properly between requests in mod_perl. Try as I might, I can’t find out whether it’s something in my code, or in SVG::TT::Graph itself.
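For illustration, this is the classic shape of the trap (a hypothetical reconstruction, not the actual culprit, which I still haven’t found):

    package My::Graphs;
    use strict;
    use warnings;
    use SVG::TT::Graph::Line;

    # Under CGI this variable is born and dies with each request; under
    # mod_perl it survives for the lifetime of the Apache child, so the
    # second request handled by the same child gets a stale, already-used
    # graph object.
    my $graph;

    sub render {
        my (@data) = @_;

        $graph ||= SVG::TT::Graph::Line->new({
            fields => [ 'Mon', 'Tue', 'Wed' ],
        });

        $graph->add_data({ data => \@data, title => 'Page views' });
        return $graph->burn;
    }

    1;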

So, I’ve edited my code to run the graph creation as a regular CGI script instead of a mod_perl registry script, and it now genuinely works every time. The additional overhead of running it through CGI rather than mod_perl is troubling, but hardly a bottleneck in terms of speed — that honour goes to the volume of data in the log table of the database.

Hopefully, I’ll be able to find the problem with running it through mod_perl and correct it, but at least for the time being it actually works.

Tue, 22 Jun 2004

In Memoriam

It was with some shock last week that I discovered my namesake and recent friend Andrew Green, the “Spectre Inspector”, had died aged 76.

Andrew was probably the UK’s foremost ghost hunter, having written extensively on the subject for many years. I first encountered him as a reviewer in the Fortean Times — you can imagine the surprise when I saw my own name on the page!

When my thesis became part of Fortean Studies 7, I was aware that it would have been wise to adopt some kind of pseudonym to avoid confusion with him, but couldn’t settle on anything I liked. Andrew then became aware of me through a review of the journal which assumed he’d written my piece, and set about finding me.

For a short time last year, we exchanged letters, and about this time last year managed to meet up at his home for Sunday lunch. We discussed his biography, which was then being written by an old colleague of his from his days as a journalist. We also spoke at length about our shared view of the phenomena we both studied. Andrew was very amused that two people with the same name should be working in the same, obscure field, and both living in South-East England. We promised to meet up again.

To my very great regret, that never actually happened. He was kind enough to invite me along to his next meeting with his biographer, but I was on holiday when he left the ansaphone message about it, and arrived back too late to be able to make the necessary arrangements. At about this time, the Woking CMS project really started to kick in, and our correspondence trailed off as I became busier and busier.

I only learned of his passing when we boarded a train to meet some friends in town. There were a couple of newspapers left abandoned on a seat, and the chap in front of us cherry-picked the Metro from the pile, leaving Shelley with a supplement from that day’s Times. Neither of us ever read the Times, and neither of us ever read the obituaries. But the one time we found ourselves doing so was the one time we’d ever notice the obituary of an erstwhile correspondent on the subject of post-mortem survival. Somehow, I find that entirely fitting.

Tue, 08 Jun 2004

The “Myth” of Accessibility

In WCAG and the Myth of Accessibility, Kevin Leitch argues that accessibility based only on access for users with visual impairments isn’t really accessibility at all. Leitch notes that, in the UK, 2.5m people are known to suffer from a disability affecting their seeing or hearing, compared to 3.9m people with a learning or understanding disability. He then observes that the WAI guidelines focus on design strategies to help access for people with seeing, hearing and (to a lesser extent) manual dexterity disabilities, but have little to say on the subject of learning and understanding disabilities.

All of this, of course, is true. The WAI guidelines don’t address the needs of people with learning difficulties in any direct way. Now, the argument that this makes the stated goals of the WAI a myth is clearly specious, and I really don’t want to waste any time with it. But I feel that the article misses a really important distinction — one that the later comments posted to the article have picked up on: accessibility for people with visual impairments is mostly a design issue, whereas accessibility for people with learning disabilities is mostly an editorial one.

Leitch includes a link to a very helpful resource from Mencap outlining tips for accessibility for people with learning difficulties; the tips run along the lines of making text easy to read, keeping navigation and design consistent, using clear and simple language, and supporting text with illustration.

This is all good stuff, and contains some material highly relevant to designers when putting sites together. However, even those points are largely an expression of general usability principles, writ large. The emphasis seems to be that whatever’s important from a usability perspective (make text easy-to-read, make navigation and design consistent, employ sensible information architecture) is essential from the perspective of access for people with learning disabilities. It would seem that the poor design choices that irritate pretty much everyone make the experience genuinely hard for people with learning difficulties.

To the extent that any putative guidelines for accessibility for people with learning difficulties would articulate these usability issues, designers interested in accessibility are already talking in the right terms. Accessibility doesn’t end with the WAI guidelines, and no-one’s really arguing that it does.

Other points in the list, however, plainly concern the content of a page itself — the complexity and precision of the language used, and the employment of illustration. Genuine accessibility for people with learning difficulties involves keeping an eye on the punctuation used and the length of sentences. These are clearly editorial issues.

Sometimes a designer has a degree of control over the content of a page, and in the days of static sites, designers would spend a great deal of time refining nuggets of text to within an inch of their lives. But when you’re designing for a site that’s to be run through a Content Management System, you’re (mercifully, in my view) freed from this. The client writes their own content, and your role in accessibility is to make sure that the markup used for this content is kosher. In these circumstances, a designer simply can’t have jurisdiction over whether an author uses jargon in the same way as over whether they provide alternative text for embedded images.

Fri, 28 May 2004

Would you like some smugness with that, sir?

Whilst I can’t claim not to be delighted by this, what’s worrying is that some of the councils involved in the SiteMorse report actually managed to achieve a 0% accessibility rating. Given that the criteria were those of WAI-A compliance, achieving 0% is quite a remarkable feat. So much so, in fact, that I can’t help wondering just what happened.

The accessibility objectives for the Woking site are to ensure that the regular site is always 100% WAI-A compatible, and that the “Easy Access” version is always at least WAI-AA compatible. We’d like to achieve WAI-AAA compatibility for the Easy Access version, but full compatibility to that level is a much harder task, and so the objective simply reflects that we might well not completely succeed for all pages.

Clearly, my aim to translate the design of the regular site into a standards-based design is likely to raise the bar somewhat, as the regular site might then achieve WAI-AA compatibility itself. We’ll see how that project might impact the utility of the Easy Access version once I get the chance to devote some serious time to it.

Thu, 20 May 2004

Web Analytics in the CMS

CMS Watch has a recent feature advising vendors to build analytics into their CMS products.

It’s with no small amount of smugness that I realise my CMS for Woking is ahead of the game in this respect, as it contains a fairly substantial traffic analysis system built right into the administration area. Since the CMS revolves around the concept of a resource that may potentially have multiple URLs, it was essential that the CMS could log site traffic in the same terms.

Since the CMS delivers the site dynamically through the miracle of mod_perl, it was straightforward to write a custom log handler that would incorporate a selection of the properties of a delivered page as known to the CMS — the resource ID, the resource owner, the processing time taken to construct the page, and so forth. Having built the database with all this extra information, it was then a matter of writing the analysis code for the Administration Shell. Publishers can now see the site’s traffic analysed in exactly those terms, for any period that they define.
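The log handler itself is barely more than an INSERT. A minimal sketch in mod_perl 1 idiom, with hypothetical notes keys and table layout (the content handler stashes the CMS-level properties in $r->notes as it builds the page):

    package Woking::LogHandler;
    use strict;
    use Apache::Constants qw(OK DECLINED);
    use DBI;

    # Configured with: PerlLogHandler Woking::LogHandler
    sub handler {
        my $r = shift;
        return DECLINED unless $r->notes('resource_id');

        my $dbh = DBI->connect_cached(
            'dbi:mysql:cms', 'cms', 'secret', { RaiseError => 1 },
        );
        $dbh->do(
            'INSERT INTO log (resource_id, owner, ms_taken, uri, bytes)
             VALUES (?, ?, ?, ?, ?)',
            undef,
            $r->notes('resource_id'),
            $r->notes('resource_owner'),
            $r->notes('ms_taken'),
            $r->uri,
            $r->bytes_sent,
        );
        return OK;
    }

    1;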

The log analysis is made all the better by judicious use of Leo Lapworth’s excellent SVG::TT::Graph to create on-the-fly graphs of log data in SVG format. This wasn’t without problems, however. On IE/Windows, when Adobe’s SVG browser plugin doesn’t feel it’s received an entire SVG file, it presents a grey box and the browser continues to “fetch” the file ad infinitum (for example, if the SVG file is actually a 404). Here in the studio, about a quarter of the graphs would “hang” like that, although a browser refresh would pretty much always clear it. The graphs were always being produced and delivered properly, however, so I was at a loss as to why the browser would sometimes behave as though they weren’t. They’d render perfectly on Safari, for example, every time.

More worryingly, when the staff at the Council tried to look at the graphs, they’d always see the grey box. I couldn’t help thinking that whatever was playing up for me intermittently was being exacerbated for them by the fact that their web connections all run through a proxy server.

I found that when I saved out a graph as a static file, and got them to visit that, everything would be fine. Whatever was going wrong was down to the way I was generating the graphs dynamically and then outputting them directly through mod_perl. I’m guessing there’s an HTTP header that I wasn’t setting properly (although the MIME type was correct) — or perhaps the plugin needs to acquire the data using byteserving?

So, I changed the system to:

  1. Generate the graph for a specific set of parameters as normal

  2. Save it to a disk cache

  3. Issue an HTTP redirect to the cached file

This also meant that I could jump straight to stage (3) for a given set of parameters if the graph had already been created.
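
In sketch form, that flow is short enough to show. The cache location, parameter handling and the hashing of parameters into a cache key below are inventions for illustration rather than the production code:

    use strict;
    use Apache::Constants qw(REDIRECT);
    use Digest::MD5 qw(md5_hex);
    use SVG::TT::Graph::Line;

    sub deliver_graph {
        my ($r, $params, $fields, $data) = @_;
        # $params: hashref of the scalar parameters that define the
        # graph (resource ID, date range and so on); $fields/$data:
        # the labels and values already pulled from the traffic log.

        # Derive a cache key from the parameters, so a graph that
        # already exists on disk skips straight to stage 3.
        my $key  = md5_hex(join '&',
                           map { "$_=$params->{$_}" } sort keys %$params);
        my $file = "/var/cache/cms/graphs/$key.svg";

        unless (-e $file) {
            # Stage 1: generate the graph as normal.
            my $graph = SVG::TT::Graph::Line->new({
                width  => 600,
                height => 400,
                fields => $fields,
            });
            $graph->add_data({
                data  => $data,
                title => $params->{title},
            });

            # Stage 2: save it to the disk cache.
            open my $fh, '>', $file or die "Can't write $file: $!";
            print {$fh} $graph->burn;
            close $fh;
        }

        # Stage 3: redirect the browser (and, crucially, the Adobe
        # SVG plugin) to the cached static file.
        $r->headers_out->set(Location => "/graphs/$key.svg");
        return REDIRECT;
    }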

And it worked a charm — the graphs now load properly, for everyone, every time. So, like John Gruber, I thought I should document the issue for the sake of Google!

Wed, 19 May 2004

In defence of <small>

This Design by Fire piece has to be one of the best blog entries I’ve ever had the pleasure of reading. Magnificent work, with much back-slapping deserved.

One comment therein got me thinking, however:

There’s a small tag in there! Come on, Jakob, what the hell is that?

But it doesn’t, and I’m not.

New obsessions

Perhaps it’s having come out of the other side of a nearly year-long programming project, or perhaps it’s the abundance of blogs to which I now subscribe, but I’m finding myself increasingly obsessed with the design angle of what I do. Specifically, I’m thinking (and reading aplenty) about web standards and CSS.

One of the great things about running one’s own business in this field is the opportunity to dabble in as many areas of the design and production process as it’s comfortable to manage. I get a massive kick out of the programming and site development angle; I’m fascinated by information architecture and theory; and I really enjoy the craftsmanship of a good-looking site. At least, as good-looking a site as my limited artistic ability can conjure.

Now, I’ve always considered site accessibility to be at the forefront of the Article Seven design process. As I’ve already described, I very much see accessible design as the result of bearing in mind the technical limitations and content flow issues of non-visual browsing contexts. At its most basic level, it’s not a case of jumping through hoops to make sure a site is accessible so much as not jumping through hoops to make sure that it isn’t, in pursuit of a “killer site” in IE.

But in an excellent Asterisk column on the practical implications of non-standard code, Keith writes:

I’ve been coding like it’s 1998 all week…

  1. I was careful with the tables so that when they got linearised, the contents still made sense. In fact, nesting the tables actually helped with this.

  2. No browser on RISC OS supported CSS, and I had a persistent nagging feeling that those browsers represented the bottom line to which I should be coding. A touch naively, I just didn’t want to see a wholly unstyled page on the Acorn.

  3. Similarly, I hadn’t realised properly that people were starting to use Mozilla and its cousins in serious numbers. I felt that IE and Netscape 4 were the only browsers with CSS support in common use, and I knew that CSS2 didn’t work on Netscape 4.

  4. Browser bugs frustrated me into a twitching mass (and they still do).

  5. I have two main (read: 80% of my time) clients. Until not too long ago, Woking Borough Council used Netscape 4 in-house. And FISITA are quite particular about things looking just-so (and quickly) in IE.

  6. The preceding are all feeble excuses — although, it has to be said, standards-based design (a grand term for “not misusing HTML for presentational effect”) is very much a forward-looking exercise in ‘best practice’; it’s not yet shoddy workmanship to produce accessible design that nonetheless uses presentational markup. The real killer reason is that I’ve been so focussed on getting my Perl, mod_perl, MySQL, XSLT and general XML skills up to scratch that I’ve simply neglected the design angle.

Well, the time has come to redress that. I’m setting myself two challenges. First, to redesign article7.co.uk using pure semantic markup + as much CSS as I can master. The second, greater challenge, is to redesign woking.gov.uk as close to that model as possible — whilst still keeping the access framework relevant and worthwhile.

Wish me luck!

Tue, 18 May 2004

GyazMail 1.2

…and, hot on the heels of yesterday’s news, today I discover that GyazMail 1.2 has been released.

I really like GyazMail — so much so that I bought it without ever really putting mail.app through its paces. It works more-or-less in the way that I’m accustomed to an email client working, but the big missing feature for me was always junk mail filtering. mail.app had its own junk mail filter, and most other clients on OS X already integrated with SpamSieve — except GyazMail. This had always been in the list of future plans, though, so I contented myself with some homespun filters and rules and waited.

Well, I’m very glad to say that the new release finally includes incredibly straightforward integration with SpamSieve, and a swathe of other new features (like Finder-style coloured labels for messages) that I’m frankly delighted about. Now, if only I could stop the SpamSieve icon from sullying the Aqua goodness of my dock, I’d be truly content.

Mon, 17 May 2004

SubEthaEdit 2

Unexpectedly (for me, at any rate), SubEthaEdit 2 is out.

Since switching to Mac OS X from RISC OS last year, SubEthaEdit has been very much my editor of choice, and gets pretty much daily use. I couldn’t warm to BBEdit, partly because I quickly became interested in basing my most frequently used applications as much as possible on Cocoa. More of all that another time, though.

It’s easy to dismiss SubEthaEdit as being purely a remarkable experiment in realtime collaborative editing, but as a straightforward, lean editing environment, it’s fabulous — and it’s the only editor on Mac OS X that features block editing, just like my old faithful StrongED (gawd bless it) on the Acorn. The new version adds some features that were nonetheless sorely lacking in version 1: regex-based search and replace, and a split-screen viewing option.

For commercial use, there’s now a £21 (or thereabouts) price tag, which I don’t consider at all unreasonable, especially for an application that forms such a strong part of my life with OS X.

Fri, 14 May 2004

Love From Colditz

One of the many things I love about living in Greenwich is the abundance of filming that goes on here, from the sublime to the ridiculous.

Mike Leigh’s All Or Nothing was filmed all around the borough, with the taxi cab office set in the room where I’m sitting right now. This was a year or two before I bought the place, so I can hardly brag, but it was weird to go and see the film and plainly recognise my studio (and, for that matter, home) on the big screen.

A few weeks ago, I got a phone call from the location scout for the forthcoming The Yank. They were looking to film a short scene in a web design studio, and since other scenes were being filmed a couple of doors down, were interested to find out if my studio would fit the bill. Sadly, the studio is plainly designed around the needs of one person, and so I don’t think they felt it was large enough. Either that, or it smelled bad, because they never came to do the filming.

Anyway, it seems that something (I know not whether it’s a film or TV production) entitled Love From Colditz is being filmed down the road in the Naval College right now. There’s an odd set of fake castle walls alongside the authentic Enlightenment ones, a collection of wartime props (including old bikes, a phonebox and a set of bollards), together with two large piles of sandbags close to the entrance to the restaurant.

Marvellous.

Mon, 10 May 2004

Web Safe Colours

Todd Dominey writes about the creation of his Scribe theme for the new-look Blogger. It’s always fascinating to read about the design process — the inspiration, the decision-making, and the technical resolution.

But of particular interest to me was Todd’s comment that:

On most screens it looks balanced with a pleasant level of contrast, while on others it looks a little muddy. But that’s the nature of using a non web-safe, limited color palette with a delicate visual balance.

Well, I’d like to write in support of non web-safe colour palettes and delicate visual balances. The accepted wisdom that a design must rigidly adhere to the strict limitations of the web-safe palette strikes me as unnecessary and outdated. I’m not sure what the statistic is for the percentage of people browsing in only 256 colours, but I suspect that it isn’t an overwhelming majority. A sizable number of the colours in the web-safe palette are garish and frankly unusable. Why not accept that times have moved on, and that it’s now acceptable to use subtler hues and gentler transitions?
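
For the record, the web-safe palette is nothing more exotic than every combination of six permitted values per colour channel. A throwaway sketch enumerates the lot:

    use strict;

    # The six web-safe channel values: 00, 33, 66, 99, CC and FF.
    my @steps = map { sprintf '%02X', $_ * 0x33 } 0 .. 5;

    my @palette;
    for my $r (@steps) {
        for my $g (@steps) {
            for my $b (@steps) {
                push @palette, "#$r$g$b";
            }
        }
    }

    print scalar(@palette), " colours\n";    # prints 216

That’s 216 colours in all, spaced in steps of 51 per channel; steps that coarse leave no room for the subtle shifts in tone that a design like Scribe relies upon.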

To my mind, it’s another aspect of keeping a site accessible. Now, I’m aware that it might seem as if those two arguments are contradictory, but let’s briefly run through the ideals of accessible visual design:

  1. Text should be set at a comfortable, readable size

  2. Text and background colours should contrast clearly with one another

  3. The design should remain usable however faithfully (or otherwise) its colours are actually reproduced

This last is accessibility in a nutshell, as far as I’m concerned. So what if a subtle shade of cream I’ve used turns out grey in 256 colours, or is dithered? It’s not as pretty and it’s not how I intended the design to look — but it’s still perfectly accessible. If I were to pick a text colour and background colour that rendered to the same shade in 256 colours, I’d have used too damn low a contrast anyway. If a pattern of dithering made the text hard to read, chances are the text would be too small.

For my money, Scribe looks completely lovely. Hurrah for using colours that look good, and knickers to their ‘safety’.

Fri, 07 May 2004

Visual editing

Bricolage 1.8.0 has been released, and brings with it support for visual content editing using InteractiveTools’ HTMLArea.

This makes me feel better about using HTMLArea in the CMS I developed for Woking. One of the principal complaints they had with respect to the old, rudimentary, nearly-a-CMS-but-not-quite I had developed for them was that editing text content was too syntax-heavy and onerous.

The system’s still in use at the NPC site, where they seem to soldier on with it quite happily. But it was plainly inadequate for the Council, who needed to be able to offer more sophisticated markup (tables, imagemaps and so forth) in their pages than the limited structure achievable this way would allow. So, we ended up with a schism where some pages used the Markdown-alike system, and others simply contained fragments of raw HTML. (Unlike Markdown, you couldn’t mix the two.)

When it came time to draw up the specification for the proper CMS, a visual editing environment was therefore a serious priority. A customised version of HTMLArea fit the bill quite nicely.

My cap is doffed to those who work with JavaScript on a regular basis. I found the customisation of HTMLArea a world of pain — not because of any inadequacy in HTMLArea itself, but because I was trying to play it all by ear. It took ages for me to get my head round the Internet Explorer API which allows the live editing of HTML within the browser (in fact, I’m not at all sure I ever really did), and I found the experience of trying to adequately pass contextual information to dialog windows a source of distilled frustration.

Why all the customisation? Well, I wanted to be able to add features unique to the CMS right into the visual editing experience — embedding live content from another source, allowing for timed content (a span or block that is timed to disappear after a given date), a firm distinction between internal links, external links, links that open popup windows and so on.

All of this is stored either as a custom URI scheme that then gets translated on page delivery, or as custom namespace-qualified tags, all wrapped up in an XHTML (Transitional — one thing at a time!) document within the database. (I’d call it the content repository to sound authentic, but really it’s just a big “textcontent” table).
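
As an aside, the link half of that delivery-time translation is simple enough to sketch. Everything below (the cms: scheme, the table name and the fallback behaviour) is a hypothetical stand-in rather than the code the CMS actually runs:

    use strict;

    # Swap a custom cms: URI scheme for the resource's current
    # published URL at delivery time, so that moving a resource
    # never breaks the links pointing at it.
    sub resolve_internal_links {
        my ($html, $dbh, $version) = @_;

        $html =~ s{\bcms:(\d+)\b}{
            my ($url) = $dbh->selectrow_array(
                'SELECT url FROM published_urls
                  WHERE resource_id = ? AND site_version = ?',
                undef, $1, $version,
            );
            # If the resource has gone away entirely, degrade to a
            # harmless fragment rather than a broken link.
            defined $url ? $url : '#';
        }ge;

        return $html;
    }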

Which led to another problem: HTMLArea and the Internet Explorer API allow the live editing of HTML content, not XHTML or generalised XML content. So the CMS uses a pair of highly specialised XSLT stylesheets to translate the XHTML-plus-CMS-tags content in the database into the vanilla HTML content for the visual editor, and back again, generally using specific stylesheet properties to indicate all the custom stuff that’s going on.
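
The plumbing for that round trip is standard stuff. A minimal sketch using XML::LibXSLT, with the stylesheet filenames invented for the purpose, runs along these lines:

    use strict;
    use XML::LibXML;
    use XML::LibXSLT;

    my $parser = XML::LibXML->new;
    my $xslt   = XML::LibXSLT->new;

    # One compiled stylesheet for each direction of the translation.
    my $to_editor = $xslt->parse_stylesheet(
        $parser->parse_file('xhtml-to-editor.xsl'));
    my $to_store  = $xslt->parse_stylesheet(
        $parser->parse_file('editor-to-xhtml.xsl'));

    # Database -> visual editor: turn the XHTML-plus-CMS-tags content
    # into the vanilla HTML that HTMLArea can cope with.
    sub for_editor {
        my $xhtml  = shift;
        my $result = $to_editor->transform($parser->parse_string($xhtml));
        return $to_editor->output_string($result);
    }

    # Visual editor -> database: fold the editor's output (assumed
    # already tidied into well-formedness) back into XHTML with the
    # custom namespace-qualified tags reinstated.
    sub for_storage {
        my $html   = shift;
        my $result = $to_store->transform($parser->parse_string($html));
        return $to_store->output_string($result);
    }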

In hindsight, XOpus would have saved me a vast amount of that bother, but it possibly goes too far in the other direction, and the pricing structure is resolutely based on the number of users and the CMS resolutely isn’t. I also feel it would have involved a touch too much of a learning curve for the users of the system. All the background pain and kludgery that the CMS goes through, ultimately, stops the users from having to worry about it.

Incidentally, isn’t Markdown just a stroke of genius as a name for a system like that? I’d settled grudgingly on plaintext formatting. More doffing chez Green.

Thu, 06 May 2004

I’ll have 6,000 chicken vadgidas…

I do a lot of my work for an organisation called FISITA. It’s a French acronym, and they’re basically an umbrella group for automotive engineering societies across the globe.

But you do always have to carefully spell their name. Being in the automotive industry, they get called Fiesta a lot, and Ian and I often refer to them as Fajita. But, today, I had cause to mention them to a new contact, who remarked, “that sounds a bit like Fistula, doesn’t it?”

Jiminy. Well, between you and me, I think the car shenanigans are a smokescreen — really, they’re FIST AI, a guerrilla group dedicated to overthrowing the mounting threat of neural network global domination.

You heard it here first.

A plea for clementines

From decaffeinated:

There are now three sites on the web that make good use of Safari’s text-shadow support (where “good” is the opposite of “spurious” and “ill-conceived”): Automatic, Bryan Bell’s aptly-named bryanbell.com and Les Orchard’s 0xDECAFBAD.

…can we extend this to “sites that shamelessly rip off bryanbell.com’s design”? I mean, I know I’ve only just started this site and it has an audience of me, my missus, and possibly Girlcat if I hold her in front of the screen — but frankly I saw Mr. Bell’s recent design and decided that I had to have bits of it for myself. And I think it looks good, really I do.

What’s in a name?

So after probably a full year of dithering, I finally start my blog. I’ve had this domain kicking about for even longer than that, originally with the intent of creating a site around a set of short stories I had in mind, entitled Elements or Lower. The idea of the stories was that each would explore a different topic fundamental to what makes a person a person — memory, sensory perception, self-identity and so on. I never got round to writing more than a couple, even though the idea for the series of stories is around ten years old now.

Unbelievably, that’s not even me at my most pretentious. Go back a further five or six years, to being a teenager, and we get into some truly awful territory. I had the opportunity to look back through my GCSE English coursework a few weeks ago, where I’d submitted a fairly substantial wad of creative writing, undertaken with sincere self-belief and alternating obsessions: the basis of a novel exploring the mythology of Ancient Egypt, and (even more worryingly) a set of songs I’d written with my friend James that had no small basis in, of all things, science fiction. James was responsible for the music, and the couple of songs he wrote himself in the same period are just fine. He is therefore exonerated entirely from the project. I pity the poor examiners for having to trudge through it all. Part of me wonders if they saw the coursework folder and gave me a mark purely on the basis of volume.

Anyway, I finally had to admit that I wasn’t about to ever make use of this domain for its original intent, and felt that I might be able to commandeer it for a blog. I wanted to start around a year ago, at the start of my project at work to expand upon my Content Management System for Woking Borough Council. I thought it would be interesting to document the design and implementation process of their new, bespoke CMS.

It would’ve been, an’ all. Each small decision turned into a big can of worms, and I’d like to revisit some of them over the coming weeks. But I didn’t start blogging when I’d wanted to, because by then the actual job had begun, and I felt at the time that whenever I did start a blog, I ought to have written the software that runs it as well. Having now finished the CMS project (more or less), I’ve reconsidered the wisdom of writing one’s own blogging software.

So I decided that, as a member of the Perl faithful, I should redress the overbalance of laziness and hubris with a touch more impatience, and just get on with it at last.

Hello!