Elements or Lower

Wed, 08 Dec 2004

Catching Up

So I’m crap and haven’t posted in a month. The troubling thing about such a scenario is that things occur to you that you’d like to quickly post about, but a quick post seems altogether inadequate after such a dry spell. Do a proper entry first, I tell myself. And, since the proper entry seems too tall an order, nothing happens.

Well, enough. There’s sure been progress on the CMS front, which calls for a big list to catch up:

  1. The Woking site has moved to a new server. Still at Rackspace, but rather beefier and now behind a firewall. While migrating the site from my server to theirs, I thought I’d run what I knew would be a slow MySQL query on each — the database itself had by this point been copied across. The log table contains a couple of million records, and a simple query on an unindexed column took 1 minute, 30.56 seconds to come up with the answer on my server, and a measly 8.21 seconds on theirs. Wow. (A rough sketch of the sort of query involved is at the end of this entry.)

  2. We’re now using the LGCL rather than the APLAWS category list. The transition took about a week to sort out, including the creation of a new category chooser that not only presents the collapsible hierarchy just like the site’s Site Map, but also includes an XMLHttpRequest-driven LiveSearch and a tab with suggestions based on the words in the resource title. (A sketch of the server-side half of the LiveSearch is also at the end of this entry.) As a result of the switch, we’re now incorporating real GCL categories for each page rather than just using “Local Government” across the board. I also think that the new LGCL Category Chooser may well be The Best In The Land, even if I do say so myself.

  3. The main page of the administration area now also uses the LiveSearch to help publishers track down resources. This has proven immensely popular (not least with me, since I’m often phoned about problems with specific resources, and navigating the collapsible sitemap is just clumsy enough to be annoying).

  4. I’ve released a small bugfix update to XML::Generator::RSS10::gcl and XML::Generator::RSS10::lgcl.

  5. I think I may have finally cracked a problem that’s been bugging me for months and months. Despite the HTTP headers and meta tags in the page asking for data to be sent in ISO-8859-1, HTMLArea (or, rather, Internet Explorer in the appropriate mode) sometimes sends data in Windows-1252, UTF-8, or a mix of the two. This results in weird encoding conversion errors that manifest as Ās and the like scattered throughout a page. I’ve been honing a routine using Encode::Guess and, initially, GNU Recode, but recode would occasionally and mysteriously elide large portions of text. I then discovered, by chance, the //TRANSLIT option to iconv. Consequently, and after some kludging to get the pound sign to work, I’m now hopeful; a simplified sketch of the routine follows below. We’ll see if (again) I’ve spoken too soon.
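
For the curious, the server comparison in point 1 was along these lines, timed with a throwaway Perl DBI script. The log table is real, but the column name and the pattern here are made up for illustration; the point is simply a query that forces a full scan of a couple of million rows.

    #!/usr/bin/perl
    # Rough timing of an unindexed query against the log table.
    # Column name and pattern are hypothetical; connection details elided.
    use strict;
    use warnings;
    use DBI;
    use Time::HiRes qw(gettimeofday tv_interval);

    my $dbh = DBI->connect( 'DBI:mysql:database=cms', 'username', 'password',
        { RaiseError => 1 } );

    my $start = [gettimeofday];
    my ($count) = $dbh->selectrow_array(
        'SELECT COUNT(*) FROM log WHERE referer LIKE ?', undef, '%example.org%'
    );
    printf "%d matching rows in %.2f seconds\n", $count, tv_interval($start);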
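
The LiveSearch mentioned in points 2 and 3 is, at heart, an XMLHttpRequest firing off whatever the user has typed so far, and a small server-side script returning the matching labels. The real chooser is rather more involved, so what follows is only a made-up sketch of the server-side half; the JavaScript simply GETs it with the typed fragment in the q parameter and turns the returned lines into a list under the search box.

    #!/usr/bin/perl
    # Hypothetical server-side half of a LiveSearch: match the typed fragment
    # against a list of category labels and return one match per line.
    use strict;
    use warnings;
    use CGI;

    my $q    = CGI->new;
    my $term = lc( $q->param('q') || '' );

    # Stand-in data; the real chooser searches the LGCL held in the CMS.
    my @categories = (
        'Abandoned vehicles', 'Allotments', 'Building control', 'Bus passes',
    );

    my @matches = $term ? grep { index( lc($_), $term ) >= 0 } @categories : ();

    print $q->header( -type => 'text/plain', -charset => 'ISO-8859-1' );
    print "$_\n" for @matches;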
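
And here, in outline, is the kind of routine point 5 describes: use Encode::Guess to work out which encoding Internet Explorer actually sent, then hand the conversion to ISO-8859-1 over to iconv with //TRANSLIT, so that characters without a Latin-1 equivalent are transliterated rather than mangled. This is a simplified sketch rather than the production code, and it leaves out the pound-sign kludge entirely.

    #!/usr/bin/perl
    # Simplified sketch: guess the incoming encoding, then convert to
    # ISO-8859-1 via iconv's //TRANSLIT. Not the production routine.
    use strict;
    use warnings;
    use Encode qw(encode);
    use Encode::Guess qw(cp1252 utf8);    # the two suspects IE tends to send
    use IPC::Open2;

    sub to_latin1 {
        my ($octets) = @_;

        # guess() returns an encoding object, or an error string if the
        # suspects are ambiguous for this particular input.
        my $enc = Encode::Guess->guess($octets);
        die "Couldn't guess encoding: $enc" unless ref $enc;
        my $text = $enc->decode($octets);

        # Re-encode as UTF-8 and pipe it through iconv, transliterating
        # anything that has no ISO-8859-1 equivalent. Writing everything
        # before reading is fine for page-sized data; bigger inputs would
        # want a temporary file instead.
        my $pid = open2( my $out, my $in,
            'iconv', '-f', 'UTF-8', '-t', 'ISO-8859-1//TRANSLIT' );
        print {$in} encode( 'UTF-8', $text );
        close $in;
        my $latin1 = do { local $/; <$out> };
        waitpid $pid, 0;

        return $latin1;
    }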