Slow Web

While combing through my WordPress dashboard this morning, I was pointed by Matt in the direction of Jack Cheng’s article on The Slow Web.

The concept really captured my imagination. Struggling in the present day and age to keep up with the ever-increasing demand both to consume and to generate online content, I regularly find myself desiring escape rather than engagement, as the agony of deciding often seems worse than the effort of doing.

By adopting a “batch, refine and consume at leisure” approach, it genuinely seems that not only will the pressure to keep up dwindle, but that quality and enjoyment will rise at the same time.

Take blogging. I’ve posted in the past about getting my blogging back on track, but evidently this hasn’t happened. It’s not for lack of material, or indeed motivation; rather, the simply overwhelming number of possibilities, each carrying the same perceived value, prevents that desire from being channelled into the generation of even one finished item.

Speaking at WordCamp in Manchester three years ago, I proposed a solution to this which, in retrospect, I believe was wrong. I stated that as the web sped up, the time available for each task shrank, and thus we should leverage the fast tools available to us (phones, cameras, GPS tracking) and push or pull all of this content automatically into a location (such as a blog) that used to require time, labour and love to maintain, thereby bolstering our web presence and preventing personal burnout.

The sad thing is, this approach is a fraud, a cheap imitation. After all, the nature lover desires not merely to see the sunset and the fact that I was there, but to catch a glimpse of the poetic thoughts that may have passed through my mind as I gazed upon the vista, and to compare them to his or her own.

I’m not sure these thoughts and ideas translate easily into a course of action, but perhaps that’s the point; given the time and space to merely think, positive and decisive action will surely follow.

Comments off

It will not be a surprise to some that I have bought yet another domain name, but this new one is a little bit special.

When tweeting we’re all accustomed to the shortening of URLs, most commonly via a third-party shortening service, but in this day and age personalisation is all the rage, and it can have important implications.

Let’s say, for example, that someone posts a link to the BBC on Twitter via the corporation’s own shortening service. That service helpfully translates the URL so that it appears as the BBC’s short domain followed by a short code. That way everyone knows that, when they click on the link, they will end up at a site owned and operated by the BBC.

As well as inspiring confidence to click the link, origin is also preserved through re-tweets. What do I mean by that? Well, say I post an article and tweet about it using a generic short link. It is only once a user has clicked the link that they know it’s my article; before that, in the mind of the user, the article could in fact have originated from any Twitter user who happens to have included the link in their tweet. Thus, by having my own short domain, my linked-to content can always be traced back to me, however a reader may have stumbled upon the link.

All well and good, you might say, but does this really justify the expense? Well, perhaps not, at least on its own. You see, when you own and control a short domain you can do clever things with it. For example, if I post a video on my site and link to it with a short code, the user must first traverse from short URL to site, then in turn to video. What I can do if I own the short domain, however, is make the short URL the actual permanent link for the content and save the user some hoop jumping. While I haven’t got a strategy for implementing this kind of thing yet, it’s certainly nice to have the option of doing so in the future.
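As an illustration, with entirely hypothetical addresses, the effect of owning the short domain is that you control the redirect itself. A quick check from a terminal might look something like this, assuming the short domain answers with an HTTP 301 redirect to the full permalink:

# Hypothetical example: inspecting where a short link points; the
# domains and short code here are made up for illustration only.
curl -sI http://example.sh/ab3 | grep -i '^location'
# Location: http://www.example.com/2012/06/my-article/

Because the domain is mine, that redirect could later be changed, or the content served at the short URL directly, without breaking any links already in circulation.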

The question you’re all asking, then: why the .sh extension? Well, seeing as domains cannot currently be registered directly under .uk, I needed a suffix that both identified me with the United Kingdom and related to me or my field of work; .sh does both of those things.

The domain suffix belongs to the island of Saint Helena, a British Overseas Territory, and as an added bonus .sh is well known in the Linux and Unix world as the file extension denoting a shell script, quite possibly the most versatile way of accomplishing both manual and scheduled tasks on the platform. As an avid supporter of open source and a long-time Linux user, I found this fitted the bill perfectly.

So finally then, when you see one of my short links, view it with trust – it will ALWAYS take you to a site under my control, and you have my personal assurance that content served up from such sites will be safe. The root of the domain, with no short code attached, will form a permanent link to this article, thereby explaining to the uninitiated who the short domain belongs to and proffering an explanation as to how and why it came into being.

Comments off    

Thought on Technology

I’ve owned this Nitin Sawhney track on CD for some time, but as it’s good food for thought I’ve found the associated music video and posted it up here.

Comments off    

WordPress in Education

Always keen to post when I find someone using my Calendar plugin for something big or exciting, I was pleased this week to receive an e-mail from Adam Scott informing me that Calendar had received a mention in his book, WordPress for Education. If you work in academia or have an interest in expanding the focus of your WordPress work, this book may just be worth a look.

Comments (1)    

Programmers vs Users

I was sent this amusing cartoon the other day – never have truer words been spoken!


Comments (2)    

Custom Linux Service at Boot

So you’ve created a service in Linux, written a start/stop script for it and stored it in /etc/init.d/, and now you want it actually to run on boot or restart. This little line of code at a root terminal will do the trick:

update-rc.d <script> defaults 98 02

The <script> should be replaced with the file name of your start/stop script in /etc/init.d/. The 98 ensures it is (likely) the last script to start, and the 02 ensures it is (likely) the first to stop.
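If you haven’t written the start/stop script yet, a minimal sketch looks something like the following; the service name and daemon path are hypothetical and should be replaced with your own:

#!/bin/sh
# Minimal /etc/init.d/myservice sketch; "myservice" and
# /usr/local/bin/myservice are placeholders for your own service.
case "$1" in
  start)
    echo "Starting myservice"
    /usr/local/bin/myservice &
    ;;
  stop)
    echo "Stopping myservice"
    killall myservice
    ;;
  *)
    echo "Usage: $0 {start|stop}"
    exit 1
    ;;
esac
exit 0

With that saved as /etc/init.d/myservice and made executable (chmod +x), the line above would read: update-rc.d myservice defaults 98 02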

Comments off    

Stalling File Transfer Over VPN

The other day I had cause to allow someone to access their corporate VPN over my home internet connection. After I had configured the appropriate pass-through settings for the IP address that my DHCP server had allocated to their laptop, they were able to connect easily. One issue remained, however: whenever they checked a file out of SharePoint that was over 250 KB or so, the transfer to their machine would stall and consequently crash the browser.

Google threw up all sorts of possibilities, but the more things we tried (and that failed), the more I couldn’t escape the idea that it must be some network-related issue. I got to thinking about the nature of a VPN: there is an initial connection, then a large stream of UDP packets over the tunnel for the data transfer. That’s when a light bulb switched on – I’d seen something like that before in my Draytek router settings.

Draytek have a nice selection of anti-denial-of-service features, which I have activated to protect my network. Some of these concern certain types of flood defence: a count of packets is maintained and, if it exceeds a certain threshold, the connection is dropped for a period of time. This would appear as a stall for any file transfer caught up in the melee. Bingo!

The culprit setting is shown in the screenshot below: “Enable UDP flood defense”. Originally this was set to 150; I had to raise it to 1000 in order to eliminate the VPN issue.

Draytek DoS UDP Settings Screenshot

It’s worth discussing why the number has to be so high in my case, and why it may not need to be so high for you. Most home connections carrying a VPN run at below 5 Mbit or so, while the corporate end point may be capable of 100 Mbit, so the overall speed is tied to the slower home end. In my case I use FTTC and thus have speeds of up to 80 Mbit, which means I can transfer many more packets per second than most people on home connections; it is therefore likely that you won’t have to go as high as 1000 to get the desired result in the Draytek settings.
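As a rough sanity check – assuming the defence counts packets per second and that packets are full 1500-byte (12,000-bit) frames, which is an approximation – some quick shell arithmetic shows the packet rates involved:

# Approximate packets per second on a saturated downlink,
# assuming MTU-sized 1500-byte (12000-bit) packets.
echo $((80000000 / 12000))   # 80 Mbit FTTC line: ~6666 packets/s
echo $((5000000 / 12000))    # typical 5 Mbit line: ~416 packets/s

On those assumptions, a fast line can exceed a threshold of 150 many times over, while a slower connection may barely reach it – which fits the behaviour I saw.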

I hope this helps someone out; feel free to get in touch with me if you have any questions.

Comments (4)    

Photos too dark

I’ve been told by a number of people in recent times, and have received a number of e-mails stating, that the photos on my website are too dark – somewhat underexposed. This has always puzzled me: while the photos do look a little on the dark side, the exposure is almost always set correctly according to my camera’s light meter when I’m composing shots. It had almost made me believe that the light meter was faulty, but looking at the photos on the screen on the back of the camera I could see that this was not the case. Time to delve deeper.

In order to publish my photos to the web I have to extract image data from the RAW files my camera generates and compress it into a JPEG file that can be delivered to and opened by a web browser. To do this I make use of dcraw, a nifty little application authored by Dave Coffin, which passes the decoded data out on stdout to be collected by a program such as cjpeg.

As I like to publish exactly what my camera has stored in the file (with a little compression for portability), I don’t use many of dcraw’s options, just the ones that read the values from the file itself and use those. Looking down the parameter list I suddenly wondered if perhaps I’d mistakenly used the wrong parameter somewhere, so I started to check each one I’d used against the documentation.

Everything checked out apart from one very small anomaly: -w was documented but -W was not. The documentation states that -w tells dcraw to use the white balance value stored in the RAW file when processing, which is what I wanted; -W seemingly did nothing, except that it threw no errors when used, so it must have had some function.

Plunging deeper into the documentation I found the bombshell I was looking for: -W tells dcraw not to brighten the image automatically, leaving the output darker than intended! I ran a quick test and changed to the lower-case -w while decoding a photo I’d taken recently, and the effect was perfect: the correct brightness and also the correct white balance (it hadn’t been far wrong, but was much better now).

I’ve now updated the script I use to call dcraw so that it invokes the correct parameters, and hopefully dark photos will be a thing of the past on my website. The thought of having to re-decode some 3,000-odd photos that have already been uploaded doesn’t fill me with joy, though…
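For reference, the corrected pipeline looks something like this (file names hypothetical); dcraw’s -c flag sends the decoded image to stdout for cjpeg to pick up:

# Decode using the camera's stored white balance (-w, lower case)
# and compress to JPEG; file names are hypothetical examples.
dcraw -c -w IMG_0001.CR2 | cjpeg -quality 90 > IMG_0001.jpg

# Re-decoding a whole directory of RAW files would then be a one-liner
# (assuming .CR2 files; adjust the extension for your camera):
for f in *.CR2; do dcraw -c -w "$f" | cjpeg -quality 90 > "${f%.CR2}.jpg"; done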

Comments (5)

Akismet for Zenphoto Fixed

As a user of Zenphoto for my gallery of photographs, I’m pleased to report that I’ve corrected an issue with the long-standing Akismet plugin for the platform, allowing it to work with the latest version of the application and thus restoring anti-spam functionality to comment forms on both my sites and any others that care to use the plugin.

Rather than re-posting all the details here, I’ll point you to the Zenphoto forums, where you can follow the discussion and download the fix.

Comments off    
