Archive for Computers & WWW

WordCamp UK


WordCamp UK is being held in Cardiff this year, and since I now live relatively close by and have always meant to go but never got around to it, I’ve signed myself up. For those of you who don’t know, WordCamp is an informal gathering of bloggers and developers who use WordPress. There is an opportunity to attend talks, discuss projects and, of course, socialise with other technically minded individuals.

The scheduled talks look like they could prove very interesting, and a large group of people has already signed up, some of whom I have exchanged words with over the internet on various occasions, so it will be great to meet them in person. Matt Mullenweg is also attending, and having the opportunity to thank him in person for creating WordPress will be really cool.


Facebook Username

Last night facebook launched a username option which allows visitors to the site to go straight to a profile by putting the username in the URL. I of course made sure I snagged my username as soon as the system went live, so you can now access my facebook profile (if I have granted you access to it, of course) at the following URL

Sadly I wasn’t able to snag “kieran” for a username as it had already been taken – Kieran Cloonan I’m looking at you – grrr! I could have had “oshea” but having that as a username seemed to bring back distant memories of being called by my surname at school so I decided against it. The username “oshea” has in fact now been taken so Ben O’Shea, I wish you the best of luck with it.

All in all I think this is a good move by facebook. While some might criticise it and say it’s just another step towards becoming MySpace, I see it as more of a step towards the modern web. PHP files and query-string arguments in the URL are so last century; pretty permalinks and RESTful behaviour are what it’s all about these days. While I can’t see facebook truly implementing a RESTful URL structure, they have certainly taken an important step forward with usernames.


Problem with lftp solved

I thought I’d share this little solution, as it’s been bugging me for ages. My command-line FTP client of choice, lftp, recently stopped connecting. I couldn’t find a suitable solution until I noticed that the AUTH command was being sent, something I don’t normally use for ordinary FTP sessions.

I added the following line to my lftp.conf file:

set ftp:ssl-allow false

This fixed the issue of failing to connect and everything is now working as it should. Hope this helps someone who is in a similar fix.
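For anyone wanting to script the fix, here is a minimal sketch. It assumes a per-user config file at ~/.lftprc; depending on your install, lftp may instead read ~/.lftp/rc or the system-wide /etc/lftp.conf, so check which one applies before running it.

```shell
# Append the setting to the per-user lftp config, but only if it isn't there already.
# ~/.lftprc is an assumption; your install may use ~/.lftp/rc or /etc/lftp.conf.
conf="$HOME/.lftprc"
grep -qxF 'set ftp:ssl-allow false' "$conf" 2>/dev/null ||
  echo 'set ftp:ssl-allow false' >> "$conf"

# Alternatively, disable SSL/TLS negotiation for a single session only,
# without touching any config file (ftp.example.com is a placeholder host):
# lftp -e 'set ftp:ssl-allow false' ftp.example.com
```

The grep guard keeps the script idempotent, so re-running it won’t fill the config file with duplicate lines.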


Google change favorites icon

Regular Google users will have noticed that the favorites icon has recently changed to a new design. Given how infrequently images and layouts related to Google’s home page have changed over the years, does this small change mean fundamental changes might be in the air at Google?


Webperf Issues

For the last few weeks, Webperf has been having back-end CGI issues. As a result, their usual colourful array of graphical statistics has been missing in action. These are an invaluable resource for web hosts and similar outfits, who use them to keep tabs on how they are doing compared with others in terms of availability, bandwidth provisioning and so on. Without them, many people, myself included, have been getting frustrated.

If Webperf are not able to keep on top of the work required to run and maintain the site, they should hand it over to one of the many willing volunteers who have come forward since the downtime. We need this site, and the inaction is annoying and frustrating.


New Router

Today I set up a new router in our house, the Netgear DG834G. After carefully ensuring all existing settings were mirrored on the new device, it was just a question of unplugging the cables from the old router and plugging them into the new one, like for like. It worked straight away with no noticeable problems.

I’ve known for years that I really should have replaced my old Origo (more modern requirements were taking their toll on its memory and other capabilities), but today, when I gained an extra 100 kb/s of download speed simply by changing my router, I realised I should have done it sooner!

I did hit one snag though in that my ability to connect to remote VPN networks using PPTP suddenly stopped working. After a bit of experimentation I realised that for some reason having DMZ enabled to any IP caused authentication to fail. Disabling DMZ allowed the connections to happen flawlessly.

Having to implement this workaround didn’t bother me, as I don’t actually use DMZ; it was merely pointed at a non-existent IP on my subnet in order to shield unused ports. It turns out the Netgear doesn’t need such a trick: all ports except those explicitly forwarded are hidden anyway. Times have changed since the Origo days, but old habits die hard!


phpBB3 release date announced

The phpBB group have announced the final release date of phpBB3. This is an important milestone: there hasn’t been a new version released for 7 years, and finally, after much hard work from some and much waiting around for many, it’s going to be here. This is a fantastic development, and I feel sure this advance will once again put phpBB far ahead of the game in the web-based forum world.


Facebook Fresh

It can’t have escaped the attention of those who read my blog that I don’t like facebook being littered with applications. It’s not that I dislike the idea of an API or the ability for developers to interact with facebook; I simply preferred it when external applications were kept as just that, external to facebook and only visible on an external URL, not on the pages and profiles of facebook itself. The “clean” look of facebook was what made it appealing to me.

Given that facebook refuses to let individuals hide the stupid applications some people decide to add to their profiles, leaving frustrated users like myself in a hell resembling MySpace (note my comparison to a network I never joined because of its awful-looking profiles), someone else has stepped into the breach and produced a solution.

Meet Facebook Fresh (TM), a novel solution for Firefox users who never want to look at an application on facebook again. Screens will suddenly look as nice and neat as they did the day you joined the site. Refreshing? You bet.


Akismet Failing

Without a doubt, one of the few anti-spam measures on my blog that has stopped me from disabling comments is Akismet. The service catches literally thousands of spam comments on a daily basis. The problem is that, while I used to get absolutely no spam through to my moderation queue, so that all I had to decide was whether I wanted a comment published, I’m now getting spam through as well and having to mark various items as spam for submission to Akismet.

Critics will no doubt point out that the quantity getting through can’t be large, and they’d be right. But the service previously let none through, and Akismet is a continuously evolving anti-spam solution, so it’s worrying that it’s effectively getting worse over time: it means spam is evolving faster than Akismet’s defences.

Spam is without a doubt one of the biggest problems on the world wide web, and while it hasn’t yet caused me to disable access to anything on my sites, if too much spam gets through to my blog moderation queue then I will disable comments. You might argue that disabling a feature is letting the spammers win, but my take is that if I end up reading spam then I’m letting them win, because they have reached me as a reader. Likewise, if spam comments could reach my blog itself, I’d be letting the spammers win by allowing them to reach a global audience.

Here’s hoping that Akismet doesn’t get any worse at blocking spam, or even gets better, as that way the comments can keep on flowing and I can be safe in the knowledge that I’m not in any way helping spammers reach their audience.


Two Tier Internet

There has been a lot of talk lately in the news about the potential for the implementation of a two tier internet, that is to say a global collection of networks in which some traffic is given priority over others, not necessarily for reasons of efficiency but for those of financial incentive – those that can afford to pay can prioritise their internet traffic or take advantage of less restrictive access to the content of others.

This is something I wanted to muse over a little before committing thoughts to my blog, because it is close to my heart. Having given it some thought, though, it is an even less savoury prospect than I had imagined, and I was never in favour of a two-tier internet in the first place.

At present the internet is a place in which content is indiscriminately accessible to all; something I publish is available to a user in, say, Australia as readily as it is to someone who lives up my street. Likewise someone on a cheap ISP can also access my published content in exactly the same way as someone on a more expensive provider.

A two-tier internet means that, for the first time, some publishers will be able to dictate which “class” of consumer can access their content. What is worse, even if a publisher intends to allow everyone equal access, the ISP could be equally restrictive if it took their fancy. My internet might not be the same as your internet, and whole areas of the internet could be completely invisible to you without you even knowing it. Sure, you might be able to hit IP addresses, but if they weren’t on your tier they would time out as if no machine were responding.

I cannot stress enough how damaging this would be to the whole ethos of the internet. Tim Berners-Lee (creator of the world wide web, in case you didn’t know) has said himself that the connections we use to share data should be freely accessible to all, and that the whole way the internet works relies on us all being on one network in which we all have the potential to be equal players. If the very creator of the web intended us to have one internet and believes it would be damaging to split it up, why should legislators with no technical knowledge have the right to say otherwise? Quite simply, they are taking the piss.

I’m not going to jump on a high horse about the quality of the content I distribute to the world, especially since some of it is probably pretty crap in the eyes of some who come across it, but I sure as hell don’t want to see someone else deciding who should see it based on who is lining their pockets. I put it online because I want everyone in the world with a connection to be able to read, listen to, watch or download it as they wish. If a day comes when I can no longer do that, then the one place on earth where we are all still truly free, the internet, really will be dead, and the world will be a much darker place.

