Logic’s Last Stand

July 15, 2010

Some Things Change

Filed under: Computers, Gaming, Life, Movies, Music, Philosophy — Zurahn @ 7:31 pm

For a long time now I’ve felt like a living contradiction. Everything I used to think about myself has been inverted, and everything I currently think about myself includes both ends of the spectrum: brilliance and idiocy, joy and sorrow, sincerity and flippancy, superiority and inferiority. There have been some constants, but those appear to be dying.

The latest to fall is probably for the best. I’ve played up my own negativity, as I do tend to focus on problems. I think that tendency fits programming, where handling exceptions is necessary and picking apart the little things is part of the job. But in a general sense, I’m sick of the negativity.

It’s one of the best things we had going for us for a long time at The VG Press: the criticisms, when we had them, may have been legitimate, but they were made in good humour. Yet nowhere’s perfect, and here, and moreso the Internet at large, a great big bastion of hate is forming. I’ll mention up-front that I’m not referring to factual matters; those who, for example, rally against vaccination are doing enormous harm and deserve to be vehemently shot down. I mean the realm of significant subjectivity. It doesn’t have to be videogames and it doesn’t have to be personal; in any area where there’s room for reasonable disagreement, there are plenty who take an absolutist position.

If there’s a criticism, it’s not enough to just bring it up in the appropriate context, or, as a reaction, to expand on it. With anything and everything, there are some who try to ruin it for everyone else. It gets worse as it spreads to personal attacks by association: those who support X are amateurish, or any number of other insults, for no other reason than a difference of opinion.

So I’m done. Keep it to yourselves; I don’t want it destroying me from the inside out. If you want to berate people for playing “casual” games, or PHP developers as not real programmers, or country music fans as hicks, or take any other selfish, outwardly hateful, spiteful and utterly immature position, that’s your prerogative, but you’re not going to ruin it for the rest of us. You’re not going to ruin it for me.


June 5, 2010

Accessible Forms with PHP and jQuery

Filed under: Computers, Freeware — Zurahn @ 8:30 pm

A primary challenge of modern web development is how to make use of the great new dynamic tools provided by libraries such as jQuery while still providing an accessible website without JavaScript. While there’s more to accessibility than making a site work without JavaScript, it’s a fundamental start. The task looks arduous, but it doesn’t have to be — approach it right from the start and it may actually be trivial.

Let’s start from the no-JavaScript version and work up. Normally, your form without JavaScript would look something like this:

<form method="post" action="example.php">
    Field: <input type="text" name="field" /><br />
    <input type="submit" value="Submit" />
</form>

To have JavaScript handle the post, we’ll add an onsubmit handler to the form.

<form method="post" action="example.php" onsubmit="return formSubmit(this)">
    Field: <input type="text" name="field" /><br />
    <input type="submit" value="Submit" />
</form>

When the onsubmit function returns false, the form does not submit. So by having the formSubmit function return false, we can have the page handle the post via AJAX instead of having to refresh the page. Let’s look at the formSubmit function.

function formSubmit(obj)
{
    var form = $(obj);
    $.post(obj.action, form.serialize());
    return false;
}

The .serialize() function takes the form elements and converts them to query string parameters so they can be passed through post. By using this, we can reuse the same generic formSubmit function regardless of the form — all we have to add is the onsubmit attribute to the form.
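To give a sense of what .serialize() produces, here is a rough vanilla-JavaScript sketch of the idea (jQuery’s real implementation also handles multi-value fields, disabled controls, and so on; serializeFields and its input here are made up for illustration):

```javascript
// Each named field becomes a URL-encoded name=value pair,
// joined with ampersands, just like a GET query string.
function serializeFields(fields) {
    var pairs = [];
    for (var i = 0; i < fields.length; i++) {
        pairs.push(encodeURIComponent(fields[i].name) + '=' +
                   encodeURIComponent(fields[i].value));
    }
    return pairs.join('&');
}

// e.g. a form with one text field named "field"
var query = serializeFields([{ name: 'field', value: 'hello world' }]);
// query is "field=hello%20world"
```

Because the result is an ordinary query string, the same data works whether it arrives via a normal form post or via $.post.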

Now you may also want to have error and success messages returned. A good way to handle this is via JSON objects. JSON is a standard by which objects can be represented in string form, so we can pass a string from PHP to JavaScript, which can then be handled as an object. Let’s update our formSubmit function to handle this behaviour (the script will assume that there are hidden divs with the IDs “error” and “success”).

function formSubmit(obj)
{
    var form = $(obj);
    $.post(obj.action, form.serialize(), function(data)
    {
        // Return data is a JSON object string, so eval to get the object
        var message = eval("("+data+")");
        showErrors(message.errors);
        showSuccesses(message.successes);
    });
    return false;
}

function showErrors(messages)
{
    if(typeof messages != "undefined")
    {
        $('#success').css('display', 'none');
        var error = $('#error');
        error.html(getMessageList(messages));
        error.css('display', 'block');
    }
}

function showSuccesses(messages)
{
    if(typeof messages != "undefined")
    {
        $('#error').css('display', 'none');
        var success = $('#success');
        success.html(getMessageList(messages));
        success.css('display', 'block');
    }
}

function getMessageList(messages)
{
    // Iterate through the object properties and build a list
    var output = '<ul>';
    for(var i in messages)
        output += '<li>'+messages[i]+'</li>';
    output += '</ul>';
    return output;
}
Now we need to construct the JSON object on the PHP side. The messages will be echoed, but remember that we want this to work even if it’s not an AJAX post, so we need different behaviour depending on how the form was submitted: echo error/success messages if AJAX, redirect back if not. Let’s go to example.php; the script will assume that $_SESSION['page'] holds the value of $_SERVER['PHP_SELF'] from before the post.

$field = $_POST['field'];

if(!isset($field) || $field === "")
{
    Reporting::setError("Name cannot be blank");
}
else
{
    /* Do something with $field */
    Reporting::setSuccess("Operation with <em>$field</em> completed successfully");
}
Reporting::endDo();


class Reporting
{
    public static function endDo()
    {
        if(isset($_SERVER['HTTP_X_REQUESTED_WITH']) && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest')
        {
            if(self::hasErrors())
                echo self::getJsonErrors();
            else if(self::hasSuccesses())
                echo self::getJsonSuccesses();
        }
        else
        {
            header('Location: '.$_SESSION['page']);
        }
    }

    public static function hasErrors()
    {
        return isset($_SESSION['errors'][0]);
    }

    public static function hasSuccesses()
    {
        return isset($_SESSION['successes'][0]);
    }

    public static function setError($message)
    {
        $_SESSION['errors'][] = $message;
    }

    public static function setSuccess($message)
    {
        $_SESSION['successes'][] = $message;
    }

    public static function getJsonErrors($clear=true)
    {
        return self::getJsonMessages('errors', $clear);
    }

    public static function getJsonSuccesses($clear=true)
    {
        return self::getJsonMessages('successes', $clear);
    }

    public static function showErrors($clear=true)
    {
        return self::showMessages('errors', $clear);
    }

    public static function showSuccesses($clear=true)
    {
        return self::showMessages('successes', $clear);
    }

    private static function showMessages($type, $clear)
    {
        $output = '<ul>';
        foreach($_SESSION[$type] as $val)
            $output .= "<li>$val</li>";
        $output .= '</ul>';
        if($clear)
            $_SESSION[$type] = array();
        return $output;
    }

    private static function getJsonMessages($type, $clear)
    {
        $output = '{ '.$type.': { ';
        $comma = '';
        foreach($_SESSION[$type] as $key => $val)
        {
            $output .= $comma.$key.': "'.$val.'"';
            $comma = ', ';
        }
        $output .= ' } }';
        if($clear)
            $_SESSION[$type] = array();
        return $output;
    }
}

Looks like a lot of work, but with that, we’re done. Everything from here on is handled identically between the jQuery and non-JavaScript versions of the site; all you have to do is add onsubmit="return formSubmit(this)" to each form and close each processing script with Reporting::endDo(). Everything else takes care of itself.

May 23, 2010

Idle Threats – Wireless Security Misconceptions

Filed under: Computers — Zurahn @ 7:46 pm

There has been a lot of confusion for a long time about the relative effectiveness of wireless security, even among the otherwise tech-savvy crowd. Last week, the other developers at work got into a conversation about wireless, and the typical dismissal of WPA and WPA2 was thrown into the mix. Simply put, that dismissal is wrong.

This attitude likely stems from the genuinely broken Wired Equivalent Privacy (WEP) standard. WEP uses the RC4 cipher, which has had an increasing number of weaknesses found over time, but that’s not really the primary problem with WEP. RC4 is a stream cipher, so it requires an initialization vector (IV) to keep its pseudo-random keystream from repeating across packets. WEP’s initialization vector is too short and not sufficiently random, and this is the source of the most successful attacks. WEP can be cracked in a matter of a couple of minutes on an active wireless connection.
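To make the IV problem concrete, here is a minimal, illustrative RC4 sketch in JavaScript (not WEP’s exact framing; the key and IV values are made up). WEP prepends a short 24-bit IV to the shared key, so once an IV repeats, the keystream repeats, and XORing two ciphertexts encrypted under it cancels the keystream entirely:

```javascript
// Key scheduling and keystream generation for RC4.
function rc4(key, length) {
    var S = [], i, j = 0, tmp, out = [];
    for (i = 0; i < 256; i++) S[i] = i;
    for (i = 0; i < 256; i++) {          // key-scheduling algorithm (KSA)
        j = (j + S[i] + key[i % key.length]) % 256;
        tmp = S[i]; S[i] = S[j]; S[j] = tmp;
    }
    i = 0; j = 0;
    for (var n = 0; n < length; n++) {   // keystream generation (PRGA)
        i = (i + 1) % 256;
        j = (j + S[i]) % 256;
        tmp = S[i]; S[i] = S[j]; S[j] = tmp;
        out.push(S[(S[i] + S[j]) % 256]);
    }
    return out;
}

// WEP-style per-packet key: the IV prepended to the shared secret.
var iv = [1, 2, 3], secret = [10, 20, 30, 40, 50];
var ks1 = rc4(iv.concat(secret), 4);  // packet 1
var ks2 = rc4(iv.concat(secret), 4);  // packet 2 reusing the same IV

// Same IV and key produce the same keystream, so the XOR of two
// ciphertexts equals the XOR of the two plaintexts: no key required.
var p1 = [72, 105, 33, 33], p2 = [89, 111, 63, 63];
var c1 = p1.map(function (b, n) { return b ^ ks1[n]; });
var c2 = p2.map(function (b, n) { return b ^ ks2[n]; });
```

With only 2^24 possible IVs, repeats are inevitable on a busy network, which is part of why the practical attacks work so quickly.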

Wi-Fi Protected Access (WPA) is a protocol created to address the critical weaknesses in WEP. WPA required the use of the TKIP protocol for encryption, while WPA2 refers to WPA with the use of CCMP with AES for encryption. There have been proof-of-concept attacks against certain WPA-with-TKIP configurations due to somewhat similar issues as WEP, but to a much lesser extent. The short conclusion is that with too long a key renewal interval, a connection with TKIP could potentially be broken in about 12 minutes. Set a key renewal time of less than 12 minutes, and there is no issue.

Meanwhile, WPA with AES encryption (WPA2) has had no such proof-of-concept attacks and, with a sufficiently strong password, remains perfectly safe.

There is also some confusion as to the nature of personal versus enterprise, as if having the distinction means one of them is insufficient. Enterprise is there for the use of an authentication server (RADIUS) such that user account-specific certificates are distributed. It’s irrelevant to the home or small business user, and it’s not a concern for safety.

Perhaps it’s a bit naive, but I do believe we can, with dedication and sacrifice, keep our mouths shut unless we know what we’re talking about. Someday, someday…

May 20, 2010

Passion is Overrated

Filed under: Computers, Gaming, Life, Philosophy — Zurahn @ 8:30 pm

Lately I’ve been thinking more about how I approach my interests. For varying reasons, my gaming has become more and more sporadic, and I’ve been finding myself in a consistent mood for JRPGs. I may not find them the best-executed games, but they’ve been the most enjoyable for what I’ve been looking for.

On the daily life side of things, programming has been something I do because I like the way I’m able to create with it, and how I’m able to solve problems. There’s a sense of satisfaction that comes from, say, adding a new feature to The VG Press, or cutting a two hour job down to a two minute one with a clever script.

SteelAttack sparked the thoughts with this post,

SteelAttack said:

What pains me to an extent is watching people like you guys, who I have grown to care about and appreciate, get somewhat worked up because of statements like these, giving them credence when they don’t deserve it, and generally considering them journalists just because they happen to write about games.

By focusing on the conviction of our responses, it highlighted how little conviction I really had. Videogames, more than ever, have become a source of relaxation. I’ve come to have more passion about the community than the games, because that’s where the energy is.

In programming, there is a consistent theme that programmers have to be passionate about their trade. If you’re not passionate about programming, you need to get out of the field! It’s not an uncommon train of thought that it has to be your world for you to succeed.

Simply put, though, I don’t want passion. I’ve had problems with stress for a long time; pressure situations, though I don’t think it shows outwardly, are too much. I burned out on chess in the same way, and simply put, passion doesn’t make you any better. You can have all the passion in the world for programming and still be a lousy employee; you can love games more than anyone and still be miserable to be around; you can dedicate your life to one cause and get absolutely nowhere.

Giving an honest effort, certainly. But I find I’m doing just fine with laid-back old me. I don’t think I have many detractors at The VG Press, I can still be as happy as ever playing Sakura Wars, and I seem to be getting pretty consistent praise at work for doing what I considered par for the course. Passion? I passionately deny it.

April 17, 2010

Vim and Emacs – When Programmers Design

Filed under: Computers, Freeware — Zurahn @ 11:06 pm

There are various immortal battles of loyalty — Coke versus Pepsi, Kirk versus Picard, McDonald’s versus Burger King — but the one closest to programmers’ hearts is Vim versus Emacs. Vim and Emacs are text editors dating way back. Vim (as in Vi Improved) is a “newer” version (as in only ~20 years old) of vi, which was originally written for BSD Unix in 1976 and is a staple of terminal environments. Emacs, the brainchild of free-software pioneer Richard Stallman, dates from that same year.

Why are these crusty old terminal programs so near-and-dear to coders? The short answer is that they work. What they don’t do, however, is make themselves accessible. I’ve been jumping back and forth between both over the past few weeks, and simply put, this is what happens when you have programmers design software, and not just write it.

What I mean is that there’s no denying the power of both editors. There’s a reason they’re still used, still maintained, and still invaluable. That said, there is simply no concern for making the applications intuitive, user-friendly, or ready out of the box. If you’re going to make significant use of either, it will require not only the time investment of learning both the basics and the features that make them worthwhile, but also the effort of configuring the software to actually be practical for what you need.

The majority of my programming of late is on PHP-based websites. That means writing PHP mixed with HTML, as well as CSS and JavaScript files. Neither editor handles this properly out of the box, and frankly, there isn’t a good solution for PHP mixed with HTML in either. There are relatively poorly performing add-ons that do the job, but not as well as they should.

One could argue: hey, don’t bloat the software. I may want a feature, but not everyone will. I’m not one for bloat, but there is a trade-off between convenience and leanness. It’s part of the reason I use Opera over Firefox — the features are built in, with no other maintenance required. Given the scope and use of Vim, I don’t think it’s an egregious request that it ship with more universal options out of the box. Emacs is a 40MB download, so I hardly think fleshing out the language support is too much to ask, either.

Let’s be realistic here. It’s programmers using this software. Why on earth do these programs have syntax highlighting disabled by default? Why are line numbers disabled by default? Why does it require additional configuration, even downloads, to get highlighting for popular languages?

We’re programmers, we fiddle, but that isn’t an excuse to leave software in an unfriendly state because you feel justified in saying “deal with it.” I do, and will continue to, use both programs; programmers handling design also means some awesome features — so long as you can find them.

April 15, 2010

GUI Bloopers 2.0 – A brief review

Filed under: Computers, Writing — Zurahn @ 7:37 pm

GUI Bloopers 2.0 is an updated release with modern examples, but the principles remain largely the same. As detailed in the book itself, the same interface problems are somehow timeless.

GUIs, or Graphical User Interfaces, are how most people know computers. It’s the human part of everyday computing that makes things accessible to the average person. We programmers aren’t exactly the average user, so it’s not uncommon to get great programs with terrible usability, spelling doom for the product.

While dry at times, it is a thorough guide to where design goes wrong and how to avoid it. Where Steve Krug’s “Don’t Make Me Think” is brief and effective in its reasoning about how to approach interfaces, GUI Bloopers 2.0 covers every angle in detail, and in some ways does a better job of enlightening the developer.

I may not be able to call it a definitive guide, but it’s one area where I would in fact say that something similar, if not this, should be required reading. It’s not just a matter of GUIs, but how you think about design in general. That is to say, we’re not programming for ourselves, we’re programming for the user — we’re creating for people.

Coming in at 372 pages (not counting appendices), it will mostly read like a textbook to those outside the field. For those in a relevant position, though, it’s got my resounding support.

Take a look at the world from the perspective of usability, and you’ll find it hard to go back.

April 9, 2010

E-mail Security – Still an Issue

Filed under: Computers — Zurahn @ 11:29 pm

While scam e-mails are often fairly easy to spot, it’s not a given that a ruse will be executed in text-only, poorly written English by a supposed Nigerian prince. Malware is big business these days, and e-mail scams are getting more professional. Take this example as detailed by Panda Labs: from start to finish it’s a perfect impersonation of an IRS e-mail, insofar as it can be while requesting details the IRS would never request over e-mail.

The main point I’d like to address is this: on the web, you can look at an address, and if the connection is over TLS/SSL, you essentially have a guarantee that it’s the site you requested (if it says https://mail.google.com, you know you’re at Gmail, for example). The same is not true of e-mail. It’s not uncommon for scam e-mails to be sent from a random free e-mail account, but that’s the equivalent of the blatant Nigerian 419 scam. A more professional e-mail can claim to be from irs.com, whitehouse.gov, bankofamerica.com or anywhere else. E-mail headers are spoofable; you can change any of that information at will, by design. For example, take a look at how easy it is to do in PHP:

$to      = 'sucker@gmail.com';
$subject = 'Your account information';
$message = "Don't be a pussy, we're legit";
$headers = 'From: support@wellsfargo.com' . "\r\n" .
	   'Reply-To: support@wellsfargo.com' . "\r\n";

mail($to, $subject, $message, $headers);

As simply as that, you can send out an e-mail with false information in the header. The headers are never verified for consistency; by the nature of the protocol, they’re a convenience, not an assurance. The reason this does not work with websites is twofold. In general, the server does not request you; you request it. Someone can’t simply spoof a server response and send it to you at any time. It is possible, though, to act as a man-in-the-middle, intercept your request to a website and respond fraudulently. This is a genuine concern at public Wi-Fi hotspots, and less so on private wired networks.

The protection against this is the aforementioned TLS/SSL, an encrypted connection negotiated based on IP address and a certificate verified by a root authority. For example, say Google gets a certificate from VeriSign for a given address. If a man-in-the-middle attempts to intercept the request, the TLS connection will fail and you’ll get a warning. IP addresses cannot be spoofed for a TCP/IP connection (though they can be spoofed in individual packets) because of what’s referred to as the three-way handshake. When you request a connection to a web server, you send a SYN packet containing your IP address. The server responds with a SYN-ACK packet sent to the IP address in the SYN packet; if that address is spoofed, the reply goes to the wrong place and you never receive it. The connection then fails, because to complete the transaction you must respond to that specific SYN-ACK packet with a final ACK packet confirming the connection.
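The reason spoofing fails can be sketched as a toy simulation (hypothetical function and addresses, no real sockets): the server addresses its reply to whatever source address the SYN claimed, so a spoofed client never receives it and can never send the final acknowledgement:

```javascript
// Toy model: the network delivers packets by destination address only.
function attemptConnection(realAddress, claimedAddress) {
    var synAckDestination = claimedAddress; // server replies to the claimed source
    // The client only sees the SYN-ACK if it went to its real address;
    // without it, the final acknowledgement can never be sent.
    return synAckDestination === realAddress;
}

var honest = attemptConnection('10.0.0.5', '10.0.0.5');  // completes
var spoofed = attemptConnection('10.0.0.5', '8.8.8.8');  // never completes
```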

Why not TLS for e-mail? Well, again, web pages are received upon request, whereas e-mails are unsolicited. There is, however, the option of asymmetric e-mail signing such as OpenPGP, which allows you to encrypt or sign your e-mails, with the signature verifying the identity of the sender. This is not part of the e-mail specification, though, and its limited adoption limits its usefulness.

Furthermore, it’s not just responding to e-mails that is a problem; it’s doing anything with the contents, including following links or downloading attachments. Thanks in great part to Adobe’s poor security practices combined with the proliferation of their products, malicious PDF and Flash documents have become a primary vector of attack. There has been a slew of vulnerabilities of late in Adobe Reader (with Foxit Reader often also affected) that allow execution of arbitrary code, meaning that just by opening a PDF document, you can be infected. This is largely, though not entirely, thanks to Adobe’s ridiculously stupid decision to include JavaScript as part of the PDF format, meaning you can have scripting in PDFs. You can, and should, disable this in your settings (or better yet, not use Adobe Reader. Ever.).

Flash is embedded in web pages and will only become a larger attack target as time goes on. Following links can lead you to sites with malicious Flash files embedded, again meaning arbitrary code could be executed without your knowledge or approval. The further behind the latest version you are, the more likely you are to have an exploitable issue.

The attack vector isn’t necessarily some random person, either. As detailed in a recent Security Now episode, just because an e-mail is from someone you know doesn’t mean you can trust it, even if the header is not spoofed. One of the most lucrative assets a blackhat can get his hands on is an e-mail account. If someone gets into the e-mail of a friend or family member, he can use the information there to try to coerce you into sending money. Their life is contained in that account, likely years’ worth of personal information; from that, it isn’t difficult to be convincing.

Lastly, e-mails themselves can include HTML and display like webpages. In Outlook, for example, you get a preview pane such that if you just click the title of an e-mail, the contents are displayed immediately below. Convenient, but a terrible idea. Again, web-pages can contain exploits. You can, and should, switch to text-only display of e-mails.

It’s almost a cliché to warn about e-mail and the dangers of attachments, but it’s more of a problem now than ever. It’s not just “don’t download and run .exe attachments” anymore.

E-mail safety in summary:

  • Do not trust any unexpected e-mail
  • Do not trust e-mail headers
  • Switch to text-only mode in e-mail
  • Verify via other means any request for money, even if you know the person
  • Never open unexpected attachments
  • Do not open spam
  • If you open spam, do not reply

All that, or, you know, switch to Linux.

January 6, 2010

An Ugly Side of Web Development and Design

Filed under: Computers — Zurahn @ 11:02 pm

I’ve got time to kill, so indulge my ramblings.

For those who don’t know the world of web development, it probably doesn’t work the way you’d think. There are three aspects to web development: presentation, logic and data. Presentation is HTML, CSS, JavaScript — anything run on the client computer; logic is the server side deciding what to do when you visit a page; and data is whatever’s stored on the server to actually serve the user.

The tricky thing is that anything done on the server side can be called programming, and JavaScript is a programming language, but the primary pieces on the presentation side — HTML and CSS — aren’t programming languages; they’re markup languages. Browsers follow rules for how HTML and CSS are supposed to look when used in specific ways, but sometimes those rules are ambiguous, sometimes the browser is wrong, sometimes the behaviour is unspecified, and sometimes the browser just hasn’t implemented the functionality.

This leads to some weird and unique problems and solutions. For example, there is a generic container element in HTML called a DIV tag. You can put anything you want in a DIV, and you can use CSS to style it so it has, say, a black background and white text.

Well, if you say “height: 100%”, that doesn’t work as cleanly as one would hope. If you take a look at The VG Press forum, you’ll see the user information in a sidebar on the left with a darker grey background and a border to the right. Because you can’t just set the height of the sidebar to 100%, how do you get a perfectly flush border and background like that? You have to do a clever, silly and stupid workaround wherein the DIV that contains the sidebar DIV is given a background IMAGE containing both the background colour and border, repeated vertically only. Weird, but it works.

Let the chain reaction begin, though, as a big feature I’m working on is full customization of presentational aspects including colours. Remember that the sidebar background and border is an image, so you can’t just change the colour — what then?

After several weeks of not knowing how to solve this, I now have a solution. In PHP (a server-side programming language), you can create images. Thus, the way around it is to generate a new sidebar image with the specified colours and cache the image on the server. If the user specifies a sidebar background colour of #FFFFFF (white) and a border colour of #FF0000 (red), then when the user saves his settings, the server receives the request, generates the image, and saves it on the server as FFFFFF-FF0000.png. If another user chooses the same colour combination, that file is used; otherwise a new one is generated. Automatic, transparent, efficient and completely obtuse. Gotta love it.
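The caching logic itself is simple enough to sketch (illustrative JavaScript with made-up names; the real version would draw and save the PNG with PHP's GD functions):

```javascript
// Derive a cache filename from the two colours, and only "generate"
// the image when that combination hasn't been seen before.
var imageCache = {};
var generatedCount = 0;

function sidebarImage(background, border) {
    var name = background.replace('#', '') + '-' +
               border.replace('#', '') + '.png';
    if (!imageCache[name]) {
        generatedCount++;        // stand-in for drawing the PNG on the server
        imageCache[name] = true;
    }
    return name;
}

var first = sidebarImage('#FFFFFF', '#FF0000');  // generates FFFFFF-FF0000.png
var second = sidebarImage('#FFFFFF', '#FF0000'); // reuses the cached file
```

The filename doubles as the cache key, so two users picking the same combination share one generated file.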

December 13, 2009

Fixing Ugly Web Fonts in Ubuntu

Filed under: Computers, Freeware — Zurahn @ 9:00 pm

In Ubuntu and its related distributions, font rendering has gotten good enough that you don’t have to do much tweaking. However, if you prefer sharp fonts, there may be some nagging ugliness while web browsing, and no matter how much you adjust the settings, it won’t get fixed. That’s because the web browser and desktop settings aren’t the problem.

So, what’s going on? It turns out Ubuntu by default installs versions of typical Microsoft fonts, such as Arial, Verdana, Courier, etc., and they all look like crap. For whatever reason, they do not anti-alias properly and end up with ugly, blurry, asymmetric edges. By just getting rid of these fonts, everything immediately prettifies by falling back on the default free serif and sans-serif fonts.

What we’ll do is move the font folder containing the fonts to a backup folder in your home directory.

sudo mv /usr/share/fonts/truetype/msttcorefonts /home/[USERNAME]/backup

Restart your web browser and behold a much prettier Internet.

December 7, 2009

The Linux Wanderer

Filed under: Computers, Freeware — Zurahn @ 12:06 am

Once again, I have moved to another distribution of Linux. I had last installed Ubuntu 9.10 with the Kubuntu packages, but it was time to move on. Why, exactly? Well, that’s the thing: I don’t know. That is, I don’t know what the problem was, but I sure couldn’t fix it.

Despite even inquiring on Super User, the inexplicable issue of files suddenly reverting never did cease, despite multiple hard drives, filesystem checks, startup variations, etc. Concluding with a sudden and inexplicable loss of microphone functionality, it was time to call it quits.

So I have since given Xubuntu a go with the ext4 filesystem (in lieu of ReiserFS, which I had been using previously). While no doubt helped along by my own greater experience, this has been the smoothest transition yet. Everything has just worked, and very quickly to boot.

Xfce Desktop

Xubuntu is based on the Xfce desktop environment, which is meant to be a lightweight alternative to Gnome and KDE. It’s not so trimmed down as some others, though, and works as something of a midrange solution. It provides simple modularity and customization.

Unlike Gnome’s, the task manager on the panel works vertically, and compared to KDE I’ve found it particularly snappy and responsive. While it would once have felt like a step back, it now manages to be fully functional, though perhaps missing a little flair.

Here’s hoping this one can stand the test of time.
