Logic’s Last Stand

December 7, 2009

The Linux Wanderer

Filed under: Computers, Freeware — Zurahn @ 12:06 am

Once again, I have moved to another distribution of Linux. I had last installed Ubuntu 9.10 with the Kubuntu packages, but it was time to move on. Why, exactly? Well, that’s the thing: I don’t know. That is, I don’t know what the problem was, but I sure couldn’t fix it.

Despite even inquiring on Super User, the inexplicable issue of files suddenly reverting to earlier versions never did cease, despite trying multiple hard drives, filesystem checks, startup variations, and so on. When it all concluded with a sudden and equally inexplicable loss of microphone functionality, it was time to call it quits.

So I have since given Xubuntu a go with the ext4 filesystem (in place of ReiserFS, which I had been using previously). While no doubt helped along by my own greater experience, this has been the smoothest transition yet. Everything has just worked, and very quickly to boot.

Xfce Desktop

Xubuntu is based on the Xfce desktop environment, which is meant to be a lightweight alternative to Gnome and KDE. It’s not so trimmed down as some others, though, and works as something of a midrange solution. It provides simple modularity and customization.

Unlike Gnome’s, the task manager on the panel works vertically, and compared to KDE I’ve found it particularly snappy and responsive. While previously it would have felt like a step back, it now manages to be fully functional, though perhaps missing a little flair.

Here’s hoping this one can stand the test of time.


November 5, 2009

Mandriva 2010 is a Disaster

Filed under: Computers, Freeware — Zurahn @ 1:16 am

Perhaps feeling the need to meet the usual six-month cycle of OS updates, Mandriva released its 2010 distribution for installation or upgrade. The transition from 2009.0 to 2009.1 was a nice improvement, and I’d been looking forward to 2010. However, my advice to all is to stay away from this one.

First up is the upgrade path. You simply have the system download the updates and, after a reboot, you should be good to go with a fresh coat of paint. At least, that’s how it’s supposed to work. Upgrading from 2009.1 to 2010 seems to crash the upgrade program once it’s done, though if you bewilderingly reboot anyway and hope for the best, it will come up as 2010.

From there, the system decided it would, within a minute or two, have a kernel panic and lock up. It was not a fluke nor a specific program; the system would just freeze arbitrarily.

Alright, let’s try the fresh install, then. Easy enough? Well, the boot loader seems to lack the option to overwrite the MBR, so you may have a problem with dual-boot set-ups if you installed Windows beforehand. Additionally, with a secondary hard drive like mine, you may have to adjust your BIOS settings. That said, after restarting a couple of times to get past the pop-ups that load before the keyboard and mouse drivers, I did get in.

Not much is new on the surface of things, aside from it being less stable and some screwed-up font aliasing. Not too terrible, as long as you don’t reboot. If you do, the system may load, or it may get stuck on the never-ending loading screen, which fails to even offer a verbose mode to see what’s happening.

A second attempt at upgrading yielded the best results. No sudden freezing, so it’s at least usable for the first boot. That said, the system is unstable, with X seizing up when changing appearance settings, and upon rebooting to fix it, we hit our old friend the never-ending loading screen.

If you have 2009.1 installed and can easily restore from a back-up, you may want to give it a shot and see; otherwise, anyone shopping for an OS should look elsewhere. Either try 2009.1 or a different distribution altogether — this one’s not ready.

March 7, 2009

DRM is Obnoxiously Stupid

Filed under: Computers, Freeware, Movies, Philosophy — Zurahn @ 9:06 pm

As you should already know, I spend a creepy amount of time on the computer; consequently, my monitor is of much better quality than my television. So after I had recently ordered and received a copy of The Dark Knight on DVD, I decided instead to watch it on my computer.

Another point you should realise is that I run Linux, which isn’t necessarily a friend of the movie industry. Lastly, you must be aware that DVDs contain Digital Rights Management software.

There is a library available in Linux called libdvdcss2 that allows playback of movies with copy protection. However, The Dark Knight happened to have revamped its encryption and decided to just curl up in a ball and say “nu-uh!” to the person who had just legally purchased the film.

Whatever the intended purpose, it was clearly an empty obscene gesture to those who just wanted to watch the movie: the disc obviously contains the film, so it’s not going to be difficult to watch it anyway.

mplayer -sb 2500000 dvd:// -alang en

And it plays just fine. A single command-line invocation to essentially skip the copy protection; the -sb option tells MPlayer to seek past the first 2,500,000 bytes, jumping over the problem area on the disc.

Instead of trying to stop people from using the items your customers have purchased, perhaps spend a little more time understanding that doing so just makes simply downloading the movie the better product.

February 15, 2009

Enemies of Information

Filed under: Computers, Philosophy, Politics — Zurahn @ 8:53 pm

The Internet has very quickly gone from a niche hobby to a revolutionary worldwide connection of information and ideas.  We all started at different points when it comes to our exposure to it, leaving mixed feelings on its place and purpose.  At the heart of the direction of the Internet on every level, there are two diametric philosophies: politicization leading to corporatizing, proprietizing and homogenizing, versus the free software movement pushing toward transparency, collaboration, expression and freedom.

The conflict shows whether it’s on the software side, where GNU and Linux have been growing steadily in quality and marketshare while free and open-source projects such as Audacity, GIMP and Firefox thrive even on Windows itself, or in the realm of legislation, with the debate over net neutrality and the importance of the freedom and anonymity of web users and web content.

So, this story is par for the course, but no less misinformed, misleading and downright wrong than any other on the side of enforcing the unenforceable on the Internet.

The first thing to note about the article is that the purported security risks relate not to the structure of the Internet, but to the passing of information itself.  It’s not that the Internet is insecure; it’s that Windows is insecure.  How many of those 12 million computers were running Linux?  There will never be a perfectly secure OS, but the point is that the vulnerabilities were in software, not distribution.  Hilariously, the article says Conficker succeeded by “easily sidestepping the world’s best cyberdefenses.”

For one, the answer to my earlier question about Linux is zero, because Conficker is Windows-only.  Second, Conficker is a worm, which means it spreads by scanning ports and then exploiting a service; in this case port 445, a known malware hotspot that should be blocked for all incoming traffic unless absolutely necessary.  A single obvious firewall setting stops it easily, and merely passing your connection through a router at default settings will likely do the trick on its own.  World’s best?  It’s not impossible to run a secure Windows machine, just as it’s not impossible to infect a Mac.
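To make that “single obvious firewall setting” concrete, here is a minimal sketch assuming a Linux host with iptables; the two rules are illustrative additions to the default INPUT chain, not a complete firewall policy, and must be run as root:

```shell
# Drop all incoming traffic to port 445 (SMB), the service Conficker
# exploits. These rules are illustrative only; a real policy would
# also persist them across reboots and cover related SMB ports.
iptables -A INPUT -p tcp --dport 445 -j DROP
iptables -A INPUT -p udp --dport 445 -j DROP
```

A home router achieves the same effect by default, since unsolicited inbound connections never make it past NAT.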

The second important point is that their solution never once mentions security in terms of technology or programming.  Security by law enforcement is just absurd.  Perhaps it’s a result of American self-absorption, but it always seems to be forgotten that the Internet is worldwide.  Good luck with that driver’s license methodology in stopping scams from Nigeria.

There are inherent security problems with the architecture of the Internet due to its initial roots (the article actually has that correct), but it is way off base about what those weaknesses are.  The problems are in the public protocols, which have been forced to update; the most obvious example is HTTP, which was designed as plaintext, so SSL encryption was built on top to address its security issues.  Similarly, DNS has never exactly been tamper-proof, hence the push for DNSSEC, an extension designed with security in mind.

What’s holding DNSSEC up?  Most ISPs can’t handle the increased overhead.  Redesigning the Internet would do nothing to change the stubborn western ISPs who have neglected to invest in infrastructure and instead opted to milk the consumer as much as possible.

And ultimately, no matter how you structure the Internet, you have to accept the fact that you can’t ignore the problem of the dancing pigs: most users are going to do what they want, security be damned.  This is inherently and necessarily an operating-system problem if it’s a problem anywhere.  The truth is, the underlying security problem is not in the protocols (security there is only supplementary, at least for something along the lines of a worm or virus infection) but rather PEBKAC, and no matter what you do, that will forever and always be the case.

I can improve the security of how a user interfaces with the Internet by an order of magnitude with one change to their login: don’t run as a super-user (Administrator).  That alone severely cripples the vast majority of existing issues; sandboxing eliminates nearly everything else.  Add continual improvement to phishing and malware reporting in the browsers themselves, and we can do this.

Meanwhile, the underlying philosophical concepts are just as harmful, with the article stating that “users would give up their anonymity and certain freedoms in return for safety.”  Who here has not witnessed the incredible depletion of American freedoms under the guise of security, and its devastating consequences?  Never has the accuracy of Benjamin Franklin’s statement been more evident: “They who can give up essential liberty to obtain a little temporary safety, deserve neither liberty nor safety.”

The Internet is by necessity neutral and anonymous.  A cumulative database of all the knowledge of all of mankind available to every individual connected is already one of the most important progressions in history and can only become more essential with time.  Legislation and propaganda by those who know the least what they’re trying to undermine are not only ignorant, but treasonous; not to a nation, but to mankind.

February 12, 2009

Week Three in Ubuntu

Filed under: Uncategorized — Zurahn @ 1:33 am

Well, I’m about done. Ubuntu has effectively had two results:

1. I like Linux
2. I dislike Ubuntu

The Linux structure I appreciate and am in favour of, but Ubuntu is quite simply unstable. I managed to crash all of X.Org merely by switching desktop effects back on. While no doubt partially attributable to ATI’s crappy Linux drivers, it’s merely the cap on several Gnome freezes. I think when the OS collapses on itself, it’s time to look elsewhere.

I’m going to start fresh now, likely trying the latest Fedora release, while also considering Freespire, SuSE and Puppy Linux. The state of operating systems is much akin to that of browsers a few years ago. There was a point when browsers couldn’t quite get things right; where one would have a feature, it would lack several others. Over time they’ve moved toward the same goals and progressed to the point where it’s hard to go wrong. We can only hope the same happens here.

Two Weeks with Ubuntu

Filed under: Computers, Freeware — Zurahn @ 1:32 am

I have spent the better part of the past two weeks in dereliction of my usual daily use of Windows XP in favour of better familiarizing myself with Ubuntu 8 specifically, as well as Linux in general. Overall it’s been a positive experience, but as with prior experience, it has had its share of issues.

The first thing to note is a subjective “feature,” per se, that I haven’t tested, but in general I seem to have found that Ubuntu is relatively easy on the CPU compared to XP. Everything is quite snappy, though this is partially attributable to the fact that I am running without visual effects.

Perhaps related, web browsing is very responsive, with Firefox running far better than it does in Windows. Unfortunately, the text rendering in my Windows favourite, Opera, is poor, and overall it isn’t nearly as polished, making Firefox my lead choice, which has never been the case in Windows. I have two main complaints, however: one is that scrolling isn’t as responsive on some sites, particularly The VG Press forums, unfortunately; the other is that the Flash implementations are clearly not as polished.

My favourite little trick in Windows is one I got the idea for from Linux: using the run box to open programs. I’d put a shortcuts folder in the PATH variable and fill it with shortcuts, such that I could press WINDOWSKEY+R, enter “firefox”, and up comes Firefox. Well, in Ubuntu I’ve mapped the Windows key to open the run box, and installed programs are available there by default. Unfortunately, I haven’t found a way to create my own run commands as yet.
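One likely approach, though I haven’t confirmed it against Ubuntu’s run dialog specifically: anything on your PATH should be launchable by name, so a personal bin directory of wrapper scripts acts as a set of custom run commands. A sketch (the “hello” command is made up for the demo):

```shell
# Create a personal bin directory and put it on the PATH
# (add the export line to ~/.profile to make it permanent).
mkdir -p "$HOME/bin"
export PATH="$HOME/bin:$PATH"

# A tiny wrapper script becomes a custom run command ("hello" is hypothetical).
cat > "$HOME/bin/hello" <<'EOF'
#!/bin/sh
echo "hello from a custom run command"
EOF
chmod +x "$HOME/bin/hello"

# It can now be launched by name from a terminal, and in principle
# from the run box as well.
hello
```

The same trick mirrors the Windows shortcuts-folder approach described above, just with scripts or symlinks instead of .lnk files.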

It also wouldn’t be a Linux desktop experience without some driver troubles. My SanDisk Sansa was a fight to get working, but eventually I did find the lsusb command, which for whatever reason automatically mounted the device, whereas before it wouldn’t even be detected. An unresolved issue was my printer, which just wasn’t going to cooperate at all. Lastly (and this is more a problem with my PC than with Ubuntu), recording audio wasn’t initially functional and required installing kmix; this is thanks to Sigmatel and Dell deciding to gimp the soundcard functionality.

The important thing here, though, is that the system works. Ubuntu is a distribution focused on creating a user-friendly desktop package, and it’s just about there. It comes with Firefox, OpenOffice, Evolution (an Outlook equivalent), Pidgin (instant messaging), Brasero (disc burning), GIMP and more, which can take care of the average user’s needs.

The trick, though, and a potential dealbreaker, is the handling of new software. A significant methodological difference between GNU/Linux and Windows is the use of a package manager. The way a package manager works is that software repositories offer a selection of software, which populates the package manager’s list of programs. From that list, you select what to install. Of course, there are other ways to install software, including installers (Ubuntu is derived from Debian, so it uses .deb files for installation) and compiling the source code, which is usually in C.

As a heavy computer user, I find this system very convenient. At any point, I can just run a command and have my program; shortly before writing this, I installed Lynx as easily as running sudo apt-get install lynx. Even compiling source code (which is a pain) has the advantage of letting you easily make edits to the program.

The problem is that Ubuntu’s goal is to break into the mainstream market, for which the current set-up is not well designed. Ask the average person what a package manager is and you’ll get a blank stare. Similarly, the command line is an immediate killer. And a perhaps unavoidable issue is that installers need to be tailored to the OS, and there are a lot of versions of Linux.

Lastly, a somewhat ironic issue is that I’ve found the system relatively unstable compared to what I’m used to in XP. While, as I mentioned, the system is very responsive, I have managed to freeze the Ubuntu equivalent of Explorer on a few occasions and have had to restart to resolve it (though I wouldn’t count out there being a better way to fix it).

As time progresses, Ubuntu slowly becomes more and more of a potential mainstream breakthrough. It’s still not there, but enough hassle has been worked out that I would seriously recommend it for light, beginner users. For someone who just needs Internet access, some e-mail and maybe a little music, Ubuntu’s got it well covered, and you don’t have to worry about them trying to download Antivirus 2009.

So what about me? I’ve still got some work to do. As yet I haven’t gotten around to setting up a programming environment, but I wholly intend to. I do still have to settle on an editor; unfortunately, Notepad++ is not available in Ubuntu, so without running WINE or adapting the source code, it’s time to learn a new editor. It’ll likely never be the case that I’ll have abandoned Windows, just as it’ll likely never be the case that Windows application development will be abandoned.

There’s still the chance that Ubuntu or perhaps another distribution could take over the vast majority of my time, but for most users, I don’t think it’s worth the effort just yet.

January 19, 2009

Unintended Consequences

Filed under: Computers, Freeware — Zurahn @ 12:51 am

Being one to constantly install operating systems I have no need for and ultimately no serious intention of using, this weekend I installed the Windows 7 beta. For what little I have tried of it, it is indeed quite nice and very polished. Proper support for Asian character sets certainly eased my worries after the frustration I had trying Vista.

Windows 7 appears to have done what it really needed to do. It completed the change that Vista started: making the OS palatable to the average user and conveying that this is something worthwhile, feels fresh and is a genuine improvement. It looks crisp, the UI is friendly and it installs easily. I’m sure with time I could find annoyances, but in general it’s a win for Microsoft.

However, there is, in fact, a problem. A big one, at least for people such as myself who already know XP through and through but are looking for an update: the one thing Windows 7 made me want to do above all else is try Ubuntu again.

The problem is change, because Windows 7 actually offers some. The fact that Windows 7 has a learning curve for serious users means that the learning curve holding me back from jumping to Ubuntu or another distribution of Linux is now significantly mitigated. I already know that for the vast majority of my computer activity, Linux can provide what I need. The question is whether or not it would be an improvement — and that same question applies to Windows 7.

Microsoft has done what it had to with Windows 7, which is to provide some palpable change, even if it’s superfluous, so there’s no blaming here. Only a warning that as long as I’m changing, it’s tempting to just go all the way.

October 26, 2008

Configuring a LAMP Server

Filed under: Computers, Freeware — Zurahn @ 10:39 pm

This is a tutorial on how to configure a LAMP (Linux, Apache, MySQL and PHP) server from start to finish. For space and simplicity’s sake, it is assumed you already have a partition available and formatted for a Linux installation.

The first task is choosing a Linux distribution. There are many choices available, and their effectiveness varies with your purpose. For large servers, Red Hat is very popular. For our tutorial, we’ll be using Damn Small Linux (DSL).

You’ll need to get the ISO from the DSL download page. Once downloaded, burn the ISO image to a CD. If you don’t have a program to do this, DeepBurner is a great free program.

The CD will be used as a boot disc. It should load automatically, but if it does not, you need to set your BIOS to boot from CD before the hard drive. You get to the BIOS by pressing DEL, F2 or F10 (depending on which BIOS you’re using).

Once DSL has loaded, right-click the desktop, choose to install to the hard drive, and go through the steps.

Restart and load DSL. Start Firefox and go to the XAMPP for Linux page and download the .tar.gz file.

Open a terminal window now, and copy the file to /opt (the following command assumes you saved the file to /root; it could have been saved somewhere else).

sudo cp /root/xampp-linux-1.6.7.tar.gz /opt

Now you can extract the files

cd /opt
sudo tar xvpf xampp-linux-1.6.7.tar.gz
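The flag string deserves a word: x extracts, v is verbose, p preserves file permissions, and f names the archive file. A self-contained sketch of the same operation using a throwaway archive (the /tmp/tardemo paths are made up for the demo):

```shell
# Build a small gzipped tarball to stand in for the XAMPP archive.
mkdir -p /tmp/tardemo/src /tmp/tardemo/out
echo "sample contents" > /tmp/tardemo/src/note.txt
tar -czpf /tmp/tardemo/demo.tar.gz -C /tmp/tardemo src

# Extract it: x = extract, v = verbose, p = preserve permissions, f = file.
# GNU tar detects the gzip compression automatically when reading,
# which is why the command above works without an explicit z flag.
tar -xvpf /tmp/tardemo/demo.tar.gz -C /tmp/tardemo/out
cat /tmp/tardemo/out/src/note.txt
```

The p flag matters here because XAMPP’s scripts rely on their execute permissions surviving extraction.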

Finally, you can start the server

sudo /opt/lampp/lampp start

To check that the server is running, go to
http://localhost

You should get the XAMPP page. Now you need to configure the security for XAMPP. Back in the terminal, run the following command

sudo /opt/lampp/lampp security

This will guide you through setting passwords for the different XAMPP functions. Once this is done, you’ve completed the basic set-up of your LAMP server. Now it’s merely customization and tweaking.

If you need to edit the Apache configuration, the text editor to use in DSL is Beaver, and httpd.conf is located in /opt/lampp/etc/httpd.conf

To edit the file, the command is simply

sudo beaver /opt/lampp/etc/httpd.conf

The same is done with the PHP configuration, which is in the php.ini file.

sudo beaver /opt/lampp/etc/php.ini
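Beaver is fine for hand edits; for a repeatable tweak, sed can flip a directive in place instead. A sketch against a throwaway file standing in for php.ini (display_errors is a standard php.ini directive; back up the real /opt/lampp/etc/php.ini before editing it this way):

```shell
# Work on a throwaway copy rather than the live php.ini
# (the /tmp path is made up for this demo).
printf 'display_errors = Off\nmemory_limit = 128M\n' > /tmp/php-demo.ini

# Flip the display_errors directive in place with GNU sed.
sed -i 's/^display_errors = Off/display_errors = On/' /tmp/php-demo.ini
grep '^display_errors' /tmp/php-demo.ini
```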

Any changes to either file will not take effect until the server is restarted.

sudo /opt/lampp/lampp restart
