Tuesday, October 13, 2009

Without Open Source, Free Software Is Just A Bunch Of Freetards!

Alright, I decided to cover one of the previous articles in more depth. In this truly stupid article, entitled Without Free Software, Open Source Would Lose its Meaning, Glyn Moody tries to make the case for a hard-line stance on 'Software Freedom.' Here are the lulzy results.

NOTE: The italicized text contains quotes from Moody's article (which in turn quotes Richard Stallman).

The only reason we have a wholly free operating system is because of the movement that said we want an operating system that's wholly free, not 90 percent free.

Open source exists because of a refusal to compromise by the creators of free software programs. The “pragmatism” that Matt lauds is only an option for open source because the people who did all the hard work in creating free software refused to compromise initially.

Bullshit! The first GNU project was not an operating system kernel. It was a compiler! (Although one can make the case that it was a text editor.) Subsequent projects included shells and various Unix utilities, but they all ran atop proprietary (or BSD/MIT) operating systems. While the overcomplex HURD project was faltering, a truly 'free' operating system emerged from the Linux project, which was originally only free for noncommercial use! Talk about no compromises!

Ten years ago, Stallman pointed out the dangers of compromise:

If you don't have freedom as a principle, you can never see a reason not to make an exception. There are constantly going to be times for one reason or another there's some practical convenience in making an exception.

What, you mean like the exception you made to working on top of proprietary operating systems while developing your visionary vaporware microkernel?

Compromise is a slippery slope: once you start down it, there are no obvious places to stop. This plays right into Microsoft's hands: its current strategy is to dilute the meaning of “open source” - classic “embrace, extend, extinguish” - until it becomes just another marketing buzzword, applied routinely, and ultimately with no real value.

So what? You may ask. If, as Matt writes, the whole point is “to go mainstream”, then such blurring of the line separating free software from non-free software is surely a small price to pay to achieve that wider use of open source. It might seem so in the short term, but I don't believe it's a wise strategy in the long term, even from a purely pragmatic viewpoint.

Ahh, the classic slippery slope fallacy rears its ugly head again! The obvious place to stop is when the community decides the perceived drawbacks to further cooperation outweigh the perceived benefits.

Moreover, if the term “open source” becomes devalued, coders and users will become disillusioned, and start to desert it. The former will find the sharing increasingly asymmetric, as their contributions are taken with little given in return

If the coders do not see the value of contributing to a specific project, then they will either find a new project or take the existing code (one of the fundamental definitions of open source is the right to fork) and start a new project. This is what happens right now!

(something that may well happen even to open source companies using the GNU GPL if they demand that contributors cede their copyright, as most currently do).

This problem already happens, and it is dealt with. Even the GNU project insists on copyright assignment, and even they have changed the terms of their licenses occasionally. I do remember that there were a lot of complaints about GPLv3.

But, of course, the point is not “to go mainstream”: as Stallman said, it's about having “freedom as a principle.”

If the point is not to go mainstream, then how do you think freedom will be spread?

And because this is how he fights for freedom, without compromise, he is prepared to do and say things that people in the pragmatic world of open source find regrettable – shocking, even. That's partly because it inconveniently makes their job of “going mainstream” harder, and partly because of a genuine distaste for some of Stallman's actions. But what they overlook is that freedom fighters – for that is how Stallman regards himself – have always been so focussed on their larger goals that mundane matters like convenience and good manners tend to fall by the wayside.

Wow! Even Linux Journal admits that RMS is a terrorist!

Ultimately, the reason that free software cannot compromise is because we compromise over any freedom at our peril: there is no such thing as 50% free.

There is also no such thing as 100% free (at least not in the realm of "Free" Software). Even "Free" Software places limits on what someone is free to do with it.

As history teaches us, freedom is not won by “going mainstream”, but by small numbers of stubborn and often annoying monomaniacs that refuse to compromise until they get what they want. The wonderful thing is that we can all share the freedoms they win, whether or not we helped win them, and whether or not we can live up to their high standards of rigour.

The monomaniacs may lead the way to freedom, but they always do so with the masses right behind them. Yes, even the hardline groups have to make SOME compromises. If Free Software evangelists cannot win over the masses, then Free Software is doomed to shrivel up just as hundreds of other ideologies have done.

Hey! You don't have to take my advice. You go right ahead with your Free Software Foco Theory. I will just sit back and laugh all the way to the release party for Windows 8.

Monday, September 28, 2009

Rants and Laughs 12

Since I am lazy, I will provide you with a lazy update.

  • If you are a luser, then, according to Linux Reddit, the last usage example on the chown Wikipedia page will send you into fits.
    $ chown -R us base

    Isn't that a riot?

  • Apparently some freetard feminists are angry about Shuttleworth commenting on the fact that Lunix is hard to explain to women. Freetards and Feminists! These sound like my kind of people!

  • Matt Asay, in his article Free Software Is Dead. Long Live Open Source! drops a bomb on the freetards by arguing that it is open source, not free software, that needs to survive. Of course, the freetards tear him a new one for that.

  • A Reddit luser asks where lusers go to buy music. Uh, Mr. Freetard, lusers don't buy music. That is what P2P is for. Haven't you ever heard that "information wants to be free?"

  • Another luser asks the community about Linux compatible laptops. Dude, just buy a Macbook!

  • Well, Ubuntu Karmic Koala is almost here, and it is shaping up to be a pretty uninspiring release. Here are ten notable things about the new Ubuntu, and none of them mention the title that sounds almost as funny as Masturbating Monkey.

  • Here is another LJN article. It asks if Microsoft still has an open-source strategy. Sure it does! Microsoft's open-source strategy is to find out how it can make money from, or despite, open source. That is its strategy.

  • The wingnuts at Boycott Novell are at it again! Apparently, Richard Stallman has been the recipient of some well-founded shit-flinging recently, and BN has had it up to here with that guff! The attacks mostly seem to revolve around rms's screed a few months ago about Mono and Tomboy. Jeez! Get a life, people!

Well, that is all for now. I have a few more in-depth comments in the pipes, so check back soon!

Thursday, July 23, 2009

Taking Out The Trash Operating System

It is finally time to talk about one of the most hated design decisions (among many worthy candidates) of the Unix (and friends) operating system: irrevocable file deletion. You see, when you delete a file using, say, the rm command, Unix simply removes the directory entry; once the last link to the file is gone, the inode and its data blocks are marked as free. This means that, as far as Unix is concerned, the file is gone forever!

Modern operating systems, such as Tenex, have included a 'garbage can': when you 'delete' a file, it goes into the garbage can, and if you accidentally deleted a file you need, you can go in there and get it back. Files are only permanently deleted when the user explicitly 'empties' the garbage can or the system needs more disk space (at least that is how it works on Windows).

In a (mostly futile) attempt to be more user friendly, the GNOME and KDE devs looked at the trashcans every OTHER operating system used and decided to come up with their own half-assed implementations. Eventually, in an amazing display of cooperation, they even decided to standardize on the same directory. And people say Linux devs can't cooperate! Of course, since this functionality is not built into the operating system but is instead a matchsticks-and-glue implementation that rides on the desktop environment, it has some problems.

First, it only works with programs that explicitly support it. Programs like rm or any GUI tool that is just a wrapper over shell tools (i.e. most of them) will still delete the files as usual. So, sometimes you can retrieve your important file, and sometimes you can't.
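For what it's worth, the trash the desktops settled on is not magic: it is just a pair of directories plus a metadata file per trashed item. Here is a minimal "send to trash" sketch in Python, assuming the freedesktop-style default location of ~/.local/share/Trash (name-collision handling is omitted):

```python
import os
import shutil
from datetime import datetime

def send_to_trash(path, trash_root=None):
    """Move a file into a freedesktop-style trash instead of unlinking it."""
    trash_root = trash_root or os.path.expanduser("~/.local/share/Trash")
    files_dir = os.path.join(trash_root, "files")
    info_dir = os.path.join(trash_root, "info")
    os.makedirs(files_dir, exist_ok=True)
    os.makedirs(info_dir, exist_ok=True)

    name = os.path.basename(path)
    # The .trashinfo file records where the file came from and when it was
    # trashed, so a desktop tool can offer "restore" later.
    info = "[Trash Info]\nPath={}\nDeletionDate={}\n".format(
        os.path.abspath(path), datetime.now().strftime("%Y-%m-%dT%H:%M:%S"))
    with open(os.path.join(info_dir, name + ".trashinfo"), "w") as f:
        f.write(info)
    shutil.move(path, os.path.join(files_dir, name))
```

Of course, rm and anything else that calls plain unlink() will never go through a function like this, which is exactly the problem.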

Second, since the trashcan is just a regular directory, it can quickly fill up, and if the user does not manually empty it regularly, the hard drive fills up as well. Just like an extremely lazy person, Linux doesn't take out the trash even if the whole house is filled with shit!
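Monitoring it yourself is trivial, which makes the lack of any built-in monitoring all the more embarrassing. A quick sketch, assuming the usual ~/.local/share/Trash location, that totals up what is rotting in there:

```python
import os

def dir_size(path):
    """Total size in bytes of every file under path, like du -sb."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            if os.path.isfile(full):
                total += os.path.getsize(full)
    return total

trash = os.path.expanduser("~/.local/share/Trash")
if os.path.isdir(trash):
    print("Trash is hogging %.1f MB" % (dir_size(trash) / (1024.0 * 1024.0)))
```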

While writing my last article, I came upon an error when I quit Avidemux; it read something like "Unable to create Avidemux log. Filesystem is full." Leaving aside the fact that it was creating a log while it was exiting, this was worrisome. Well, I had been copying a lot of stuff onto the hard drive a few days ago, but I had deleted most of it. I then decided to open my garbage can and empty the trash. Lo and behold, 3 GB of space opened up instantly!

Well, class, what have we learned today? First, not having a two-step deletion process can cause tens of millions of dollars of damage. Second, not integrating the trashcan leaves the user unsure if he can recover his files if he accidentally deletes them. Third, not monitoring the pseudo-trashcan means one can run out of space very quickly.

Class dismissed!

Wednesday, July 22, 2009

Lights! Camera! No Action!

I am sorry about the lack of posts. Linux just has not interested me in a while. Anyway, today we will take a wonderful look at the world of video editing under Linux. Now, Linux's suckiness when it comes to video-editing is well known, but I am not talking about professional video editing. Today, I am going to look at how Linux handles a very basic feature.

Here's the deal. I have a bunch of porn videos. I want to trim out the crap to leave only the good stuff. The usual way to do this is to split the video into a bunch of clips and then assemble the good clips into a movie. This is easy with Movie Maker, and it is slightly less easy with iMovie. Now let's try it on Linux...

NOTE: I am using Ubuntu 8.04 x86-64 LTS for this comparison, since it is the Linux system I have installed. All the packages come from the Ubuntu repositories. I tried to use every video editor I could find in the Add/Remove section.

The main contenders are the following.

  1. Avidemux

  2. Kino

  3. Kdenlive

  4. Open Movie Editor

  5. Pitivi Video Editor

Let's see how each stacks up.

NOTE: If you want to follow along, you can find the video I am using here. For those with the IQs of lusers, the site is definitely NSFW! The link is down.

FURTHER NOTE: GIMP, being the wonderful program that it is, decided it needed some screen time in some of my images. I am too lazy to redo them, so you will just have to cope. Ah, the wonders of Open Source!

EVEN FURTHER NOTE: Anyone who complains about the movie being WMV can feel free to go back to linusporn.com and pretend the rest of the world uses Ogg Theora.


Avidemux

Alright, let's fire it up!

Okay, this looks decent enough. Let's click "Open" and select "Chrissy.wmv".

Well, this one actually works! Sorta....

Now, since it plays the video properly, it should be able to copy it properly too. Let's just select a portion of the video with the A & B icons. It is actually quite user-friendly. Let's just leave the Video and Audio options as "Copy". It should now create a new WMV file (I named mine chrissy2.wmv). Now, let's open it!

Well, it looks great, but there is no sound! Well, I don't really want to fuck around with audio options, so let's continue on.


Kino

Kino looks like it is a great, simple video editor .... for camcorders!! However, since we are desperate, we might as well try it.

Let's click "File->Open" shall we?

Well that's no good! Still, we might as well try to import it. Who knows? It might work!

Since PAL is for Eurofags, I will choose NTSC. You choose whatever you like.

Oh gosh! It failed! Who woulda thunk?!


Kdenlive

There is no way in hell I am bringing down all the KDE dependencies for a stupid video editor!

Open Movie Editor

As usual, let's fire it up.

You figure it out!

Alright, drag the Chrissy.wmv onto the Video and Audio tracks. Select a clip. Then click on Project->Render.

Choose to render it as a Quicktime file (the only option, of course). Then click Encode. After several minutes it should give you this.

If you are not following along, you can simulate the image by rocking your head back and forth really fast. It seems to have recorded two frames and looped them for several seconds. At least the audio is working!

Pitivi Video Editor

Last but (barely) not least, we come to the Pitivi Video Editor. Let's fire it up!

Cool! We can just drag Chrissy.wmv onto the Clips window! Drag & Drop is so awesome .... for 1995!

Well, it seems to support the file, but now what? Apparently, the Pitivi developers care very much about the GNOME Interface Guidelines, and they have been reading up on how to write a GNOME application. Pitivi is so simple (READ: stupid) that it only supports merging clips into one video, not splitting a video into clips! This would have been useful for the second phase of the operation, but it is useless without the first!


Let's summarize. Ubuntu, the most popular Linux distribution among 'ordinary' (i.e. not super-freetarded) users, features several video editors, none of which can make an acceptable movie clip! I am not trying to splice special effects into my Hollywood film reel here! I am just trying to cut out the boring parts of pornos! All I really needed was an editor suitable for editing home movies, and Linux cannot even provide that!

What else is new?

Monday, June 8, 2009

Rants and Laughs 11

It is time once again for another humorous look at the freetard ecosystem.

  • Some Zemlin guy has proclaimed this the Week of the Linux Desktop. Let's look at his claims in depth, shall we?
    Smartbook or Netbook; Common Denominator is Linux
    Apparently, these 'smartbooks' (i.e. a cross between a netbook and a smartphone) use Linux a lot. Remember all of the Linux-based netbooks two years ago? Remember how they all run Windows XP now?
    Instant on runs on Linux
    Yeah, I have already covered this.
    Better Audio and Video Support
    Dude, Real has supported Linux for years now. Big deal.
    Palm “Pre” makes a splash with a Linux based Smartphone
    Dude, I thought Palm was dead. This sounds like another Hail Mary Desperation Pass.
    Intel Buys Wind River
    How is this relevant? Wind River is not a desktop manufacturer! Plus, their product line included BSD/OS and VxWorks!
    In short, Dell has released a Linux laptop. Hooray!

  • Apparently, some people find SELinux extremely complicated. But it makes things so secure!

  • Here is a nice tutorial on how to understand a feature of Unix that infected plenty of other operating systems: symbolic links. Of course, Chapters 8 and 13 of the Unix Haters Handbook also cover this material.

  • Here is some really good Linux Hate.

  • It has been nine months since Chrome was released on Windows, and the open source 'community' has still not succeeded in releasing an acceptable version for Linux. This is just disgraceful, guys. Doesn't the CATB propaganda say that, once you release the source code, legions of flosstards will descend on your codebase and port it to Linux? WTF?!

  • Here is something that involves yours truly. I asked ESR's opinion of a comment I found on Linux Hater's blog; his response mostly evaded the question. What do you think of the matter?

So, that is all for now. I will try to write another in-depth article on some freetardery soon.

Sunday, June 7, 2009

New Blog

Well, here is my new blog! Feel free to make yourselves at home.

P.S. Is there a good way to move all the comments from the old blog to the new?

Sunday, May 24, 2009

Beyond Crap

Okay, apparently some freetard who used to work at Microsoft has come out with a book called After the Software Wars that, among other things, touts the 'virtues' of open-source software. Well, I have not read much of it (and it will likely stay that way), but I did read a few sections, and they provided a bunch of unintentional comedy.

GC solves portability issues because programs written in languages such as Java, C#, Python, etc. are no longer compiled for any specific processor. By comparison a C/C++ executable program is just a blob of processor-specific code containing no information about what functions and other metadata are inside it.

If all code written for the Macintosh was written in a GC programming language, it would have been zero work for Apple to switch to the Intel processor because every program would just work!

LAWL!! Okay, where do I begin to sort out the idiocy? First, GC, or garbage collection, has little to do with "Write Once, Run Anywhere." What he is thinking of is a virtual machine. VM-based languages often feature garbage collection, but the two features can exist separately. Many implementations of D generate native code even though the language is garbage-collected, and LLVM is a virtual machine that (I think) lacks support for garbage collection. Even if all OS X software had been written for a virtual machine, it would have been a serious undertaking to both write and then port a virtual machine that gives acceptable performance (look at Sun's efforts to make Java not run like crap). Alright, let's read some more.
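To spell out what garbage collection actually is, here is a toy demonstration using Python's gc module. (Python happens to be both VM-hosted and garbage-collected, but the collector itself is purely a memory-management mechanism, not a portability one.)

```python
import gc

class Node:
    def __init__(self):
        self.ref = None

a, b = Node(), Node()
a.ref, b.ref = b, a   # build a reference cycle
del a, b              # the names are gone, but the cycle keeps both objects alive
collected = gc.collect()  # the cycle collector finds and frees the garbage
print("collector reclaimed", collected, "objects")
```

Nothing in there has anything to do with running the same program on PowerPC and x86; that trick comes from shipping bytecode for a virtual machine, which is a separate design decision.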

Apple’s second kernel wasn’t built from scratch, but is based on Berkeley Software Distribution (BSD) Unix code. This code is a lot like Linux, but with a smaller development community and a noncopyleft license agreement. That Apple is depending on a smaller free kernel community, and yet doing just fine, does say something about free software’s ability to deliver quality products. The BSD kernel is certainly much better than the one Apple threw away after 20 years of investment!

Unfortunately, in choosing this software, Apple gave life support to a group who should have folded their code and resources into Linux. The free software community can withstand such inefficiency because it is an army of millions, but, from a global perspective, this choice by Apple slowed progress.

Oh snap! I bet that {will,has already} start{,ed} a few flame wars. This is even funnier since it comes a few pages after he jerks off to the hundreds of Linux distributions that freetards have created.

When I visit coffee shops, I increasingly notice students and computer geeks purchasing Macs. Students have limited budgets and so should gravitate towards free software. If Apple doesn't support free software, their position in the educational market is threatened.

Apparently, this guy has not met any students recently. Sure, they are strapped for cash; this is why they pirate the shit out of everything! If free software can compete with pirated commercial software, then it stands a chance. Otherwise, nada.

Many computer geeks buy a Mac because of its Unix foundation.

The stupid, it burns!!!

In the terminal window of both the Mac and Linux, you type “ps -a” to see the list of processes.

Oh, wow! It has ps! That is, like, so awesome!

(Windows doesn't support the Unix commandline tools.)

Oh noes! Apparently, this guy has never heard of Cygwin, MinGW or even Microsoft Windows Services for Unix.

Apple has good Unix compatibility only because their programmers never took it out while doing their work. It was never any goal of the Mac OS-X to be appeal to geeks — Apple just got lucky.

Yes, they are so lucky to have that 0.1% of their market. He saves the best for last.

After having been a long-time Windows user, and a 100% Linux user for 3 years, I tried out the Mac OS X for a couple of days. Here are some impressions:

Prepare to have your freetard socks rocked!

● A Mac OS has more code than ever before, and a lot of it is based on free code, but it doesn't have a repository with thousands of applications like Linux. There are several third party efforts to provide this feature, but none are blessed or supported by Apple. The Mac comes free with iPhoto, but they really want me to buy Aperture for $159, which they tell me just added 100 new features! Apple ships a new OS every year, but you don't get free upgrades — it costs $140 even to upgrade from OS X 10.4 to 10.5.

First, it seems like Apple now releases a new OS every two years. Next, most Apple users don't care about 95% of the crap in those repositories, so Apple does not want to spend the money needed to maintain a high quality repository.

● Many of the Mac's UI details like how to maximize windows, and shortcut keys, are dis-similar to Windows. Linux, by contrast, feels much more natural to a Windows user. Every time you double-click on a picture, it loads the iPreview application that stays around even after the window displaying the picture is closed. How about just creating a window, putting the picture in that window, and having it all disappear when I close the window? I tried to change the shortcuts to match the Windows keystrokes, but it didn't change it in all applications.

Then you need a window bar along one of the sides of the screen. I will admit it is a bit weird, but the solution seems better than cluttering up the interface.

● The Mac feels like a lot of disparate pieces bolted together. The desktop widgets code has its own UI, and it doesn't integrate well into the OS desktop. The Spaces is a clone of an old Unix feature and doesn't implement it as well as Linux does. (For example, there is no easily discoverable way to move applications between spaces.)

Linux does not?! Linux IS a lot of disparate pieces bolted together.

● As mentioned above, the Mac doesn't support as many of the Microsoft standards as Linux does. One of the most obvious is WMA, but it also doesn't ship with any software that reads DOC files, even though there is OpenOffice.org and other free software out there.

At least the functionality exists! On Linux, you need to download some potentially illegal codecs to even play MP3s!

● It is less customizable. I cannot find a way to have the computer not go to sleep when the laptop screen is closed. The mouse speed seems too slow and you can only adjust the amount of acceleration, not the sensitivity. You cannot resize the system menu bar, nor add applets like you can with Linux's Gnome.

That is funny. I cannot make my Linux laptop GO to sleep when the lid is closed.

These are just a few of the 'insights' you can find in this amazing tome. If you want, you can buy the book from Amazon, or you can send him a small donation. Make it a penny.

Monday, March 9, 2009

Rants and Laughs 10

Alright, it is time once again to see what is going on in the Linux 'community.'

  • Here is a good rant about bug triagers. Yes, freetards, even if a bug is years old, you should not close it without testing to make sure the problem is actually fixed!

  • Distrowatch has a good vindication of LH's argument that most Linux distros could be achieved by simply reconfiguring another distro. The amount of wheel reinvention going on is amazing even to me!

  • Lusers have been flipping out over someone suggesting ten ways Ubuntu could improve. Most of these suggestions, especially the inclusion of a media center, are pretty good fucking ideas, which means lusers are going to get pissed at anyone bringing them up. A particularly freetarded response tried to take the author to task for his suggestions and just ended up making a fool of himself. Most of his nitpicks boil down to 'oh, there is this utility you can install that kinda sorta does what you want, but you first have to know that it exists and then jump through hoops installing and configuring it', or 'that feature is in development and will be available in Zesty Zebra', or 'media center? we don't need no stinkin' media center!' He ends with a bang, though,
    So I’m unimpressed. Ubuntu already has the majority of those features (or a close-enough analogue), that guy failed miserably in doing his homework before posting that, and even the things that Ubuntu doesn’t have are Linux/GNOME/KDE/Nautilus/Dolphin deficiciencies, not Ubuntu problems.
    Yes, that loser totally did not do the proper fifty hours of research to make Linux do what he wanted. He just sat down and expected it to function properly, the moron! Plus, all those problems are the fault of the ISVs not the fault of the distro, whose job it is to take all the various pieces of software and integrate them into a polished, cohesive whole. The freetard is strong in this one!

  • A reddit luser asks what idiot designed the GTK File Dialog. As usual, comments are required reading.

  • Linux Kernel, a.k.a. Erotic Pickled Herring, has been released. Way to show the world how mature you are, guys!

  • A luser asks why do you use Linux? He mostly seems to like Ubuntu's nice file dialogs, its resistance to viruses, Gedit and Grsync. So basically, you like it because of a text editor and a syncing application. Dude, just buy a Mac! You will like its interface, Time Machine and TextMate better. You know why I use Linux? Because it is the only way to keep up on the hilarious stuff you freetards have come up with to torture yourselves and the idiots you convince to join your cult.

  • Some luser thinks a bunch of cheap (mostly proprietary) office clones spell the death of Microsoft Office. Yeah, in your dreams, lusers! What you lusers don't seem to understand is that people are perfectly knowledgeable about alternatives, and if they are provided with a cheaper alternative that still lets them do what they want to do, they will switch in droves. OpenOffice is an inferior product to Microsoft Office, and people who value their productivity more than $200 will find it cheaper to remain with Microsoft. You mention the recession, but you forget that recessions also cause investment funds to abandon wild-eyed schemes and focus on profitability, and last I heard, RedHat was the only major open source distro that was profitable.

  • TechRadar has a good post on how to make your LUG not suck. I was once part of a LUG at my university. It fell apart after several meetings, once it became clear that few people were interested in Linux.

  • Apparently, multihead support is still broken in Linux. Business as usual, I know.

  • Here is another article detailing stupid tricks you can do with the bash shell. I almost did not want to post this, but the opening is just too good not to reproduce here.
    If you've ever used GNU/Linux, chances are good that you've used bash. Some people hold the belief that using a GUI is faster than using a CLI. These people have obviously never seen someone who uses a shell proficiently.
    Yes, I have had to use Bash. Yes, I also regret all the time I spent learning to put up with its bullshit.

  • Some luser is pissed that he cannot remove Evolution without removing the entire Ubuntu desktop! The failure goes all around this time. First, there is the luser himself who wants to replace a broken email client with one that appears to have died several years ago and nobody noticed. Attention Luser! Install Thunderbird, delete the Evolution icons from your Applications bar and auto-launcher (you can do that in the wonderfully configurable GNOME desktop, can't you?) and get on with your life. We're talking about <5MB here! Then, there is the distro itself, which takes all the thousands of claims by lusers, "well, Ubuntu is better than Windows because of its customizability", and shoves them up its ass!

  • Here is a Reddit discussion on how to read the contents of RAM in a human-readable way. The thread itself isn't very funny or enlightening, but one of the (probably serious) comments certainly is.
    Or you can write a progam in C that traverses the ram, writes it to a file and then use a hex editor (Emacs in hex mode, for example M-x hexl-mode to look through that.

Soon, I may make another tutorial for you guys (although no one gave ANY feedback, positive or negative, on the last one). Until then, have fun recompiling your new kernel!

Friday, March 6, 2009

Myths about Linux

Okay, here is another really stupid luser site. This one claims to 'debunk' the top 10 'myths' (i.e. facts) about Linux.

NOTE: Well, apparently it was created in 2005 (stupid Linux Reddit; this isn't new), but I did not realize it until this rant was mostly done. It doesn't matter much, because lusers are still making the same claims, and they are still totally full of shit. It is rather amazing and quite sad. Let's give a response.

Myth 1: Linux is too difficult for ordinary people to use because it uses only text and requires programming.

The truth: Although Linux was originally designed for those with computer expertise, the situation has changed dramatically in the past several years. Today it has a highly intuitive GUI (graphical user interface) similar to those on the Macintosh and Microsoft Windows and it is as easy to use as those operating systems.

Having a GUI does not automatically make Linux easy to use. The GUI has to be designed with the user's needs in mind, and this is something that lusers have demonstrated an inability to do well.

No knowledge of programming is required.

Wow, I do not need to know how to code quicksort in Intercal to browse the web! This is almost Mac-like friendliness!

Moreover, once people become familiar with Linux, they rarely want to revert to their previous operating system.

So why are all those netbooks being returned?

In some ways Linux is actually easier to use than Microsoft Windows.

In some ways, you don't know what the hell you're talking about.

This is in large part because it is little affected by viruses and other malicious code

Yeah, LH already covered this.

system crashes are rare.

This has been covered too.

Myth 2: Linux is less secure than Microsoft Windows because the source code is available to anybody.

The truth: Actually, Linux is far more secure (i.e., resistant to viruses, worms and other types of malicious code) than Microsoft Windows. And this is, in large part, a result of the fact that the source code (i.e., the version as originally written by humans using a programming language) is freely available. By allowing everyone access to the source code, programmers and security experts all over the world are able to frequently inspect it to find possible security holes, and patches for any holes are then created as quickly as possible (often within hours).

You forgot to mention that giving access to the source code also allows lusers who don't know what they are doing to seriously fuck things up.

Myth 3: It is not worth bothering to learn Linux because most companies use Microsoft Windows and thus a knowledge of Windows is desired for most jobs.

The truth: It is true that most companies still use the various Microsoft Windows operating systems. However, it is also true that Linux is being used by more and more businesses, government agencies and other organizations. In fact, the main thing that is preventing its use from growing even faster is the shortage of people who are trained in setting it up and administering it (e.g., system engineers and administrators).

Really. If there were such a serious demand for Linux sysadmins, I think the 'shortage' problem would have been solved by now. There seems to be no shortage of expert lusers on the 'net.

Moreover, people with Linux skills typically get paid substantially more than people with Windows skills.

The reason lusers get paid more is that it takes a lot more skill and work to manage a *nix system. Unix has been called an Administrator Full Employment Act.

Myth 4: Linux cannot have much of a future because it is free and thus there is no way for businesses to make money from it.

The truth: This is one of those arguments that sounds good superficially but which is not borne out by the evidence. The reality is that not only are more and more businesses and other organizations finding out that Linux can help reduce the costs of using computers, but also that more and more companies are likewise discovering that Linux can also be a great way to make money. For example, Linux is often bundled together with other software, hardware and consulting services.

Yes, that is all well and good, but what is the business model if you want to sell software? Not everything can fit under the 'services' umbrella. If you have to depend on hobbyists, you are screwed.

Myth 5: Linux and other free software is a type of software piracy because much of it was copied from other operating systems.

The truth: Linux contains all original source code and definitely does not represent any kind of software piracy.

Linux may not represent piracy, but it still copies. Linux is a copycat of Unix, and most of the Linux GUIs are half-assed clones of Windows.

Rather it is the other way around: much of the most popular commercial software is based on software that was originally developed at the public expense, including at universities such as the University of California at Berkeley (UCB).

WTF? Are you talking about Windows 95's TCP/IP stack? That was thirteen years ago!

Myth 7: There are few application programs available for Linux.

The truth: Actually, there are thousands of application programs already available for Linux and the number continues to increase.

I think you mean there are thousands of crappy applications available for Linux. How much of that shit is actually worth using?

Myth 8: Linux has poor support because there is no single company behind it, but rather just a bunch of hackers and amateurs.

The truth: Quite the opposite: Linux has excellent support, often much better and faster than that for commercial software.

What was the last commercial app you used?

There is a great deal of information available on the Internet and questions posted to newsgroups are typically answered within a few hours.

The same is true for both Windows and OSX.

Moreover, this support is free and there are no costly service contracts required.


Also to be kept in mind is the fact that many users find that less support is required than for other operating systems

Just who are these users you're talking about? You and your freetard friends don't count.

because Linux has relatively few bugs (i.e., errors in the way it was written) and is highly resistant to viruses and other malicious code.

Oh boy! Most problems normal people have with software are not the result of glitches. Many users don't even understand the most basic concepts about computers. To develop a product they can use, you have to provide a clean, consistent interface that is well-documented (so they can look stuff up), popular (so they can get help from their friends), and has the smallest possible configuration space (to minimize the knowledge necessary to use the product). Linux has none of these things.

Myth 9: Linux is obsolete because it is mainly just a clone of an operating system that was developed more than 30 years ago.

The truth: It is true that Linux is based on UNIX, which was developed in 1969. However, UNIX and its descendants (referred to as Unix-like operating systems) are regarded by many computer experts as the best (e.g., the most robust and the most flexible) operating systems ever developed.

There is a group of people who would take issue with that.

They have survived more than 30 years of rigorous testing and incremental improvement by the world's foremost computer scientists, whereas other operating systems do not survive for more than a few years, usually because of some combination of technical inferiority and planned obsolescence.

Unix did not survive because of technical merits. It survived because it was simple (hence portable) and given away freely to universities. The best systems are not the ones best suited to survive; the worst ones are.

Myth 10: Linux will have a hard time surviving in the long run because it has become fragmented into too many different versions.

The truth: It is a fact that there are numerous distributions (i.e., versions) of Linux that have been developed by various companies, organizations and individuals. However, there is little true fragmentation of Linux into incompatible systems, in large part because all of these versions use the same basic kernels, commands and application programs.

HAHAHA!! I can't believe this luser is saying, "well, because all the distros have mostly all the same apps, Linux cannot be called fragmented." They forget that this fragmentation makes it really fun for IHVs and ISVs to support Linux. Not to mention that the fragmentation makes technical support quite a challenge. The Linux community is tiny enough as it is, and yet it is broken up into dozens of little distributions.

Rather, Linux is just an extremely flexible operating system that can be configured as desired by vendors and users according to the intended applications, users' preferences, etc.

Ahh, the fallacy of choice rears its ugly head!

In fact, the various Microsoft Windows operating systems (e.g., Windows 95, ME, NT, CE, 2000, XP and Longhorn), although they superficially resemble each other, are more fragmented than Linux.

You are forgetting that four of those systems have been EOLed and are no longer supported. CE is quite different from the rest, but it is not really considered when talking about the desktop. Anyway, even though the systems are different, they look roughly similar to the average desktop user, and the APIs are similar enough that there is a decent (not great, but decent) chance of an application written for Windows 95 running on Vista. The major differences on the Windows platform are the DOS/NT kernel, Start Menu/Vista shell, various IE versions, and various DirectX versions. Do you really think this compares with Linux and its mass of shells, X11 servers, window managers, desktop environments, graphical toolkits and sound systems? Please.

Moreover, each of these systems is fragmented into various versions and then further changed by various service packs (i.e., patches which are supplied to users to correct various bugs and security holes).

Oh come on! Microsoft releases a service pack every two years or so. Ubuntu, the most popular desktop Linux distro, releases an entirely new version every six months. Do you really want to be making this comparison?

Myth 11: Linux and other free software cannot compete with commercial software in terms of quality because it is developed by an assorted collection of hackers and amateurs rather than the professional programmers employed by large corporations.

The truth: Linux and other free software has been created and refined by some of the most talented programmers in the world

It takes more than programming talent to develop quality software. Alan Cox once said, 'Linus is a great programmer, but a horrible engineer.'

Moreover, programmers from some of the largest corporations, including IBM and HP, have contributed, and continue to contribute, to it.

However, most of these major companies are supporting Linux's development as a server. They don't seem to care much about Linux's use on the desktop.

Myth 12: Linux is free at the start, but the total cost of ownership (TCO) is higher than for Microsoft Windows. This has been demonstrated by various studies.

The truth: A major reason (but not the only one) for Linux's rapid growth around the world is that its TCO is substantially lower than that for commercial software.

Oh, I just have to hear these reasons!

(1) the fact that it is free

as in it costs nothing (except bandwidth)

(2) it is more reliable and robust (i.e., rarely crashes or causes data loss)

Windows has made major strides in reliability as well. Solaris is also a (free) contender.

(3) support can be very inexpensive (although costly service contracts are available)

As mentioned before, you can get free support for Windows as well (with about the same quality). Also, have you seen the price tag on some of those service contracts? Damn!!

(4) it can operate on older hardware and reduce the need for buying new hardware

So can Windows 2000. Also, low-end desktops are going for $400 nowadays. Servicing old hardware and replacing old parts is likely to be more expensive than buying a new low-end PC every 3-4 years.

(5) there are no forced upgrades

There are no forced upgrades on Windows either. Bill Gates doesn't point a gun at your head and order you to buy a new copy of Windows. Companies upgrade because their system is no longer supported. The same thing is true for every major Linux distro with the exception of Debian stable; Ubuntu LTS releases are only supported for 3-5 years.

(6) no tedious and costly license compliance monitoring is required.

I admit that is a valid point. So out of six points, you have two valid ones. You are doing better than most lusers. However, I highly doubt the difference in sticker price and license enforcement costs makes up for Linux's TCO problems.

A major reason provided for the supposedly higher TCO of Linux is that Linux system administrators are more expensive to hire than persons with expertise in Microsoft products.

This is definitely a major issue for companies, but you are forgetting a major problem: Linux desktops tend to have lower productivity than Windows desktops. The transition to a service economy has replaced capital-intensive enterprises with labor-intensive enterprises. When a business's biggest cost is its employees' salaries, productivity issues are incredibly important. Assume all the desk clerks in a company are worth $30 an hour ($0.50 a minute) and work five eight-hour days; also assume that the cost of one Windows Business license is $280. If Windows Vista gives them a twenty-minute per day productivity boost over Ubuntu, then within six weeks Vista will have more than paid for itself.
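For anyone who wants to check my arithmetic, here is the same back-of-the-envelope calculation as a quick script. The $30 wage, twenty-minute boost, and $280 license price are the hypothetical figures from the paragraph above, not survey data:

```python
# Payback period for a Windows Business license, using the
# hypothetical figures from the paragraph above.
hourly_wage = 30.00            # dollars per hour per desk clerk
minutes_saved_per_day = 20     # assumed Vista-over-Ubuntu productivity boost
license_cost = 280.00          # one Windows Business license
workdays_per_week = 5

value_per_day = hourly_wage / 60 * minutes_saved_per_day   # $10.00 per day
weeks_to_payback = license_cost / (value_per_day * workdays_per_week)
print(f"The license pays for itself in {weeks_to_payback:.1f} weeks")
```

That works out to 5.6 weeks, i.e. "within six weeks," as claimed.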

Well, I think we have now debunked the REAL Linux myths.

Wednesday, March 4, 2009

How To Write an X11 Application

Linux Hater has not posted any tutorials for a while, so I will do it. Here is my first tutorial; it shows you how to write an X11 application (even a server, especially a server).

  • Be sure to switch the meaning of client and server for no good reason. This makes your app seem avant-garde!

  • Make sure your application follows the ICCCM to a T but is still unable to copy-paste properly.

  • Furthermore, include at least two methods to copy and paste just to cause confusion.

  • Include massive amounts of conflicting code to handle fonts that, at the end of the day, only handles DejaVu Sans properly.

  • Don't bother supporting multiple monitors. Nobody uses those anyway!

  • Make sure your app does not need hardware acceleration to function properly. Save that for the wobbly windows!

  • Alright, maybe your app can use hardware acceleration. But don't you dare try talking to the driver directly and not paying your performance toll to the X server.

  • Okay, maybe you can talk to the driver directly. Or at least you will be able to... eventually.

  • Rewrite your app constantly, but make sure not to fix the major underlying problems.

  • Make sure that users can use your app over the network, even though they never will.

  • Tell at least three people that X-Windows has nothing to do with Microsoft. Because they care.

  • Do not provide a user interface. Instead, provide an API (or, even better, several layers of APIs) that allows users to create their own interface. This gives lusers the ability to make half-assed clones of better-designed interfaces for your app.

  • Reassure yourself that, even if your app sucks, at least it sucks on twenty different platforms!

Next time, I will show you how to write an X11 Window manager.

UPDATE: If you like the tutorial, then please digg it. Let's show all the lusers how X11 application writing is really done!

Monday, March 2, 2009

Rants and Laughs 9

Well, it has been quite a while since I've posted one of these, so it is time once again for another Rants & Laughs section, where I review the goings-on of the luser community at large.

  • Well, it looks like Linux is having some problems with battery life. Apparently, one luser reported that he could only get 75% of the battery life that he could under Windows. I am sure the new tickless kernels will fix everything.

  • Of course, what Linux really needs is another web browser. Never mind that there are already thirty of them, and they all suck in different ways. This bunch of freetards can certainly do better than the freetards of the past!

  • Here is a good list of things you need to know in order to use Apt properly. I am glad that FLOSS is so easy to use!

  • CNet thinks Microsoft should fear Ubuntu's cloud computing efforts. Suuuurrreeee!!!! Folks, let's be honest with ourselves here. Cloud computing is nothing more than the same old thin-client song-and-dance dressed up for Web 2.0. The problems with cloud computing are the same as the problems with traditional thin clients: the client is useless without a net connection. Consumers are going to drop cloud computing the first time their net connection fizzles. It is snake oil. Move along.

  • LinuxJournal proclaims that Frozen Bubble is better than MS Solitaire. Wow! Freetards have topped a fifteen-year-old card game! This is truly the year of the Linux desktop!

  • Some freetardette wants drivers for the Eye-Fi card and says the response 'no one uses Linux' is not good enough since she uses it. Okay, Ms. Freetard, here is a response you might like better: "No one uses Linux except you and a handful of other lusers. The rest of the market (i.e. 99.1%) does not care about Linux support. Now, please take your complaints somewhere else; we are trying to make money here."

  • Another luser blog asks: how green is Linux? Well, not as green as it should be, considering the aforementioned battery life issues and the miserable ACPI support on many motherboards.

  • The freetard from the previous post is back and whining that vendors should brand 'Linux compatible' on their hardware, so that the 0.91% of the market that runs Linux can more easily tell whether they are wasting their money or not.

  • Well, apparently some luser tried to get QuakeLive working on Ubuntu. Sure, you have to copy some DLLs from Windows XP, which means you need a legal copy of Windows XP, but it works, right? Well, sound and input work, but there is no video, which is not a big deal if you are blind! Yeah, that is definitely 95% working.

  • Finally, here is an example of open source development done right. Ubiquiti Networks is offering 5 prizes totaling $200,000 for the development of a GUI for their Router Station. I bet they will get some good projects back.

Wednesday, February 25, 2009

What Game Are You Playing?

Okay, I have been sitting on this article for a while, but I needed some time to recover from the huge, gaping maw of freetarded insanity contained in this article. Whew! Well, better late than never!

He starts off with the standard luser masturbatory moaning: "Ohh, OHHH, Linux, you're so ..... PORTABLE!!! MMM!!! Your license is so, OOOOHHHHPPPEN!!"

Finally, he gets to the meat of his freetarded idea.

Here's the idea: All PC Games should first be built to work with the GNU/Linux Universal Operating System.

My eyes must be deceiving me. Let's see this idea again.

All PC Games should first be built to work with the GNU/Linux Universal Operating System.

WTF?! This is considered an idea?! It sounds like nothing more than the wish-fulfillment fantasies of a demented freetard (probably because it is)! What kind of two-bit justifications does this luser have for such an insane suggestion?

The game would simply have an installer that would install GNU/Linux on the host platform and to enable the gamer (sic) to be played on the host. An example of this ... is ... called wubi (Windows-based Ubuntu Installer). The wubi enables users to install GNU/Linux as a program into the Windows OS.

Now, far be it from little ol' freedom-challenged me to question a plan such as this one, Great Lunix Evangelist, sir, but it seems like there are some problems you have not considered. For instance, this WUBI only provides a way to install Linux onto a preexisting Windows partition. The user still has to run Linux stand-alone and face all the driver difficulties that result.

Since GNU/Linux is Universal, this could open up the game to just about any platform because the user would simply use the game installer to install GNU/Linux along with the game to their system.

With the power of Linux, you can run Crysis on your cellphone!

Running games in this fashion would put an end to the need for PC game makers having to port their games to different host Operating Systems because all games would be built to work in the GNU/Linux Universal Operating System.

Yes, let's solve all of our porting problems by targeting the operating system with a 0.91% marketshare!! Great idea!!

Using this type of system would revolutionize the PC gaming industry, and broaden the market for the game because it could run on many different types [of] platforms. Increasing the availability of the games would equate to increased sales of the games.

Just how big are those other platforms anyway? To reach ~95% of the desktop market, you only have to port your game to two platforms: Windows XP/Vista and OSX. Chasing after Linux will just cause you to wind up like Loki. There are certainly ways to improve PC gaming, but targeting Linux is not one of them.

It's sort of like the example of RAMBUS RAM vs. SDRAM. Since SDRAM was a more open standard than RAMBUS, more hardware mfgrs were able to make SDRAM, and so it became cheaper and more widely used, to the point that it snuffed out RAMBUS altogether.

Yet Linux has been freely available for nearly 18 years, and it still has a shitty marketshare. Something tells me that your metaphor has some problems.

Another example would be Henry Ford's mentality of making cars more affordable and selling many more cars than when they were only available to the rich.

This is relevant to Linux, how?

This method of making games would also help to protect gaming systems from becoming obsolete, which would be beneficial for both the gamer and the game maker.

Because you never have problems running old applications in Linux!

It is articles like this one that remind me why I do this.

Errors? What errors?

We all know that Linux is a 'hacker's operating system,' which means it is only friendly to fat, pimply men hopped up on Twinkies and Mountain Dew who last showered when their Mom forced them to three weeks ago.

Here is a good example of the differences between a normal person and a freetard.

When normal people have computer problems, they generally take their computer into Best Buy and let the Geek Squad figure out what the problem is.

Lusers, on the other hand, mostly end up having to personally fix their systems when they break, since they are compelled both by their need to maintain their reputation in front of the rest of the basement-dwelling lusers of the world and by the fact that nobody else gives a fuck about Linux.

Even if others did care about supporting Linux, which distro(s) should they care about? Even if Best Buy supported the top 5 distros on DistroWatch, there would be a mass cry of preteen voices, and lusers by the dozens would write thousands of furious, barely legible blog posts whining that their insignificant Ubuntu mod was not supported.

Of course, not everything is peaches and ice cream on this 'hacker's operating system.'

One of my readers has kindly provided us with an example of what a luser has to go through to fix his system. Let's take a glimpse of this sad, pathetic world for some cheap laughs, shall we?

I like what you are doing. We Linux geeks need a dose of honesty and reality in order to improve. Public humiliation is sometimes effective, but we Linux geeks are good at disregarding the opinions of the ignorant masses.

My favorite Linux issue: the secret hidden error messages that many Linux apps produce (or not).

When some Linux app suddenly disappears from the screen, I normally just utter "fucking Linux" and start it up again. But sometimes I have the temerity to actually go looking for the problem in the numerous error log files. This is usually a waste of time, because one of the following is true:

  1. There is no message, at least not in any of the places I know where to look, or findable within the time I am willing to spend.
  2. Something that may be relevant can be found, but the message is incomprehensible (probably a leftover debug trace from a programmer).
  3. There are hundreds of messages in the log file (with no time stamps) and I give up trying to find anything relevant to my problem.

A window manager like Gnome should pop up a message box anytime some app writes to stderr or gets a segmentation fault. This is apparently too much of a bother for the Gnome geeks to implement. After all, they already know where to look when things go wrong, or they always run their apps from a terminal window so they can see stderr output and other debug traces.

Here is the (hidden) .xsession-errors file from my current session. I can see why it is hidden and why the Gnome geeks do not want this shit popping up in my face: it is too embarrassing.

/etc/gdm/Xsession: Beginning session setup...
Setting IM through im-switch for locale=en_US.
Start IM through /etc/X11/xinit/xinput.d/all_ALL linked to /etc/X11/xinit/xinput.d/default.
Window manager warning: Failed to read saved session file /home/mico/.config/metacity/sessions/10cd57d1a46f90bd05123554431253037400000057900018.ms: Failed to open file '/home/mico/.config/metacity/sessions/10cd57d1a46f90bd05123554431253037400000057900018.ms': No such file or directory
Failure: Module initalization failed

** (nm-applet:5933): WARNING **: No connections defined
seahorse nautilus module initialized
Initializing nautilus-share extension
Initializing diff-ext

(gnome-panel:5926): Gdk-WARNING **: /build/buildd/gtk+2.0-2.14.4/gdk/x11/gdkdrawable-x11.c:878 drawable is not a pixmap or window

** (nautilus:5927): WARNING **: Unable to add monitor: Not supported
javaldx: Could not find a Java Runtime Environment!

(soffice:6243): Gtk-WARNING **: GtkSpinButton: setting an adjustment with non-zero page size is deprecated
Nautilus-Share-Message: Called "net usershare info" but it failed: 'net usershare' returned error 255: net usershare: cannot open usershare directory /var/lib/samba/usershares. Error No such file or directory
Please ask your system administrator to enable user sharing.

If you did not understand a word of that, consider yourself a sane, well-adjusted individual. Otherwise, there is still hope. You can break the addiction to freetardism, and a good place to start would be reading through my archives and the archives of Linux Hater.
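For what it's worth, the reader's pop-up wish is not even hard; it could be prototyped in a few lines. This is only a sketch, not something Gnome actually ships: the log path is the one from the letter, the keyword filter is my own guess at "interesting," and a real version would raise a desktop notification (e.g. via notify-send) instead of printing.

```python
import os
import time

def is_interesting(line):
    """Crude filter for log lines that look like trouble."""
    lowered = line.lower()
    return "warning" in lowered or "error" in lowered or "failed" in lowered

def watch(log_path, notify, poll_seconds=0.5):
    """Follow a growing log file (like tail -f) and surface suspect lines."""
    with open(log_path) as log:
        log.seek(0, os.SEEK_END)        # only report lines written from now on
        while True:
            line = log.readline()
            if not line:                # nothing new yet; wait and retry
                time.sleep(poll_seconds)
                continue
            if is_interesting(line):
                notify(line.strip())

# watch(os.path.expanduser("~/.xsession-errors"), print)  # runs forever
```

Of course, the hard part was never the code; it was getting anyone to agree that users should see these messages at all.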

Tuesday, February 24, 2009

Crash & Burn

Lusers always try to make up for Linux's other shortcomings by saying 'well, at least Linux doesn't bluescreen every thirty minutes', since their last experience with Windows was in the 95 days. They are right about one thing: Linux never bluescreens; it simply locks up.

Well, Windows has come a long way since XP was introduced, and the only lock-ups I personally have experienced have been game-related. Ubuntu also has a habit of crashing on me while I am trying to play some games, especially using Wine.

Now, I know some of you lusers are going to quibble over technicalities. "Oh, well Linux, the kernel, did not really crash; X11, the application, crashed. To fix it, you merely have to press CTRL-ALT-F1 to get to a ghetto-ass console; enter your user name & password; type 'sudo su - '; type kill -s 9 `ps -A | grep gdm | awk {'print $4'}` ; type /usr/X11R6/bin/gdm; type 'exit'; type 'exit' again; press CTRL-ALT-F7, and then login normally."*

These lusers are completely missing the fucking point. Leaving aside the fact that it is probably faster to just reboot, this quibbling betrays a lack of understanding of the common user. To most people, if the keyboard and mouse are not responding, then the computer has crashed. Period. To hide behind technicalities is simply deceitful, and only Micro$0ft lies, right? Riiighhttt.

*Note: I know some lusers are going to pop in and critique my commands: 'no, you put the quotation marks around the brackets, not vice-versa,' or 'no, it's /usr/bin/gdm -someoptionthatshouldbethedefaultbuttheauthorisaluser'. To those people, I say, get a life!

Wednesday, January 28, 2009

More Grafix!!

Lusers are always screechin' that they could write kick-ass Linux drivers for hardware if the designers would just open their specs. Nowhere is this heard more than in the arena of graphics cards. For years, Linux zealots begged the big companies to release specs, promising that kickass drivers would be written for them in no time.

Well, along comes Intel, who believes the hype and releases the source code for their graphics drivers. It has mostly worked out, and there have been few issues (though I have still heard people report suspend issues with Intel cards). However, Intel GPUs have a reputation for being simple and not meant for high-end performance.

Well, that is just dandy. Now, AMD/ATI comes along and says, "well, if Intel is going to give in to a <1% marketshare OS, then we will too!" They opened up a bunch of their docs and let the freetards loose on them. Some of these docs have been available since September 2007. Most of them have been available for nearly a year.

Let's check up on the progress, shall we?

Wow! Support for textures and pixel shaders is at best mostly done on R500 and later cards. Those are hardly necessary for modern games (or any 3D application), though. Are there any other problems?

The following subsystems have not been implemented yet or show some limitations:

  • 3D acceleration is only implemented on R5xx and RS6xx upto now. Also no XVideo on newer chips (needs 3D engine for scaling). Still, fullscreen video is working fluently with shadowfb for many users. An experimental 3D bringup tool is now available for testing.

  • No TV and Component connector support so far.

  • Suspend & Resume isn't completely tested, but works on a variety of hardware. Your mileage may vary. Note that typically you need some BIOS workarounds on the kernel command line, ask your distribution for that.

  • No powermanagement yet. Depending on your hardware, the fan might run at full speed. This turned out to be really tricky.

Well, the freetards have managed to create a little 3D acceleration, but they have not managed to fix the major problem with the fglrx drivers: bad ACPI support. It only took them 15 months to do it! I feel freer already!

You Get What You Pay For

The following paragraphs are from a rant an anonymous contributor sent to me. To comply with his request, I have edited it where I deemed appropriate.

There are two major problems that Linux faces concerning its spread on the desktop:

1.) Applications

2.) Drivers

Neither problem will change in the near future, so 2009 will not be the year of Linux on the desktop. Neither will 2010. But what is the real problem? The real problem is ignored by those who are "in charge". What do I mean by that?

1.) Applications

Desktop users need commercial applications. That's just the way it is. The extra ten percent of features that make an app usable for your average desktop user are the ten percent that every developer hates; those features are hard and boring to develop, and implementing them is just no fun. You need to pay developers to implement them.

Do you really think that something like Photoshop Elements is going to be created by the community? My father, whose hobby is photography, shelled out 70€ for PE. He does not regret it, even though the activation is a PITA. Why? It just works: it works with his camera; he gets results fast; there are a bunch of tutorials and books available, etc.

In Linux we are stuck with Gimp. Sorry, but no cigar! PE-calibre software will NOT be created by the community. It just takes TOO much manpower, TOO much work. No one is coding that in his spare time.

This kind of software will also not be created by an open-source company. There is no business model that would support the effort. Just imagine if Adobe released PE as open source - you think that people would still shell out 70€ for a boxed version? Nope, people would just copy it. There are some exceptions. For example, Mozilla receives its money from Google, not from its users.

Sun supports OpenOffice, but it still has issues. The spell-checker sucks even in the 3.0 version (at least for German). BTW, you can buy an add-on for OpenOffice: the Duden-Spellchecker. It is closed source and costs approximately 25€. Apparently, Sun bundles it with StarOffice, so if you buy a boxed version of StarOffice, you will have a proper spell-checker.

See? That is another typical example of the difference between open and closed source software. Will a great spell-checker be created by hobbyists in their spare time? No! It is a repetitive, boring task, and coding skill alone does not suffice. You also need language experts. Will they work for free? No! Then who will pay them?

Other examples are nice fonts, video-editing software, audio-production software (Cubase or Logic created as open source by the community? Come on!), handwriting recognition, OCR, home-banking software and so on. For each of the programs mentioned, there is a half-assed open source clone, and none of them is taken seriously by people who really work in the respective field. Can Gimp replace Photoshop/PE? Can Ardour replace Logic/Cubase? Is there an alternative to Adobe Acrobat? I don't think so.

Anyway, there are two conclusions you can draw:

1.) It is simply not true. Gimp rocks, and I have to relearn everything I know, and I am not willing to change, and it is all my fault that I have problems, and FLOSS basically rocks. Anyway, the makers of the distribution have provided me, from their repositories, with every software program I will ever need.

2.) You should try to make it _easy_ for ISVs to target Linux as a platform.

Apparently lots of Linux users choose #1 and write long, screeching blog posts about the benefits of apt-get. Unfortunately, the major players are not listening. If you are an ISV, shipping software for Linux is not worth your time and resources: either you test your binary, like Opera does, against zillions of distributions, or you do not ship a Linux version at all.

Even for open-source developers, this state of affairs sucks. Take for example Anki, which is one of my favorite tools for learning a foreign language (I never said that open-source apps will never work). Anki is basically the effort of one developer. It is a nice application, but it is also not as "big" as a full-blown commercial application (i.e. those that you must shell out 70€ for). Basically, Anki is donationware. Apps like Anki, which were shareware ten years ago, are now usually developed as open-source+donations, and it halfway works. However, these applications are usually neat yet small tools: 7zip, Anki, Miranda, etc.

The developer of Anki does not have the time to test his cross-platform application against zillions of distributions. For Linux, there is only an Ubuntu package. However, this guy develops fast, and there is usually a new version every month (or sometimes every couple of weeks). The _one_ Windows binary works on all versions of Windows. The _one_ MacOS binary mostly works on all major versions of MacOS.

There is no package for the distribution I am currently running, OpenSuse. It is not in the repository, and if a software package is not in the repository, the user is _lost_. What are his options? Should he recompile every time a new version is released (sometimes every two weeks), because make uninstall is known to just work? I tried to alien the Ubuntu package, but it does not work. I also tried to compile the program manually, but the compilation failed because of some weird dependency problems with Qt which I could not understand. In fact, it is easier to ship a piece of software for a Hackintosh than for Linux. Think about that.
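For reference, the alien attempt looks roughly like this. This is a sketch, not a recipe: the file names are hypothetical, and whether the resulting package actually installs depends entirely on how your distro names its dependencies.

```shell
# Hypothetical session: converting an Ubuntu .deb to an .rpm for an
# RPM-based distro such as OpenSuse. File names are for illustration only.
sudo alien --to-rpm --scripts anki_x.y.z_all.deb
# alien emits something like anki-x.y.z-2.noarch.rpm; try to install it:
sudo rpm -i anki-x.y.z-2.noarch.rpm
# In practice this often fails right here: the dependency names recorded
# in the Debian package (e.g. the Python/Qt bindings) do not match what
# the RPM-based distro calls the same libraries, so rpm refuses.
```

Which is exactly the point: even when the conversion tool runs cleanly, the cross-distro dependency naming problem is still there waiting for you.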

This problem will not change. Lusers do not like to shell out money for applications, and they do not like commercial applications in general. Software makers do not ship anything for Linux because they have no clue what they need to ship. Common users do not use Linux because the commercial applications are not present. Therefore, the situation will not change. Regarding LSB, I think we have covered that already.

2.) Drivers

If you are some independent maker of hardware and want your device to run under Linux, you are basically required to open-source your drivers. Any other option will not work for your users. There is no way you could do something (yeah, I know it sounds like a completely weird idea) like shipping a driver CD with your product. You could also go the NVidia route. That might work if someone really wants your hardware to work under Linux and you are well compensated for your efforts. NVidia is an exception.

Pushing people in this manner to open-source their drivers actually works in the server world. If some Fortune 500 is using Linux on their servers, and they buy 1000 servers with Intel mobos, those boards are required to work. The company shells out a lot of money, so there is a real financial incentive for Intel to open-source their drivers if they want to sell their hardware. This works because Linux has a significant market-share in the server-world.

On the desktop, Linux is not even remotely in a position to make these demands. All the freetards, however, act like the unknown maker of your webcam absolutely _needs_ Linux compatibility for their device to sell. Do you really think they will open-source their drivers _ever_? The freetards are saying, "I am insignificant, yet the world should adapt to me. I will never adapt to the realities of the world!" That attitude just does not work.

There is still no stable driver-ABI, so the driver situation on the desktop won't change. Linux on the desktop? A joke.

Unfortunately, neither of these problems is a technical issue. They are dogmatic issues, and that is why Linux will not take off on the desktop. These two problems have existed for 15 years, and if you install Linux on your desktop today, you still face them.

Fifteen years ago, winmodems were the problem. Today it is wireless LANs. Fifteen years ago, your GDI printer did not work. Today your printer/fax combo does not work. Fifteen years ago, you wanted to install the new Netscape 4.x. Nowadays you want to install Firefox 3, but your distro is not shipping new packages for the next six months.

Linux on the desktop is a joke. Nothing more.

I Don't Want to Go on the Cart . . .

Yeah, yeah. I know it has been over two months since the last post. First I was busy with exams, then I was busy with the Holiday season; now I am back in school. I will try to do better and post something more often. Anyway, this blog is not dead. My hatred of Linux is still red hot!