Thursday, November 20, 2008

Why Linux Is Not More Secure Than Windows

Alright, every once in a while, I come across a truly stupid Linux article and have to give it a rant of its own. This stupid post goes on to describe how Linux is more secure than Windows. Let's eviscerate this mishmash of stupidity and FUD, shall we?
Since the 1970s Unix has had a proper permission based system.
So it has an old feature. Big deal.
Every computer has an “administrator” account called “root”.  The root account can perform any function whatsoever on the system.

That does not seem very secure. If an attacker can get the root password, the system is completely at his mercy. Plus, Windows NT/2K/XP/Vista has this feature as well.
You have access to one single directory known as your home folder.  To do any task, or for any program to execute any task, outside of your home directory, you will need to give it the root password.
You can set up this feature in Windows XP and especially Vista. Vista makes it easy to install and run as a limited user, and if an action requires administrative privileges, they are only a sudo away. Even in XP it is not terribly hard to create a limited account. I think the default account type is 'Power User', which can install software but is still restricted in some ways.
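And for the record, the Unix side of "only a sudo away" is just as mundane. A minimal sketch (the username, package, and package manager are just examples, not anything from the article):

    # /etc/sudoers (edited with visudo): let alice run anything, after typing her own password
    alice   ALL=(ALL) ALL

    # day-to-day work happens as a plain user; privileges are requested one command at a time
    sudo apt-get install some-package

Which is, functionally, the same dance as a UAC prompt.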
Every file, program, etc.. has a series of three permissions on it.  One for the user, two for the group, and three for world (or everybody).  Each of these series has 3 different types of permissions, read, write, and execute. 
Only 3? Uh, Dude. I think you should Google something like Windows ACL (I just did it for you). You should find a site like this one.
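To spell it out: the classic Unix model is three rwx triplets, and even Linux has had to bolt on POSIX ACLs (getfacl/setfacl) to get anything finer-grained, which is still a far cry from the per-user, inheritable ACLs NTFS has had since NT. A rough sketch (the file and user names are made up):

    $ ls -l report.txt
    -rw-r--r-- 1 alice users 1024 Nov 20 2008 report.txt   # owner/group/other, rwx only

    $ chmod 640 report.txt            # owner rw, group r, everyone else nothing

    # POSIX ACLs: grant one extra user access without touching the group
    $ setfacl -m u:bob:rw report.txt
    $ getfacl report.txt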
Again, the registry, by default can be edited by anyone or any process running.
Since I wasted so much time with Linux, I am quite unfamiliar with the inner workings of the Windows registry. Are you telling me that HKEY_LOCAL_MACHINE can be edited by users of any privilege level? I highly doubt it; otherwise, it would have definitely been listed as a criticism. Wait, I think you are still talking about the default permission thing, aren't you?
Linux doesn’t have a registry, it has a folder which contains configuration files (one file per application) that controls settings for JUST that program. 
Dude, ever heard of Gconf? It has all the features of the registry and all the problems.
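For anyone who has not seen it, GConf is a tree of keys with typed values that you poke with gconftool-2, which should sound awfully familiar. A quick sketch (the key paths here are just examples):

    # read a key, registry-style
    gconftool-2 --get /desktop/gnome/interface/gtk_theme

    # write a key; any process running as you can do this
    gconftool-2 --type string --set /desktop/gnome/interface/gtk_theme "Clearlooks"

    # dump a whole subtree
    gconftool-2 --recursive-list /desktop/gnome/interface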
Because open source software is open to the world, the code has many many more eyes on it.  So bugs and vulnerabilities get patched sometimes two and three times faster than corporations are able to patch theirs.
Yes, when a security flaw is found, the code can quickly be patched, but this is true in the proprietary world as well. The real test is getting the patched binary out to the users. When a major problem is discovered, like the WMF vulnerability a few years ago, Microsoft can move quite fast.
The problem is, the grand majority of users have no idea about computers, software, and technology.  They know what they need to know to perform their tasks and that’s it.
Yes, this is true. This is also something that many lusers don't seem to understand.
With Windows, you go scouring the internet looking for that program that will remove spyware, or help you balance your checkbook, or allow you to talk to friends and family over IM.  This is problematic as most people are unaware of what web sites offer legit, virus free, spyware free, applications that do exactly as advertised (for free or paid for).
Well, you could help them by giving them a link to download.com (I did it for you again). I have heard that they run the software through some checks to prevent uploading malware. It is easier than teaching them how to use Linux.
In Linux it’s a bit different.  There is one place to get a majority of your software, and this same place has the ability to update all your software as well.
Of course, the binary they are downloading is not always exactly the same as what you would get by compiling the code released by upstream. It often contains distro patches, and sometimes those patches cause major security problems; Debian's mangling of OpenSSL's random number generator earlier this year is the canonical example.
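If you want to see how far a distro binary has drifted from upstream, it is easy enough to check yourself. Something like this works on any Debian-derived system with deb-src lines enabled (the package is just an example):

    # grab the distro's source package: the upstream tarball plus the distro's changes
    apt-get source openssl

    # everything the distro added or changed lives in the .diff.gz
    zcat openssl_*.diff.gz | less

    # or, if the package uses a patch system, look in debian/patches/
    ls openssl-*/debian/patches/ 2>/dev/null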
Many distributions use what’s called “Secure Linux”
Uhhh.... Mandatory Integrity Control? It was included by default in Windows Vista, which was released nearly two years ago.  Where have you been?
Again, when you install proprietary software, you never really know who has access to what.  Since the code is closed off, the maker of that software can include any backdoor they wish.
Yes, but if they screw up, the backdoor will be found, and if the backdoor is found, then people will be hesitant about using the software. A software company that needs buyers to give it money to survive will have a vested interest in not screwing its users (at least, not too much). Sure, freeware developers can include spyware as a revenue stream, but this just illustrates the principle of TANSTAAFL.
And unlike the NSA developed SE Linux, this code is held private so no one can review it
There are many ways to find backdoors: running applications through a debugger, monitoring network connections (the big one), etc. Since this potential Windows backdoor was found, it looks like it is possible to find backdoors in closed-source software. It is also possible to include backdoors in open source software; just look at the Underhanded C Contest.
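You do not need source code to see what a binary talks to. A few of the usual suspects (the binary name and address are placeholders):

    # what is this thing connecting to right now?
    netstat -tupn | grep suspicious-app

    # watch every network-related syscall it makes
    strace -f -e trace=network ./suspicious-app

    # or just sniff the wire and see for yourself
    tcpdump -i eth0 -n host 192.0.2.10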

So, basically, these are the security enhancements that Linux has over Windows? Call me a Micr0$0ft $hi11 if you want, but I do not think these 'advantages' outweigh Linux's other problems.

Rants and Laughs 8

Alright, I know I just did a Rants and Laughs a few days ago. It is nearly the end of the semester, and I am quite busy. Anyway, here is some more fodder for all of you.

Well, it looks like the Indrema clone may have a few problems, and it is shipping with Windows as well, but the developer hopes that this will provide a 'stepping-stone for Linux EVO gaming.' Suuuuurrrrre it will. Score one for Linux gaming!

Linux Journal posts a message from a 'clueless Linux user.' One of the major problems with Linux is that it is a piece-meal system, so if someone complains, the organization that receives the complaint can just say 'not our fault; it is the fault of that other organization whose software we use.' Yes, I know he should not have called a magazine, but still.

Some luser asks if there are any good Linux video editors that do not crash. You got me there!

Some peter-puffers discuss the link between opening-up and decline in share price. It looks like open sourcing really is the Hail Mary Desperation Pass of software.

Some luser thinks that good things are on the horizon. WTF?! It just looks like a bunch of distro releases to me. Big fuckin' deal!

Finally, here are two user-submitted rants.
My rant about the "it's free, but download the add-in yourself" mentality:

"In Indonesia, personal internet connection is much worse compared to
Singapore's or Australia, and rarely used due the relatively expensive
tariff and slow speed. So, having a software that require a direct
internet connection is such a burden. For example, antivirus software
that don't have offline update functionality is doomed to fail in most
Indonesia's rarely connected computer user.
That's one of the reasons I use Mint. Instead of painstakingly connecting to my university's internet connection, all I have to do is install it and forget it. MP3, FLV, WMA, you name it. I don't have to use a repository disc or other cumbersome methods.
The same thing applies to Novell's Go-OO fork. After installing Sun's OpenOffice, I found that there's no Indonesian spell checker. Great. There's no extension available on their site either. But every Go-OO release I've tried (and loved) includes Indonesian and many other dictionaries by default. There's also the hybrid PDF export, which should be very useful, plus the presentation minimizer and report builder (developed by Sun, but not included in their own release, strangely), directly integrated into Go-OO. All I have to do is install and forget.

How long will this continue? Even if I had a fast internet connection, I'd prefer releases that include the useful stuff by default, not ones that give no clue how to add it."

Is there any other Linux-related area with this mentality?
Here is another rant by thepld:
Found this gem on Amarok's blog:
http://amarok.kde.org/en/node/570
After some discussion, we have decided to extend Roktober since we are so far away from our goal and we think that maybe part of the problem is not enough promotion, so if we extend a few weeks maybe we can get this going. Not everyone follows the developer blogs, so if anybody missed the blog entry put up by our treasurer regarding Roktober, here are some highlights (or read the full entry):
  • we have seen a huge fall-off in donations from outside the EU and we're wondering why this is;
  • we are planning on giving two prizes this year, so we are giving entries in the drawing based on local currency;
  • towards the end of last year we spent over €2000 to send 12 people to aKademy;
  • the project spent about €1500 on technical and administrative items like server hosting, domain administration, develop resources (books) and hardware;
  • in addition to aKademy, we spent over €3500 attending free software conferences around the world
  • each developer/contributor team member was given two t-shirts. A small thank you for the large amounts of time put into the project by volunteers who are doing this for fun, not profit.

Apparently, they're having problems getting pity money to finance their software. Please note that they spent over €5500 to send people to FOSS conferences. What the hell do they even do at these conferences? Oh wait, I remember: Pat each other on the back about how wonderful free software is, and celebrate the advent of the Year of the Linux Desktop ™. In fact, only €1500 was actually spent on anything relating to actual software development.

I also enjoyed the subtle jab at non-EU countries. I think I know why no one is donating: We actually have our priorities straight and recognize there are more problems facing the world than having a fucking open source media player that can't even do replay gain. And hey, weren't OSS projects supposed to make money by selling support? They must have missed the memo.

Monday, November 17, 2008

Rants and Laughs 7

It is time once again to see what is going on in the freetard community.

  • PC Authority has an exclusive interview with Richard Stallman. Stallman's comments suggest that he is still off his rocker.
  • Freetards have finally mau-maued Adobe into releasing a 64-bit Flash plugin. Reading the release, I get the vague impression that Adobe only did this to shut them up. Um, Adobe, x86-64 is so last Tuesday! Where is my Linux ARM port? I love the reference to my predecessor BTW. 
  • Freetards cheer that the US Navy has embraced open source. First of all, the memo said that the Navy would adopt systems based on "open technologies and standards." This does not necessarily mean that they will adopt open SOURCE.
  • NEWSFLASH!! The Linux kernel has poor documentation! Kernel developers are unsure how to fix this. Linus wants better release notes; some suggest example code for new features (that would be helpful); some suggest scrapping most of the in-tree documentation. If kernel developers want better documentation, they will have to FULLY document what is already there and have the diligence to keep the API stable for longer periods of time. When you can develop a kernel module with your 2-3-year-old copy of Writing Linux Device Drivers or Understanding the Linux Kernel, then I will say the situation has improved. Until then, have fun writing camera drivers with the Video4Linux1 API documentation, aspiring kernel hackers!
  • Happy 25th Birthday GNU! Here are your birthday greetings from washed-up actor Stephen Fry! It has been 25 years (well, it will be on January 5, 2009), and you still have not produced a complete operating system! How are the HURD and that Lisp window manager you wrote about coming along, BTW?
  • Here is a list of all the games natively supported by Linux. Wow, 373 titles! That is like 1/100th of the number of games available on Windows. Also, it kind of fudges the number a little, since it includes every Linux emulator you can run a game on and every single Linux compatible Doom or Quake engine. Linux is truly the next generation gaming platform!
  • Another luser writes that Theora will replace Flash. Yeah right! If you can get Youtube to even support Theora playback, then we will talk. Oh, what's that? There are some issues with Theora that need to be 'ironed out' before it can present a credible threat to Flash? "Despite being supported by Opera and Firefox, Theora has a number of challenges ahead. The first lies in its performance -- both the encoding time and the video quality trail behind the common XviD/DivX-style MPEG-4 ASP codecs, let alone next-generation HD codecs like H.264 and VC-1. " Well, maybe you can get Nvidia to help you out?

  • Here is an article that lists the problems migrating from Exchange to OSS solutions. "One reason is that none of the open-source programs are really ready to serve as drop-in Exchange replacements. There's also some additional work that needs to be done, and it's not work that Windows administrators are used to doing. Even a veteran Linux administrator, though, might find setting up a full-powered Exchange replacement for a good-sized company a challenge. For example, Scalix 11.4 requires Apache, PostgreSQL, Tomcat, and either Sendmail or Postfix to be installed before it can work. That's not hard, but when you factor in the need for managing disk performance it becomes more of a problem. E-mail server applications have trouble scaling because of disk performance bottlenecks. To run a groupware server for more than a small business really requires shared disk arrays. Put it all together and you have a serious Linux system administrator's job, and it's not one that a former Exchange administrator is likely to be able to handle." TCO, the bane of lusers everywhere, has struck again!

Friday, November 14, 2008

Gentlemen, we can compile him. We have the technology.

Today, esr posted his thoughts on the Linux Hater's Blog which, among other things, led to a discussion about binary distribution vis-a-vis source distribution. I will now share my thoughts on the matter.

Gentoo Linux was my second Linux distro (my first was Mandrake). I know source-based distribution can have many benefits especially when the distribution is designed to cater to it as Gentoo is. Before the Ricers descend on us, let me say that I know optimization is NOT one of the benefits; you can -funroll-all-sanity all you want, but it is likely to do nothing, crash the app, or make it even slower.  USE flags, when they work properly, and the options are actually supported, can be a great way to customize a system to suit your particular needs. 

When I was new, I thought this kind of customization was awesome because it let me use only what I wanted to use. Since I ran plain Fluxbox (what fun would Gnome or KDE have been on Gentoo? Plus, Fluxbox compiled quicker), I would disable any support for KDE and Gnome; if I did not, an install would drag in a whole bunch of libraries and take 10-15 hours to compile. I do not want to think about the amount of time I spent messing with various USE variables to reduce the number of 'unnecessary' dependencies for an app I wanted to install. Of course, I would have to remember to add the USE variable modification to /etc/portage/package.use for that particular application, or it might screw things up the next time I updated the system.
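For those who never had the pleasure, the ritual looked roughly like this (the package and flags are only an example of the sort of thing I did, not a recommendation):

    # see what USE flags a build would pick up before committing to it
    emerge --pretend --verbose media-video/mplayer

    # pin per-package flags so the next world update does not undo them
    echo "media-video/mplayer -gnome -esd xv dvd" >> /etc/portage/package.use

    emerge media-video/mplayer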

Ahh, updating! I remember that well. Gentoo was a bitch to update! It always appeared to be simple: "emerge --sync; emerge --update --deep --newuse world". However, it took many hours, and the computer was rendered unusable for most of that time. Since it was such a pain, I would go months without updating the system; then I would have to go through hell because the developers had changed a whole lot and I had to do the emerge world dance two or three times! Of course, this was the best-case scenario. If something failed to compile . . . 

Apart from the pain of installing and updating the system, I remember the rest of the time being a breeze. At its peak, around 2004-2005, Gentoo gave me the best user experience with *nix I ever had. Most applications just worked. I had very good things to say about its 32-bit chroot. I could compile lots of programs, and my desktop was still responsive. Multimedia support was second to none! Xine + MPlayer + XMMS could handle anything you threw at them. Gentoo's versions made it easy to add support for MP3, DVD, WMV, etc. I still have not found a better or more versatile pair of media players than Gentoo's Xine (for DVDs) and MPlayer (for everything else). MPlayer could play practically any file you threw at it, no matter how corrupted it was (sure, sometimes there would be no sound, or skipping, or linear viewing only, or random freezes, but it WOULD play). No matter how much experience I have with freetardism, I will never understand why the distros seem to be dumping these two great players for the utterly brain-damaged GStreamer (which is another rant for another time).

The packages seemed to work so well that, now that I think about it, I am not sure if I completely agree with Linux Hater that distro-maintained packages (or ebuilds in Gentoo's case) are NECESSARILY a bad thing. Maybe the package maintainers have more influence over the quality of the application than most people realize.  Maybe whatever Debian/RedHat and their followers do to produce binaries is the really fucked up thing. Of course, this site lists certain, uh, problems people have had with Gentoo, so maybe LHB's point still stands.

However, not all packages worked well. If I ever used ANY masked packages, I was preparing myself for a world of hurt. Since masked packages usually featured a lot of masked dependencies, I would usually have to install a bunch of unstable applications just to run the one I wanted. Often, I managed to install all or most of the dependencies, but the app I wanted or one of its last dependencies would not install or run properly, so I was then stuck with a bunch of unstable libraries with no obvious (to me) way to go back. This was actually a general problem with Gentoo. By emerging only my essentials, I thought I was getting a lean, mean, optimized machine, but as I added new applications, I would have to add a bunch of new dependencies. When I unmerged an app, its dependencies would still be there, which soon made my Gentoo system just as full of cruft as any other distro. Sure, I could "emerge --depclean; revdep-rebuild", but I would first have to update the entire system (which I hated doing because it took forever), and sometimes the orphaned dependencies would break things anyway. Now, I know that some of this mess was my fault for installing unstable but 'shiny' applications. I wonder how many of Linux's annoyances are caused by the lusers themselves, who scream for the latest ub3rc001 but highly unstable application? If most of Linux's usability issues arise from the demands of the users themselves, then FLOSS has a major systemic problem with mass-market adoption.
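The cleanup dance itself, for reference, went something like this (and yes, you really were supposed to update the entire world first):

    emerge --sync
    emerge --update --deep --newuse world     # hours of compiling later...

    # show what would be removed before actually trusting --depclean
    emerge --pretend --depclean
    emerge --depclean

    # rebuild anything whose shared libraries --depclean just yanked out from under it
    revdep-rebuild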

Now that I have reminisced enough, let's get back to my original point regarding binary distribution. Let's take the (admittedly anecdotal) information above and apply it to source-based distribution as a whole. This model will treat all maintainers of upstream projects as a single source-based distribution, and the package maintainers of binary distributions as its users. Now, first off, we can see that the package maintainer's task is a bit harder, since he does not have automatic dependency resolution. Sure, the project documentation usually lists its dependencies, and most heavily used libraries are already packaged in most distributions, but it still does not beat good ol' "emerge app". Now, the packager is in the same boat as the Gentoo user. He knows the specific needs of the distro better than upstream knows them, but upstream knows the software better than the packager knows it. Like Gentoo users and their USE variables, the package maintainer can add patches to the code and configure it with various options to better integrate it into the distro, but sometimes those modifications will break assumptions upstream has made, and all hell will break loose.

Which one should we trust: upstream/source_distro or the packager/source_distro_user? I think we should trust upstream more, since they know the code better and can better avoid doing stupid things. Now, the best solution is when the maintainer and the distro packager are the same people, since they can then develop their app with integration in mind. This is probably why FreeBSD, despite orders of magnitude less funding, always felt more coherent and polished than any Linux distribution.

Of course esr does list some relevant problems with binary distribution:
I actually used to build my own RPMs for distribution; I moved away from that because even within that one package format there's enough variation in where various system directories are placed to be a problem. Possibly LSB will solve this some year, but it hasn't yet.
First, why were you building RPMs? What advantages do RPMs have over plain old TGZs except signature support and various metadata that could be included in the filename? By only building RPMs, you were excluding all the other non-RPM distros for no appreciable gain.

Second, wasn't the Linux Filesystem Hierarchy Standard supposed to solve this by standardizing the system directories? Wasn't it released 15 years ago? If that didn't work, then the Linux Standard Base should definitely have fixed it, but it seems to have failed. If, after 15 years, you still cannot determine the location of a mail spooler or logfile, then OSS has a MAJOR problem!

Third, what exactly did you need in those system directories anyway? In your book, The Art of Unix Programming, you wrote about this:
Often, you can avoid this sort of dependency by stepping back and reframing the problem. Why are you opening a file in the mail spool directory, anyway? If you're writing to it, wouldn't it be better to simply invoke the local mail transport agent to do it for you so the file-locking gets done right? If you're reading from it, might it be better to query it through a POP or IMAP server?
If you were looking for applications, couldn't you just use /usr/bin/env? I am sure that is present on any Linux distro worth mentioning. If you were looking for libraries, then maybe you should statically compile your program.

I know there are some downsides to distributing statically compiled binaries. They take up more RAM and hard drive space, but RAM and hard drive space are both really cheap nowadays. Even low-end notebooks feature 2-3 GB of RAM and 300-500 GB hard drives; the typical user has RAM, swap, and disk space to burn. The other problem with statically compiled binaries is that updating a library with a serious bug or security hole becomes a bigger hassle. However, the major proprietary software applications for Linux also have this problem, and they seem to have done okay. If the software developer is halfway competent, he will be tracking the development lists of all the libraries his app depends on, and he can then issue an update as soon as a patch for the affected library is released. Plus, open source has the advantage that, if the developer is being lazy or whatever, anyone who cares can (theoretically) download the source code for the app and its dependencies and produce a fixed binary. However, all of these downsides melt away next to the feeling of navigating to a project's home page, downloading the Linux binary, installing it, and running it just like on Windows and OS X!!!
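Producing such a binary is not exactly rocket science either. A minimal sketch (the program name and libfoo are made up; in practice you would usually leave glibc dynamic and only link your fragile libraries statically):

    # fully static: no runtime library dependencies at all
    gcc -static -o myapp main.c

    # confirm it: a static binary has nothing for ldd to resolve
    ldd myapp    # prints "not a dynamic executable"

    # the middle ground: statically link the fragile library, keep the rest dynamic
    gcc -o myapp main.c -Wl,-Bstatic -lfoo -Wl,-Bdynamic -lpthread -lm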

In short, binary distribution has many advantages over source-based distribution, and Linux crusaders would do well not to dismiss them.

Sunday, November 9, 2008

Rants and Laughs 6

Well, it is time to again see what is happening in the Linux 'community' and make fun of them for it.

  • Here is a GTK theming tutorial. It looks somewhat complicated and demonstrates the beauty of GTK at the same time! What more could one want?
  • Why should you try Fluxbox? Because Linux's major attempts at a desktop environment are slow, complicated and generally suck ass. Here's a better idea: try Aqua instead.
  • Some dude configured a Linux print queue for a library that was too cheap to buy a Windows server. All he had to do was edit smb.conf (among other things). Also, apparently the queue cannot display page numbers properly, so unnecessarily gigantic printouts could still go through.
  • ComputerWorld, apparently, cannot quit Microsoft bashing. It seems to be a bit better than the last one, but it still focuses on netbooks.
  • Head marketdroid luser of the Linux Foundation thinks no 'in house from scratch' operating system will ever be created again. She cites that ridiculous study that Red Hat Linux is worth $10 Billion. Leaving aside the ridiculous notion of measuring SLOC, the most basic problem is this: if the OS is worth so much, and it is free, why does it have such a shitty marketshare? Earth to Linux Foundation, the amount of time some wanker wasted coding an app does not give it value; value only comes from a bunch of other people WANTING the app. This is how the market works! If you did not have your head jammed so far up your commie, freetard ass, you would understand that.
  • Ubuntu 8 and OSX 10.5 go head to head. OS X positively crushes Linux on 3D acceleration. Damn, I have not seen an ass-whuppin' like that in a long time! Of course they give the standard luser excuses, such as Mesa not being optimized or the Intel driver going through some 'radical changes.' They will do anything to keep from admitting a Linux flaw.
  • The Register does a serious review of OpenOffice.org 3.0. Apparently, you should keep your Microsoft Office install.
  • Finally, another hater discusses the fallacy of choice! This is required reading for all lusers!
  • UPDATE: I forgot a really good one. Apparently, the Android G1 phones had a phantom shell. Typing anything into your phone followed by return would execute as a shell command. If you type r-e-b-o-o-t, the phone will reboot. I am speechless!

Standardizing Linux Suckiness 3.0

Well, lusers keep crying that "LSB 4.0 will be the ultimate enabler of Linux on the desktop and seal Microsoft's fate!" Well, looking at the project plan, it appears to be just as useful as all the other releases (i.e. completely fucking worthless). It mostly seems to contain a list of library updates. I was told to read the LSB 4.0 specification, but I cannot find any such thing. The closest thing I have found is this Project Plan.

OK, I was told that it included a way to make packages that would render the distributions irrelevant (something about a dynamic linker, I think). I do not see any such thing. The closest thing I could find was Best Effort Dynamic Linking. Here is the description of the 'Vision' of this wonderful technology.

Right now, the LSB requires a different dynamic linker than the rest of the system. This linker is often not provided at all on non-LSB systems, and cannot be guaranteed to be available even on distros that can be LSB-compliant (if the LSB environment is not installed).

WTF? Why does it even require a separate dynamic linker in the first place?

This is a serious obstacle to acceptance of the LSB by ISVs; no one wants to have to make sure the proper dynamic linker is installed. The tools we have provided to try and mitigate this problem have not been good enough.

To encourage ISV adoption, therefore, we need to implement the dynamic linker change a different way.

Theodore Ts'o has proposed an alternative mechanism for supporting the dynamic linker. In this model, the ELF dynamic linker is the same as for all other binaries on the system, but the LSB SDK embeds some code into the executable--either via crti.o or via an init function called early--which checks if the executable needs to be run with the LSB dynamic linker instead, and re-execs the binary if necessary. This provides a "best effort" system for running LSB applications, which can be ensured to run correctly on all Linux systems regardless of the status of LSB support on the specific machine.

Umm, Ted, how does your system cope with a missing shared object file? If a distribution does not include the required .so, does your system download it from the internet? How does this help with LSB noncompliance? This certainly does not look like it will make the distributions irrelevant.
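For the curious, the separate linker is not hidden magic; an LSB binary simply names a different ELF interpreter, which you can check yourself (the binary name is a placeholder, and the exact LSB linker path is from the spec as I remember it, so treat it as approximate):

    # a normal Linux binary
    readelf -l /bin/ls | grep interpreter
    #   [Requesting program interpreter: /lib/ld-linux.so.2]   (or the lib64 variant on 64-bit)

    # an LSB-built binary asks for the LSB linker instead
    readelf -l ./lsb-app | grep interpreter
    #   [Requesting program interpreter: /lib/ld-lsb.so.3]
    # ...and if that linker is not installed, the kernel simply refuses to run the binary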

Well, what other exciting, cool things will LSB 4.0 add to the Linux Desktop? Hmmm... Not much. It looks like a bunch of library updates. However, the Desktop Module does not cover all the things needed for a good desktop, so let's look at the Multimedia Module. For the lazy hater, here are the features it lists.
  • ALSA (moving from TrialUse in 3.2)
  • GStreamer
  • PulseAudio/SydneyAudio
Wow! It does feature PulseAudio, that performance-sucking wheel-reinventing beast! However, maybe it does not; the module just says 'PulseAudio/SydneyAudio.' Thank you for making that crystal-clear, LSB! The thread linked to from PulseAudio's status features a great quote by the way, "Applications having to worry about 4 different audio interfaces when they could just be worrying about 3 is just wasting resources."

Well, call me a cynic, but I think this new standard is the same old same-old. It does not seem to offer anything compelling enough to make me think "this is truly the year of the Linux desktop!" If the LSB were serious about standardizing Linux, there is one thing they would need to do: take the LSB Sample Implementation and make a complete desktop out of it. Then, compel the major Linux distributors to build their distros around the LSB system. All the distros could still add their proprietary touches and package managers, but there would be uniformity at the heart. Sure, the distros might be a little wary at first, but they should soon see the light. Even if the distros had to give up some 'competitive edge' against other distros, the extra revenue gained from a deluge of new customers who heard that Linux was finally not broken into several hundred pieces should more than make up the difference. Also, the distros would incur lower in-house maintenance costs. That alone could make Canonical profitable. This is the same lesson that has been learned throughout history: if you can put aside your petty squabbles and unite, the potential losses to competition will be dwarfed by the gains you make. Too bad lusers don't seem to care about history.

Thursday, November 6, 2008

Programming - Hold Still, This Won't Hurt a Bit

Well, Linux/Unix is not meant for the joe-average desktop user. It was meant to be a hacker's toy, and nothing more, so it must be really awesome as a development environment! Well, it has some issues. Linux development may have improved since the Unix Haters Handbook was published, but it still has a long way to go. In this next user-submitted rant, we will see what the issues are.

I have to say that I love a good IDE (Integrated Development Environment). A good IDE should have a logical, easily-navigated UI to create applications. The UI should allow the enduser to easily enter all of the data needed to compile/link the app, and the IDE _should_ do whatever needs to be done to successfully invoke the compiler/linker to create the app.

An example of a good IDE that does exactly that is Microsoft's Visual Studio. It uses fully complete wizards to guide the process, and its underlying tools are well integrated with the IDE. It doesn't use extremely fragile, intermediary text files full of various macro languages to achieve the creation of a "project". When you finish the wizard, Visual Studio directly creates the needed makefile for you. It's easy and painless to develop Win32 apps using MS tools.

An example of bad IDEs that don't do this very well are Linux IDEs. The IDE "wizard" takes you only half-way gathering all the info it needs, then it runs some very poorly integrated (if one could even describe a unidirectional, textual pipe as "integration") GNU tools which, too independently of the IDE, try to analyze your system and come up with the remaining necessary info to create the makefile.

It's painful and difficult to develop apps with a Linux IDE.

A big part of the problem is that the IDE uses a lot of the GNU tools (autoconf, automake, etc.), which strive to create "universal" dev files that take into consideration every possible configuration quirk of the thousands of Linux distributions made by Tom, Dick, Harry, and their dogs Spike, Fido, and Spot. But this is a nightmare for a newbie who wants to just compile, link, and run his C/C++ program on his single installation of Ubuntu. He doesn't need to take into consideration some quirk of "Martian Christian Linux", yet another pointlessly trivial variation of Linux which, for god knows what reason, is supposedly better for Martians who also happen to be Christians. The GNU tools spew out a plethora of dev files that are humanly unreadable and uneditable by all but the extraterrestrial beings who devised the incomprehensible, needlessly convoluted alien "language" that these tools use. This is bad news if you need to hand-modify one of these files because the IDE didn't quite get something right (and believe me, it's a good bet that the IDE's "output window" will demonstrate blissful ignorance as it chuffs out autoconf error messages telling you to hand edit your configure file to add this or that). The IDE isn't smart enough to parse the error output of the GNU tools and follow the instructions given to you, even though it's supposed to be the job of a good IDE to maintain and "edit" the underlying compiler/linker support files instead of telling you to do so. Otherwise, what's the point of an IDE?

With Visual Studio, I never had to actually hand-edit any makefile, nor create _any_ other convoluted text file containing any kind of "macro/shell language" for the purpose of creating a make file. I never had to, because the IDE was capable of completely managing the makefile via data that I entered exclusively through the UI. I don't need to remember what the "text flags" are to enable various compilation features, because the IDE's UI is actually useful (as opposed to just a thin wrapper over some hideously designed, command line tools that use their own "logic" to determine what "text flags" ultimately get passed to the compiler/linker).

If you go to the GNU website, you'll find tons of "manuals" that explain to you in excruciating detail what each tool is for, and how it works. Here's a URL to get you started on your quest through dozens and dozens of barely comprehensible (and almost pointless) pages of instructions:

http://www.gnu.org/manual/manual.html

Now I suppose someone can write even more detailed tutorials than these pages, and take you along the way at a slower pace (although it would probably take a person a few years of work to come up with such a volume of text), but here's the bottom line:

These are very complicated, convoluted, and frankly, unintuitive tools to use. You can read all these manuals until your eyes fall out, and your brain can maybe retain enough of the voluminous details of each tool such that you'll actually be able to do something useful with them, maybe even on a daily basis (but with a stiff penalty against productivity as you spend too much time plowing through the voluminous GNU docs just to figure out how to enable/disable even one compilation feature). But they will never be easy to use. They will never be as easy as .NET. Never. They will never be half as easy as .NET. They will never be a quarter as easy as .NET. These tools are designed to take into account so many variables/installations/quirks in the entire Linux universe that they can never be anything but tremendously complicated and convoluted and unintuitive. Think about it. Someone somewhere at this very moment is taking a Linux distribution, and for god knows what reason, changing one damned thing about it supposedly to make it better, and releasing yet another distribution. And now the GNU tools have yet one more quirk to take into consideration. Let's see, shall we accommodate it by making yet another "implicit rule", or maybe a "macro", or how about an "environment variable"? How about all 3 of them? Hey, it's free software, so it's not like you have to pay 3x as much for 3 different ways of doing the exact same thing, plus a doc file that grows 3x as big in order to explain the 3 different ways to do the exact same thing. And wait, this is getting really hard to use what with all this stuff heaped on top of it. So let's make another utility that spits out a bash script that makes the data file needed by the first utility. And of course, let's not make this second utility UI based, with graphical controls the programmer can click on to select his features, and enter filenames, and such. No, let's make this second utility text-based just like the first one, so that you need to learn yet another "macro/shell/syntax language" and have even more data files to keep track of. And of course, as more and more deviations happen in the Linux world, we'll add stuff to this second utility too.

Finally, someone gets the idea to make things "easier" than those tools. What does he do? He makes another text-based utility of course, with yet another macro/shell/syntax language that tries to cover the exact same ground as the first set of tools. Yep, this one is going to be all things to all people too, but somehow it's going to be better. Now we have two tool sets that essentially do the same thing, and even in essentially the same way, but they're different for the sake of being different. But wait. Now we need to transfer data between the two of them, so we need to make a utility to do that. And no, let's not make just one such utility. Let's divide our resources making dozens of them, of course all of them command-line tools (do Linux compiler writers even know what a GUI is, and how event-driven programming works?), each one doing essentially the same job, but in an arbitrarily different way that will somehow mean that support personnel will have to deal with yet more quirks/inconsistencies/variations on the exact same theme.

Madness. Sheer madness.

You've got autoscan, which generates a text-based configure.scan file. Then, you've got automake, which takes a text-based makefile.am file and generates one or more text-based makefile.in files. Then, you've got autoconf, which takes a text-based configure.ac file and generates a very convoluted, text-based configure script. Then, the configure script chews through the makefile.in files and generates a whole boatload of other text-based files. And each text file is filled with lines written in its very own, exclusive Martian dialect, with no set order in which information appears.

You're an intelligent being capable of easily reading text and applying deductive reasoning to it. A computer can crunch numbers fast, but it's not so good at deductive reasoning and making complex choices by analyzing freeform text. If you can barely make sense of the above tools and their output, what luck does an IDE have?

The problem isn't necessarily that an IDE is riding atop of text-based tools. Microsoft's Visual Studio does that too. The problem is that an IDE is riding atop of a whole mess of text-based tools that churn out way too much information, in a format that is way too free-flowing, poorly structured, and involves way too many "script/macro languages" that are totally unrelated to the language your app is developed in, all for the sake of trying to be all things to all people from the ancient past (in computer terms), to now, to the day that the last Linux guy ever forks another package and therefore causes another inconsistency to accommodate.

These dev tools are so complicated that even the IDE that tries to make them easier, is itself convoluted, complex, unpredictable, unstable, and ultimately unintuitive.

For god's sake, will some competent programmer please fork gcc, and fix this thing so that it actually is capable of _integrating_ with an IDE? Fix it so that it doesn't require the IDE to use "auto tools" that spew heaps of incomprehensible script files written in a variety of Martian and Venusian languages. Fix it so that when the IDE and these auto tools invariably "break" your project, you don't have to waste hours trying to hand-edit all those Martian-language data files, to get them to work again, until you finally throw up your hands in frustration and nuke the entire directory of unusable poot and restart from scratch (with Visual Studio and Microsoft's compiler instead).
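For anyone who has been spared this particular ceremony, here is roughly what it looks like for even a trivial project (a sketch; the file names are the standard autotools ones, the project itself is imaginary):

    # you write these two by hand (in two different macro languages, naturally)
    #   configure.ac  - m4 macros describing what to probe for
    #   Makefile.am   - automake's idea of a makefile

    autoscan                  # scans your sources, emits configure.scan as a starting point
    aclocal                   # collects the m4 macros that configure.ac needs
    autoconf                  # configure.ac -> configure (a several-thousand-line shell script)
    automake --add-missing    # Makefile.am  -> Makefile.in, plus assorted helper scripts

    ./configure               # Makefile.in  -> Makefile, config.h, config.log, ...
    make                      # and only now does the compiler actually run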

Wednesday, November 5, 2008

It's the Applications, Stupid!

Here is my first user-submitted rant. This rant actually appeared as a comment on ESR's blog long ago, but I just got the permission to post it. Here it is:

It doesn’t matter. None of this matters. Platforms don’t matter. It’s what’s on the platforms that matter.

This is a lesson which I forgot when I went on a gaming sabbatical (coinciding with my exploration of Linux and OSS), but since I got back into the ol’ pastime it leapt back into my brain with the force of an epiphany. In fact, this is something which any gamer knows, though perhaps just implicitly. And any Sega fan, such as myself, has had their face rubbed in the fact to an extent which is painful. The story goes something like this:

A Prelude to War

When the Sega Genesis first came out in 1988, it faced quite an uphill battle against the entrenched NES, which had managed to become practically synonymous with the word “videogame” after its 1985 release. Indeed, to this day, many people say “nintendo” when they mean “videogame,” just as many people say “xerox” when they mean “photocopy.” The Genesis was certainly more powerful — its primary processor was 7 times as powerful as the NES’ — but power does not conjure good games out of thin air. And without good games, a console is no more than a paperweight. Sega’s previous console, the Master System, was 3 times as powerful as the NES, but since its 1986 release, it only sold 13 million units to Nintendo’s 60 million, simply because it didn’t offer a compelling library of games. Sega learned from this mistake, if only once.

When they launched the Genesis, courtship of third parties was intense. They were willing to offer developers better licensing terms than Nintendo, who was enjoying monopoly status at the time, and managed to do what, at the time, was the unthinkable: they fought Nintendo, and in the US market at least, they won. In large part, this was due to Sega taking a chance on an unknown startup that was desperate for a platform for their football game. Nintendo simply wouldn’t offer the little corporation terms it could survive on, and besides, the NES was ill suited to doing sports games justice. That little company was EA, and the game was Madden. Both became smashing successes.

With the help of this and other games, including some in-house smash titles such as the Sonic the Hedgehog franchise, Sega exploded onto the scene to history-altering effect. To put it into perspective, the success Sega experienced would be like Apple gaining 50% marketshare upon the release of OSX. Even more mindblowing, this growth was coming at the *expense* of Nintendo’s installed base. By this I mean that old Nintendo users were abandoning the NES platform and buying Sega systems in droves. Though Sega’s hyper-clever marketing probably didn’t hurt (slogans such as “Sega does what Nintendon’t” still make the ears of any elder gamer perk up), it was the plethora of games that were only playable on the Genesis which produced this success.

It’s On like Donkey Kong

After three years of hemorrhaging market share, Nintendo fought back with the technically superior (save for processing speed) SNES in 1991. And while the SNES did absolutely everything correctly, and has rightfully earned its place of high regard in the annals of gaming, it completely and utterly failed to unseat the Genesis. In Japan its marketshare ended up exceeding Sega’s, but in the US it lagged, and Sega enjoyed reigning champion status in other parts of the world.

This was the dawn of the “console wars” as we know them today, and the 16-bit era is still regarded by some (likely through nostalgia tinted glasses, but hey, we’re only human) as the halcyon era of gaming. For every top-notch exclusive game that the SNES had, the Genesis had one as well. And so long as the game libraries of both platforms looked equally compelling in the eyes of the consumer, the entities were mostly locked in a dead heat. But time always marches on.

A Taste of Things to Come

It had been half a decade since a new system was released, and consumers were ready for the next generation. The arcades were taking business from the console market, offering an innovative and immersive gaming experience that the now underpowered 16-bit consoles couldn’t match. (Incidentally, Sega has been and still is a leader in the Arcade market.) The time was ripe for Something New — sadly, both Sega and Nintendo seemed to have forgotten the lessons they had learned from their battles with each other, a mistake which ultimately proved fatal to the former.

It all started in 1988, the year of the Genesis’ release. At that time, games were provided on a solid state medium known as a cartridge, which offered fast access as a benefit, but provided very limited capacity, and cost quite a bit to manufacture. Nintendo had been looking at a way to address these shortcomings by moving to a cheap, high-capacity disk-based medium. However, Nintendo was not able to satisfactorily surmount the stability problem of magnetic media, nor the concomitant ease of piracy. But Sony had just the ticket, since they were working on a then-revolutionary technology which would allow them to store data on CDs, which were currently restricted to just audio.

So it was that Nintendo contracted Sony to develop a CD based add-on system for them. And in 1991, they were expected to announce the new designs at the yearly CES expo — but when Nintendo president Yamauchi discovered that the contract with Sony would give the latter 25% of all profits off the system, he broke arrangements with them in a fury. Instead, Nintendo contracted with Philips to perform the same task, but with a contract that gave Nintendo full control of the system. It was this partnership that was announced at CES, much to Sony’s chagrin.

Ultimately, the Philips peripheral never materialized. But Sony refused to throw out their work. They spent years retooling the foundation into a 32bit console called the Playstation, and, determined to swallow Nintendo’s marketshare whole (hell hath no fury like a multi-billion dollar Japanese corporation spurned), they aggressively pursued third party developers, and launched an ad campaign that was arguably more Sega than Sega in its edginess.

But I’m getting ahead of myself.

No Cigar, Not Even Close

Back in 1991, Sega was releasing its own CD-based add-on to the Genesis, aptly named the Sega CD. It was quite the technological breakthrough, but it didn’t come cheap. And as has been established previously, a platform is only as good as the games on it: in the case of the Sega CD, this amounted to a big pile of suck. They even managed to create a Sonic game for the console that was, in effect if not intent, a turd with peanuts. Only 17% of Genesis owners ever bought a Sega CD — not a one of them doesn’t regret it.

Then, in 1994, Sega blundered again with the release of the 32x — a $170 add-on which would turn the Genesis into a fully fledged 32-bit system. With the 32bit era imminent, the idea of gaining access to the future on the (relative) cheap was immensely appealing to many gamers. The console was pre-ordered on a scale of millions, but Sega completely dropped the ball. In a dash to make it to the holiday season, games developed for the platform were rushed, and many of them curtailed (the version of Doom found on the 32x had half of the levels of its PC version). The system was one of the biggest letdowns in gaming history (next to the completely unremarkable Nintendo Virtual Boy — a portable gaming system which failed to be either portable or provide entertaining games). This was the beginning of what would become an insurmountably bad rep for Sega hardware.

Don’t Tell me You’re Pissed, Man

In 1995, Sega then released its true 32bit console, the Saturn. They released it a few months ahead of Sony’s Playstation, and actually enjoyed an upper hand in the marketplace at first. Sony did not fight against Sega the way they did against Nintendo, having no vendetta to settle. But unfortunately, Sega begat its own undoing. For the release of the Saturn, with its quality games and good 3rd party support, was seen as a sign of abandonment of the 32x — largely because it was, in fact, an abandonment of the 32x. Almost over night, legions of Sega fans became distrustful of the company.

Completely unwittingly, Sony managed to swallow up Sega’s marketshare simply by not being Sega — and, therefore, appearing less likely to screw the gamer. The Playstation pulled far ahead of the Saturn, and Sega never made any real effort to combat this very real threat to their dominance — the hubristic assumption was that Sony was not a gaming company, and therefore couldn’t win. However, the larger market share made the Playstation (or PSX) more appealing to third party developers. And although the Saturn was a little bit more powerful, the Playstation was vastly easier to develop for.

The result was that third party support for the PSX outstripped that of the Saturn by an order of magnitude. A lack of quality games results in a dead system, and in practice, a lack of third party developers is the same thing. The death blow for the Saturn came when EA, a monolith in the world of gaming which owed its existence to Sega (and vice versa), jumped ship and declared the PSX as its primary platform. Quite ironically, the Saturn was now doomed. And although Sega’s next console, the Dreamcast, was perfection in nearly every sense of the word, and the first console to provide online gaming, Sega never effectively garnered the third party support necessary to survive. In March of 2001, Sega exited the console market.

I See you Baby

Flashback to 1996, and Nintendo is bypassing the 32bit generation entirely to release its N64, technically superior to anything of its time (although some people were and are turned off by its distinctively aggressive hardware anti-aliasing). Coming out behind the PSX, and still being cartridge based, it couldn’t quite capture third party support the way the PSX did, but it managed to snag a marketshare equivalent to 1/3 that of Sony’s.

While Sony failed to slay Nintendo, the combined blows dealt to it by Sega and Sony demolished its monopoly position. There’s a lesson here that anti-capitalists could learn about the nature of free markets, if they happened to actually be interested in the truth — but that is neither here nor there.

What kept Nintendo alive was its stable of quality in-house games. Super Mario 64 is still regarded by many as the best 3D platforming game of all time, and Goldeneye stands unrivaled as the most playable and enjoyable adaptation of a movie ever. By contrast, Sega never had a proper Sonic game for the Saturn (apart from the lame isometric platformer Sonic 3D Blast, and the sucky racer Sonic R). Once again, the lesson is that quality games are the secret to a gaming platform’s success.

And so it is with the modern era. The Playstation 2 (PS2), Sony’s successor to the immensely successful PSX, rode the coattails of its predecessor to its currently unrivaled installed base of more than 100 million systems, giving it around 60% market share. The remaining 40% is split between Microsoft’s XBOX console (surviving because of exclusive titles such as the Halo franchise) and Nintendo’s Gamecube (once again surviving off of excellent in-house games, although now at the bottom of the totem pole in terms of market share).

So has it always been. And so shall it always be.

They’re Like Mopeds…

A lot of you have probably read this paper, called Worse is Better:

http://www.jwz.org/doc/worse-is-better.html

(If you haven’t, consider doing so.) Equally likely, you’re seeing a connection. Indeed, it would seem the ramifications of Worse is Better are incredibly far reaching, although I think the more general and correct statement is the following:

Technical merits are usually a lot less important than you might think.

Or, as I’ve said previously, a platform is only as good as what’s on it. A console is only as good as its games, just as a data medium is only as good as its ubiquity, just as an operating system is only as good as its applications. Empirically speaking, the technical merits of a platform seem to be a marginal factor (at best) in determining how it gets to a position of application dominance.

What this means is that when debating the merits and demerits of OSS vis-a-vis closed source in terms of potential for success, where success is defined as market share, it is generally pointless to bring up technical points. Windows is not popular because of Windows, it is popular because of everything that runs on Windows. Contrary to the original article’s opinion, Microsoft is absolutely correct to maintain backwards compatibility, because the totality of what runs on Windows is the “secret” to its success. Apple’s policy may be technically superior, but it hasn’t helped it get anywhere near posing a challenge to MS.

So Linux and Apple have faster releases than Microsoft? Big whompin’ deal. The debate over which system is better, or progressing more rapidly, simply does not matter. What matters is what people can do with the system, and for the desktop things most people want to do, Windows crushes all. In fact, if you look at OSS itself as a platform, then it’s an objective failure in the desktop market if the goal is replacing proprietary software. How good OSS is at producing quality software matters a lot less than how good it is at attracting software producers, and in that regard, it would seem to suck. There is a large range of computer oriented tasks that you simply *cannot* perform on Linux. And until OSS produces a game better than BZflag, it should be a self-evident fact that not only is it not a silver bullet, it might barely be an arrow.

I Don’t Have the Answer, but I Know who Doesn’t

I use Windows, Linux, and Mac on a regular basis — I like Linux the system the most, followed by Windows, followed by the Mac (sorry, but I think the GUI is a weapon of mass gayness). But I actually spend most of my time in Windows simply because of the things I can do in it that I can’t do with the alternatives, or that I can’t do as cheaply, or that I can’t do as well, or some combination of all three. Microsoft has done an extremely good job of attracting the people who actually make a system worth using to their platform, and as a result, it fits practically every user’s needs. Hence its market share.

Of course, things change when you go to the backend, and sure, that’s partly because the requirements are different. But regardless, people don’t just put Linux on the web — they put Apache on the web. Or vsftpd. Or whatever. The fact that Linux has these highly sought things is what really makes it a success. The fact that these things offer the most generally popular price/performance ratio is why they are highly sought. The fact that OSS seems to be good at attracting developers of such things is why they are OSS. But it *doesn’t* mean that, even if OSS is an inherently technically superior development model (and in the future I’ll make the case that that’s bullshit), it is destined to dominance. Reality is much, much, much more complicated than that.

Postscript

On an unrelated note, the GNU people can suck my cock. I don’t even want to think about the time I wasted drinking your koolaid. I hope Emacs becomes a sentient entity and bites every single one of you on your GNU/scrotum. And fuck VI too.

Tuesday, November 4, 2008

Splash Goes the Turd

Alright, one of you lusers sent in an article about Splashtop, so let's discuss that for a bit. What can I say? It is a great solution to the wrong problem. If you are using a memory-constrained environment, like an embedded system, then it might have some uses, but as an Instant-On technique, it will suck.

You see, my two-year-old MacBook already has an instant-on technique. When I open the lid, I see my desktop in two seconds or less. That probably beats Splashtop. Also, I then have access to all my applications rather than a restricted subset. What is this amazing technique, you may wonder? It is simple: ACPI suspend-to-RAM! You see, when you have a frequently used computer with a stable operating system and working ACPI support, you can go a long time without rebooting; basically, the only limits are hardware failures and the need to install security updates. In that case, boot time does not matter. Sure, sleeping drains a little power, but if you use the machine more than once a week for any length of time, you will have to carry a power cord around anyway. Ultimately, working ACPI suspend would be a much more viable 'instant-on' technique than Splashtop.
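For the freetards who want to reproduce this trick on Linux hardware, here is a minimal sketch of poking the kernel's suspend interface by hand, assuming a 2008-ish distro that exposes /sys/power/state and ships pm-utils (both are common, neither is guaranteed, and whether the machine actually wakes up again is between you and your ACPI tables).

$ # See which sleep states this kernel claims to support ("mem" is suspend-to-RAM)
$ cat /sys/power/state
$ # Suspend to RAM directly through sysfs...
$ echo mem | sudo tee /sys/power/state
$ # ...or let pm-utils run the distro's quirk scripts first
$ sudo pm-suspend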

What about viruses? Alright, what about them? Sure, if you use Splashtop, it might be harder for malware to maliciously modify your computer, but since updating that embedded environment is much harder than patching a normal OS, any remote or privilege-escalation vulnerability discovered in its software will remain exploitable for a long time. There is a tradeoff.

Also, how many people actually use these media-BIOS things anyway? My mother's Dell laptop comes with 'Dell Media Direct', and it has been nothing but a nuisance to her; she sometimes boots into it by accident and then has to fiddle with it for a few minutes just to get back to Vista. If Splashtop is used in this fashion, then millions will hate Linux just as much as we do.

Ahhh, the smell of victory!

Monday, November 3, 2008

Rants and Laughs 5

Okay, the Unix Haters Handbook review is coming along at about the speed of the X11 DRI fix, so I will take some time and see what is on Linux Reddit.

  • Linux Just Works with one fucking printer. Wow! Linux has finally beaten OS X! Mac bigots of the world, you better switch right away! There is a new Just Working OS in town! Remember, kids, Apple is a systems company, which means OS X has a smaller range of supported hardware than Windows or Linux. It also means that the supported items often work much better than they do on other systems. Why don't you try one of these printers and get back to me on Just Working.
  1. Aigo Mobile Internet Device: Wow, a PDA! Who uses those anymore?
  2. Nokia N810 Internet Tablet: Again, who cares about PDAs anymore? Ever heard of a smartphone?
  3. Asus Eee Box: It also runs Windows XP. Linux power!
  4. Asus Eee PC: It also runs XP and has Linux problems.
  5. OpenMoko FreeRunner: *snort* with Enlightenment technology!
  6. Motorola Ming A1600: Okay, this looks kind of cool. Is it one of those Android phones, though?
  7. Archos 605 Wifi: an excellent choice for an Apple-averse media hoarder. How good of a choice is it for people who do not care?
  8. Mvix MX-760HD Media Center: a crappy Apple TV ripoff
  9. Sonos Digital Music System: Now, you can stream your music wirelessly all around your home for only $1000 DOLLARS! Wow!
  10. Garmin Nuvi880: Okay, I have heard it is a good GPS.
So, out of this list, I see only TWO items that look kind of cool and do not also run Windows better! This is the best you freetards can do? I hang my head.
  • I saved the best for last. Here is an article calling on the community to restrain itself from criticizing developers when they screw up. It looks like I am going to have to focus on this one.
Well, it starts off in the expected way.
You can see this growing viciousness in the hostile reaction to KDE last spring, or in sites like the just-defunct Linux Hater's Blog, as well as the articles of professional and semi-professional journalists who demonize anyone who fails to agree with them completely.
What, you mean like you just did?

Aaron Seigo of KDE described the problem the other month in his blog:

Every so often someone with a real crank on will start following me around the intrawebs posting their hallowed viewpoint on me. It seems to happen to everyone with an even moderately public profile. Usually they get stuck on one message and then post it consistently everywhere they can as some sort of therapeutic outpouring of their inner angst. Most people don't last more than a couple weeks at this, though I've had a couple of people with real commitment dog me for a year or more.
Seigo admits that, being visible, vocal, and outspoken, he makes an easy target. It's not that he objects to views he doesn't agree with, he says, but that "I don't have time for pointlessness."
So Seigo is mad because a bunch of KDE users complained about his broken release and his complete disregard for the needs of his users, and he thinks that such criticism is pointless. Wow, that is rich! I wish I was that rich!

Such attacks are abusing the freewheeling freedom of expression that is the norm in FOSS. By refusing to temper this freedom with responsibility, those who make them are seriously handicapping the community that they claim to represent.

How about developers first start to take responsibility for their actions and not regularly screw over their users! Then we will talk about civility!
But why such attacks are becoming so prevalent in FOSS is harder to explain. Perhaps their origins are part of the worldwide fallout from the unusually heated and prolonged American presidential campaign, in which attack ads and ad hominem attacks have become the norm.
What . . . . the . . . . fuck!!!!
Or perhaps relative newcomers to FOSS are taking out their frustrations with unresponsive proprietary companies on prominent members of the community. Unlike company executives, FOSS developers and maintainers are accessible, so they get the suppressed anger that should be aimed at the executives.
What
the
fuck!!!!
Even more likely, as one of the earliest and most Web-integrated communities in existence, FOSS has become a center of such attacks because of the strange combination of intimacy and distance that is peculiar to the Internet
Okay, that does sound remotely plausible (in comparison). The anonymity of the Internet has been known to increase vitriol, but so has releasing shitty software!
At times, too, the uneasy alliance between free software and open source advocates erupts into verbal battles.
Maybe the problem is that the 'community' is composed of autistic fosstards? In that case, any civility is more than can be expected.
Perhaps newcomers are simply adopting the rhetoric they believe will make them fit in.
Yeah, or maybe they are reacting to getting screwed over by egomaniacal developers after having invested days of their time learning Linux?
Since then other projects, such as KDE, have borrowed heavily from the codes to produce their own versions. A community-based code would need few modifications to be just as effective.
How about a code that says "DO NOT FUCK OVER YOUR USERS!!!!"?

Link

Saturday, November 1, 2008

More Oldies

I am currently doing my own review of the Unix Haters Handbook, and it is probably going to be long, so posting might be light this weekend. Anyway, here are a few good internet gems.



Friday, October 31, 2008

Rants and Laughs 4

Once again, I present the daily Rants & Laughs section. 


Finally, here is my first user submission. Thanks, thepld!

http://linux.slashdot.org/article.pl?sid=08/10/31/160242

"Ubuntu 8.10 Outperforms Windows Vista "

Nevermind that the piece of shit can't do half the things that Vista
can...but hey, it boots 3 seconds faster! And all that performance must
be great for all that awesome Linux gaming? I bet you can get twice the
framerate in all five versions of Tux Racer than you did before!

Open S{ource,hakedown}

Here is a fascinating article that I meant to post when I started this site, but it slipped my mind. Read it and notice the similarities between open source crusaders and, uh, less savory characters. It talks about the steps that you should take in order to easily comply with the GPL without bringing down the fosstard wrath upon your company. For those of you who are too lazy to read the article, here is the meat.

... but getting sued is not the real problem. The real problem is when a posting about misappropriation of GPL software shows up on Slashdot and LWN. The real problem is when every public-facing phone number and email address for your company becomes swamped by legions of Linux fans demanding to know when you will provide the source code. The real problem persists for years after the event, when Google searches for the name of your products turn up links about GPL violations coupled with ill-informed but damaging rants.

So we want to avoid that outcome. If you read the legal complaints filed by the Software Freedom Law Center, they follow a similar pattern:

  1. Someone discovers a product which incorporates GPL code such as busybox, but cannot find the source code on the company web site (probably because the company hasn't posted it).
  2. This person sends a request for the source code to an address they find on that website, possibly support@mycompany.com.
  3. This request is completely ignored or receives an unsatisfactory response.
  4. The person contacts SFLC, who sends a letter to the legal department of the infringing company demanding compliance with the license and that steps be taken to ensure no future infringements take place.
  5. SFLC also demands compensation for their legal expenses; thats how they fund their operation.
  6. The corporate legal team, misreading the complaint as a shakedown attempt, stonewalls the whole thing or offers some steps but refuses to pay legal costs.
  7. Lawsuit is filed, and the PR nightmare begins in earnest.

Now, IANAL, but I cannot imagine why a company would interpret this earnest plea for source code and money as a shakedown attempt! Wait, maybe I can guess. Is it because it sounds EXACTLY THE SAME AS A SHAKEDOWN ATTEMPT!!! Now, I have no experience in the mau-mauing . . . er, legal business, but if your ultimate goal is to promote FLOSS, then maybe you need a better business model. How expensive can it be to draft a legal form letter or two and mail them off to companies' legal departments? SFLC, if you need money, tell rms to stop being a cheapskate and provide you with it. You could also ask major open source companies (IBM, Red Hat, etc.) for material aid. Demanding $Megabucks from companies to pay for your printing costs is not doing the community's image any favors. However, the best quote is at the end.
In practice the advertising clause (LHR NOTE: He is talking about 4-clause BSD) results in a long appendix in the product documentation listing all of the various contributors. Honestly nobody will ever read that appendix, but nonetheless it is worth putting together. You can also include a notice that the GPL code is available for download from the following URL... so if despite your best efforts the company does get sued, you'll have something concrete to point to in defense.
So your company might face a damaging lawsuit and a PR shitstorm because Freddy Freetard did not RTFM. Wow, just wow! To all companies out there considering using FLOSS, you may want to look at alternatives. There are very good proprietary systems out there, like QNX. If you need a decent, gratis (i.e. zero-cost) operating system, you might want to look at the *BSDs. "Free Software" is more trouble than it's worth.

Thursday, October 30, 2008

Rants and Laughs 3

Once again, I present "Rants & Laughs", or, as it should be called, the best of Linux Reddit.

  • Here is an article detailing the hackishness of initrd. What?! Ugly hacks in Linux?! Say it ain't so, Pa; say it ain't so!
  • How has Linux surprised you? After 15 years of development, it still has not achieved 1% market share.
  • Open Source Makes New Inroads in Asia and Sardinia. Wow! Linux has gotten a bunch of government bureaucrats to issue a formal statement! Linux is truly on the cusp of world domination now!
  • Here is a handy-dandy guide on how to make money with open source software. Condensed version: get the European welfare-state to help you out.
  • A man gives Linux to his unsuspecting fourteen-year-old daughter. I am sure she is really thrilled to have an operating system that is difficult and annoying to use. Someone should call Child Protective Services.
  • HOWTO use your Linux box as a Media Center. My father recently purchased a new PC that could do this. Here is the guide to get it working.
    Step 1: Buy PC
    Step 2: Hook up PC to AC outlet
    Step 3: Hook up cable to back of PC
    Step 4: Turn on PC
    Step 5: Open Media Center
    Windows: 2
    Linux: 0
    Thanks for playing!

Wednesday, October 29, 2008

More W{,h}ine

In my last post, I discussed the criticisms made by a Wine supporter. However, there was one criticism that I did not address. Apparently, you can download older releases. Well, I will try one out now. Since I do not want to compile Wine myself, I would like a precompiled binary. Let's see what I can find.

I now present my handy-dandy guide to Wine installation on Ubuntu!


Alright, first we will navigate to http://www.winehq.org/.




Now, we will look at the page to see where we can download Wine binaries. Then, we click on "Get Wine Now" in the "Download" panel on the left-hand side of the screen.



Now, we arrive at the download page. We will look through all the offered binaries to find which one we want.



Since Ubuntu is the distribution I use, we will click on that one.



Now, we are at the download page for Ubuntu binaries. Everything looks good, right?



Hmmm... Let's read the fine print.

Warning: These are beta packages

The packages here are beta packages. This means they will periodically suffer from regressions, and as a result an update may break functionality in Wine. If the latest stable release of Wine (currently Wine 1.0.1) works for you, then you may not want to use these beta packages.

Hmm... These packages must be kissing cousins of the one that broke audio in Fallout 2. Well, I guess I could try them if I really had to, but let's see what else is available.

At the very bottom of the page, we find something interesting.

Older .deb packages

Since the APT repository can only hold the latest packages, older versions of the packages are available at the WineHQ .deb packages archive.

You can install downloaded packages by double-clicking on them.

Well now, apparently you can install older, more stable versions. They mention them only at the very bottom because . . .

Now, click on the link http://wine.budgetdedicated.com/archive/index.html.

As of (2008-10-29-08-48 EST), you will see the following page.

Well, that's no good! I guess we will have to try the beta packages after all!

After scrolling to the top of the page, we will read the directions.

Open the Software Sources menu by going to System->Administration->Software Sources. Then select the Third Party Software tab and click Add.

[Screenshot: System->Administration->Software Sources, "Third Party Software" tab]

Then, copy and paste one of the lines below depending on which version you are running.

For Ubuntu Intrepid (8.10):
deb http://wine.budgetdedicated.com/apt intrepid main #WineHQ - Ubuntu 8.10 "Intrepid Ibex"

For Ubuntu Hardy (8.04):
deb http://wine.budgetdedicated.com/apt hardy main #WineHQ - Ubuntu 8.04 "Hardy Heron"

For Debian Lenny (5.0):
deb http://wine.budgetdedicated.com/apt lenny main #WineHQ - Debian 5.0 "Lenny"

Well, that looks easy enough! Since I am using 8.04, our entry should look like this.
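(If you would rather not click through the Software Sources dialog at all, the same line can presumably be appended from a terminal instead; a minimal sketch for Hardy, assuming APT on 8.04 reads /etc/apt/sources.list.d/ and that the budgetdedicated.com host answers, which is not a given around here.)

$ # Drop the WineHQ repository line for Ubuntu 8.04 "Hardy Heron" into its own sources file
$ echo 'deb http://wine.budgetdedicated.com/apt hardy main' | sudo tee /etc/apt/sources.list.d/winehq.list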



Swell, now that should do it, but not quite. Let's read more instructions.

After adding the repository, you also need to add the key for the repository to your system's list of trusted keys.

Download and save Scott Ritchie's key to your desktop. Then open the Authentication tab, click Import key file, and select the key file you just saved (Scott Ritchie.gpg). It is safe to delete this file after doing this step.

[Screenshot: System->Administration->Software Sources, "Authentication" tab]

Click close to finish, and then reload the package information. If you have Wine installed, the system's update manager will now inform you of the latest Wine beta release and prompt you to upgrade. If you haven't installed Wine yet, go to Applications->Add/Remove and search for Wine.
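For the terminal-inclined, the key import can also be done with apt-key instead of the Authentication tab; a minimal sketch, assuming the key URL from the instructions actually answers (spoiler: keep reading).

$ # Fetch the repository signing key and add it to APT's trusted keyring
$ wget -O winehq.gpg 'http://wine.budgetdedicated.com/apt/Scott%20Ritchie.gpg'
$ sudo apt-key add winehq.gpg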


So, we have to download a GPG key, eh? Okay, let's download the key from http://wine.budgetdedicated.com/apt/Scott%20Ritchie.gpg. As of (2008-10-29-09-06 EST), you should see the following:

$ wget http://wine.budgetdedicated.com/apt/Scott%20Ritchie.gpg
--21:07:16-- http://wine.budgetdedicated.com/apt/Scott%20Ritchie.gpg
=> `Scott Ritchie.gpg'
Resolving wine.budgetdedicated.com... 81.171.111.184, 81.171.111.184
Connecting to wine.budgetdedicated.com|81.171.111.184|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset by peer) in headers.
Retrying.

--21:10:34-- http://wine.budgetdedicated.com/apt/Scott%20Ritchie.gpg
(try: 2) => `Scott Ritchie.gpg'
Connecting to wine.budgetdedicated.com|81.171.111.184|:80...
Repeat ad infinitum

Well, now that is no good! Maybe APT can work without it. Let's find out. Click the "Close" button in the bottom-right corner of the "Software Sources" window. Then, click the "Reload" button in the menu that pops up.



Click on the "Close" button in the two windows to complete the installation.

That wasn't too hard, was it? Now, you should have a stable, fully functional Wine that is capable of running any Platinum-rated application you throw at it . . . . . or not.

EDIT: I changed the parentheses to curly-brackets to better reflect a list of possible matches in a regular expression. Comment if this is not correct.