Open source stubbornness at the application level

Recently I came across two instances where open source application developers (well above the kernel and middleware) seem to be playing harder ball than they ought to.

1. The stop-root-login agenda of the Ubuntu Forums

If a user is blocked from exploring the full potential of the OS, how will he ever learn? If you are so concerned about data loss, simply advise storing data in a volume other than /, or backing it up separately if the user wants to play around with the system. Better still, ask newbies to play around with the system for a month without storing important data on it (keep the data on an external hard disk and unmount it while experimenting). That is necessary and sufficient. It's unbelievable how the Ubuntu Forums try to stop users from logging in as root.

I keep my data backed up in a separate volume. I let my wife, who has an Arts background (and no knowledge of computers three years back), use the system with root access for years, and I never faced any major issue. Windows allows Administrator logins, Windows sells, and Windows users get support for their issues. Whether that support is free or not depends on the policies of the OS vendor. Supporting a user on ANY issue is the responsibility of the community itself, and there are numerous people in the Linux community (and, I believe, even on the Ubuntu Forums) who are always ready to help with root-login issues, whether the forum administrators advise root login or not. The forum, if I understand correctly, is not only for learning system administration; it's for learning the system.

2. Nautilus following 1 kB = 1000 bytes

I tried to reason with the application developer to at least provide a switch, and I failed. The solution he offered was "change the source code and compile it yourself", which means (i) I have to install any number of build libraries that I otherwise won't even need, and (ii) recompile Nautilus every time the upstream source changes, or at least whenever major changes land. IMHO, this is no solution at all. Logically, computers use the binary number system, not decimal: a bit can be either 1 or 0, and that's why computation is based on binary/hex rather than decimal, as it involves less overhead for the CPU. 1 KB of storage = 2^10 (1024) bytes is the widely followed convention. Even Microsoft, the largest player in operating systems, follows it, which means the largest number of users are accustomed to it. "ll -h" (i.e. "ls -lh") on the same system still shows the size in the 1024-based convention while an application on that very system reports a different size, and users are supposed to understand the behaviour and learn to live with it. There are two very important aspects of any design: uniformity and flexibility. I wouldn't have bothered had it just been PCManFM, but Nautilus is the default file manager of the Ubuntu system; you can't all of a sudden report file sizes in units different from all the major operating systems!
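The disagreement boils down to two conventions for the same-looking suffixes: 1024-based (KiB/MiB, what `ls -lh` shows) versus 1000-based (kB/MB, what Nautilus adopted). A minimal illustrative sketch in Python (not Nautilus's actual code) shows how the same file gets two different labels, and how cheap the "switch" I asked for would be:

```python
def human_size(num_bytes: int, binary: bool = True) -> str:
    """Format a byte count using either the binary convention
    (1 KiB = 1024 B) or the SI/decimal one (1 kB = 1000 B)."""
    base = 1024 if binary else 1000
    units = (["B", "KiB", "MiB", "GiB", "TiB"] if binary
             else ["B", "kB", "MB", "GB", "TB"])
    size = float(num_bytes)
    for unit in units:
        if size < base or unit == units[-1]:
            return f"{size:.1f} {unit}"
        size /= base

# The very same file, reported under each convention:
print(human_size(1_048_576, binary=True))   # 1.0 MiB
print(human_size(1_048_576, binary=False))  # 1.0 MB
```

A single boolean like this, exposed as a preference, is all the flexibility the request amounted to; the behaviour difference only grows with file size (a "1 TB" disk is about 931 GiB).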

The threat to Linux now is this: users don't understand internals; they interact with applications and want the best user experience. We need a stronger set of architects for user-level applications too, people who can see beyond the immediately available developer solutions and decide on the basis of acceptability.