
The CyberLizard


Gone Jun. 24th, 2005 @ 10:43 am
Since LiveJournal now supports tags, I don't want to have to log in to another account to post tech items. Because of this, I'll keep this account active just for the older entries, and will post tech items on jmibanez from now on.

Ubuntu: Installation to Desktop Mar. 16th, 2005 @ 11:23 am
I'm a bit impatient waiting for my order of CDs to arrive from Canonical (I still want them, if only because the packaging is good and I prefer original CDs), so I asked my cousin to download and burn the Hoary preview for me. Yesterday, I got the CD and last night I installed it on my relatively-ancient PC[1].

I've installed four other distros before: Mandrake, Slackware, Red Hat, and Debian (five, if you count Fedora as a separate distro). I can't give a comparison of all the distro installers as they are now, though — the last time I installed Mandrake or Red Hat was when I first started out.


The first thing that impressed me was the boot help screens (accessible by pressing F1 through F10). Each screen had really useful information. For instance, two screens listed commonly problematic hardware and the Linux kernel options you had to pass to get it to work or be detected. There was a screen with information about Ubuntu, as well as the four possible setups you could run (desktop or server installation, each at either the 'default' or 'expert' level). There was also a screen listing the minimum hardware requirements for Ubuntu.

I opted for a desktop installation at the default level. Unique to my box, I had to append 'mem=64M' to the kernel command line due to some flaky RAM. (I plan to get surplus EDO DIMMs sometime soon, if I have the time and the money.) Anyway, I initially tried 'mem=32M' to see just how little RAM I could install with. For some reason the installer kept dying, and console 4, which had syslog piped to it, showed that the binary was dying from lack of memory. Go figure. This might be fixed in the actual release though — mind you, I got the preview ISO, not the release.

I didn't figure out how to set up swap on the fly during the install to make it install on a 32MiB box (too lazy). In any case, IMHO you should have more than 32MiB of RAM to install as painlessly as possible.
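
For what it's worth, setting up swap on the fly from an installer shell boils down to something like the following sketch (paths and sizes are illustrative; on a real install you'd do this from a spare console as root):

```shell
# Hypothetical sketch: create a file-backed swap area to give a
# low-memory box some breathing room during installation.
dd if=/dev/zero of=/tmp/extra.swap bs=1M count=64   # 64 MiB backing file
mkswap /tmp/extra.swap                              # write the swap signature
# Activating it requires root (which an installer shell already is):
# swapon /tmp/extra.swap
```

A dedicated swap partition with mkswap/swapon works the same way, minus the backing file.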

The bootup proceeded fine, detecting the hardware necessary to begin. The first few prompts were similar to Debian Sarge's installer (same pedigree?): language and keyboard settings. (Interestingly enough, the installer has an i18n setting for Tagalog — I should try that one of these days.) The cool thing about the keyboard prompt is that there was a menu option to detect your keyboard. I don't know how it goes about it for other keyboards, but on my system I was given a series of prompts asking me to press the key for one of several characters displayed. If you have a Dvorak keyboard and didn't know it was a Dvorak keyboard (maybe you're a newbie? ;), this is useful.

One thing I noticed during the installation was that it loaded a whole ton of modules. For example, instead of detecting what IDE driver to use for my system, it installed all IDE drivers. I don't have a USB bus on my box, but it installed all three USB host controller drivers (ohci, ehci, uhci).

Questions were asked about my timezone, etc. The installation proceeded quite similarly to Sarge's, in terms of prompts and whatnot. What differed, though, is that the installer did not prompt me for packages to install — good for the newbie, bad for the control-freak power user (who should use 'expert' instead, ya know? ;).

When the base system was set up, the installer copied a truckload of packages from the CD to /var[2], then informed me that the system had to reboot[3]. It took a little under an hour from start to first reboot.

At reboot, the installation proceeded by installing the packages copied to /var, which took about two hours. So, I proceeded to make myself a cup of tea[4].

The installer finished the second part of the installation with some minor hitches[5], but that was due to my box's specs.

One thing that should surprise a lot of people: on Ubuntu, you are not asked to provide a root password. In fact, the root account doesn't have a password (/etc/shadow has '*' in there) and can't be used to log in directly. However, the first account the installer prompts you to create is added to the 'adm' group (among other groups), and /etc/sudoers allows 'adm' to run any command as root via sudo. This means you can simply 'sudo su -' to get a root shell. (I believe most of the GUI tools that need root are configured to launch gksudo instead of gksu.) If you really want a root password, you can always 'sudo passwd' and type in a new one.
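
The sudoers arrangement described above amounts to something like the following entry (a sketch of the mechanism; the exact line Ubuntu ships may differ):

```
# /etc/sudoers (sketch; the actual shipped entry may differ)
%adm ALL=(ALL) ALL
```

With an entry like that in place, any member of 'adm' can run 'sudo su -' for a root shell, or 'sudo passwd' to give root a real password.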

I decided that I wanted to grab some packages online — which presented a hitch. I have a really crappy winmodem[6], which means I had to boot a 2.4.x kernel I had already configured with the binary driver (sucks). So I rebooted to Slackware and fixed LILO (unifying my Slackware and Ubuntu /boot directories onto a single partition, copying lilo.conf to /boot, and adding a symlink in /etc, so I can run LILO from both distros), and crossed my fingers. Thankfully, the system booted up, and I had my custom 2.4.x kernel booting my Ubuntu installation[7].
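
The unification step amounts to keeping one lilo.conf under /boot with a symlink from /etc. Here's a demonstration in a scratch directory (the real steps touch /etc/lilo.conf and /boot/lilo.conf, need root, and finish with rerunning lilo):

```shell
# Scratch-directory demo of the layout; 'scratch/' stands in for '/'.
mkdir -p scratch/etc scratch/boot
echo "boot=/dev/hda" > scratch/etc/lilo.conf      # stand-in config
mv scratch/etc/lilo.conf scratch/boot/lilo.conf   # the one real copy lives in /boot
ln -s ../boot/lilo.conf scratch/etc/lilo.conf     # symlink back into /etc
# On the real system you'd then reinstall the boot loader: lilo -v
```

Both distros mount the same /boot partition, so each one's /etc/lilo.conf symlink resolves to the same file.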

Dialing out with the out-of-the-box installation means configuring wvdial, the dialer included with the system. (Later I thought of getting kppp so I could share my kppprc between the two distros, but decided against it since I'd have to grab a lot over the dialup link. Not feasible. KDE and related apps aren't included on the CD, but are available for apt-getting.) I bumped into some snags: wvdialconf, the autoconfigurator, refused to acknowledge my /dev/modem symlink to the SM56 device /dev/sm56, so I had to hand-hack /etc/wvdial.conf; the modem was apparently also fudging the CD (carrier detect) line and confusing wvdial on dialup, so I had to switch carrier checking off.
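
A hand-hacked /etc/wvdial.conf along those lines would look roughly like this (the ISP values are placeholders; only the Modem path and the carrier-check setting reflect the situation above):

```
[Dialer Defaults]
Modem = /dev/sm56
Baud = 115200
Init1 = ATZ
Phone = <ISP access number>
Username = <ISP username>
Password = <ISP password>
Carrier Check = no
```

'Carrier Check = no' is what tells wvdial to ignore the flaky carrier-detect line.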

When I went to edit /etc/apt/sources.list using vi, I was surprised to see several commented entries already there, pointing to the Ubuntu repositories. Double plus good for relative newbies — you can always tell them to simply uncomment those lines and do an update.

Anyway, I ran aptitude and grabbed several packages I needed — in particular Emacs (<evil grin>) and Ratpoison. I have been trying keyboard-heavy WMs recently; I previously tried Ion, but found its window-management philosophy too complicated. Ratpoison is (unfortunately?) very, very lightweight and very, very simple to use[8].

A note on the Ubuntu package repositories: Ubuntu apparently has two repositories in general: the supported packages repository and the 'universe' repository. Unlike Debian, there isn't a 'main' or 'contrib' per se; 'main' roughly maps to the supported packages, and 'contrib' roughly maps to the 'universe' packages. AFAICT, there aren't any nonfree packages.

Anyway, I got Ratpoison set up properly, and am now running a desktop configuration similar to the one on my Slackware installation.

I did give the GNOME desktop a more-than-cursory glance though. Hoary has GNOME 2.10, IIRC, and the chrome is purty. I'll probably try it out one of these days, just for comparison's sake— my Slackware installation only has GNOME 2.4, and a semi-ept install at that.

The choice of Debian as a base is, I think, a big win for Ubuntu. The package management is great, and configuration and setup were painless largely because of it.

Although the Ubuntu package repositories aren't as extensive as Debian's (and that's quite an understatement), the packages are pretty much what most users will be downloading.

I do have some qualms about the installation, though. I'm a minimalist (and masochist) at heart, and so found it a little distasteful that several packages were installed by default. (Then again, I should have been playing with the expert setup instead.) For example, USB tools were installed even though I don't have USB, and the Samba client was installed even though I don't have a network card. In all fairness, though, the packages installed are not enough bloat for me to ditch the distro. <flame>Mandrake has it far worse, IMHO, and both it and Red Hat are the worst bloat offenders.</flame>

I heartily suggest you give it a try. Now if only the CDs would arrive soon... so I can critique the packaging. ;)

  1. Specs at I didn't wipe out the Slackware installation, mind you. I still had about 10GB unpartitioned space free beforehand.

  2. If you have /var set up in a separate partition, make sure you have around 300-400MiB for it.

  3. Since I already had Slackware installed on the box and I was using XFS for my filesystems, the installer opted not to use GRUB, and I opted not to install LILO on the MBR; however, when I tried installing to the partition boot record, it simply died. I fixed that later on.

  4. I bought a box of blackcurrant-flavored black tea (Twinings) — really good. I suggest you try it.

  5. The config was borked; the setup script detected my onboard SiS 5597 (which was disabled in the BIOS) instead of my S3 Savage4. I manually ran xorgconfig to fix it.

  6. Motorola SM56, here for a kludged driver, or see here

  7. I had to recreate some nodes in /dev for my Ubuntu root though, as Ubuntu has udev doing the dirty work for /dev.

  8. Think of it as screen(1) for X11.

Jan. 3rd, 2005 @ 07:24 pm
You get all sorts of stuff from browsing Slashdot. Of course, I think most people who read this blog already know about some of the stuff there...

Anyway, I bumped into a nice and interesting collection of puzzles (which may come in handy for small companies looking for tech interview questions and brainteasers). Fun stuff.

Cut With The Grain May. 24th, 2004 @ 02:10 am

I've been recently tasked to design part of an accounting system for one of our clients. Implementation-wise, I'm working with the Microsoft .NET framework (C# being the preferred language) using ASP.NET.

One part of the system involved an editor component with several pages, each page being a separate step that needed to be done. Each page had its own set of web controls and form fields, and separate logic for processing the submitted data. So the ASP.NET Page Controller mechanism seemed quite natural.

However, I wanted to centralize some common controller accounting (i.e., shared state and such) and enforce restrictions so the user couldn't access the editor page stages directly; the current step for the record being edited was kept in the DB anyway, for reasons important to the design and the business rules.

Naturally, I thought of doing it as a Front Controller in ASP.NET; I would then transfer control to the view behind the scenes to render the page and to have that page controller do the necessary logic.

My mistake. I was not cutting with the grain. ASP.NET encourages you to use the Page Controller design, what with code-behind classes and all. On the other hand, being a Java programmer, I was used to calling RequestDispatcher.forward() from a controller servlet to forward to a view JSP or whatnot, keeping the name and location of the view JSP hidden from the user. Apparently, when you use ASP.NET web server controls (System.Web.UI.WebControls), generated postback events go to the absolute URL of the .aspx file that generated them; since postback depends on the <form> tag, ASP.NET sets the action attribute to point to the .aspx of the actual page that hosts the control.


SQL to generate frequency counts May. 13th, 2004 @ 11:17 pm

Question: Say you had the following tables:


How do you generate frequency counts grouped by status, in such a way that the frequency counts are columns of a row with the group_id, given three status codes (0 == ok, 1 == fail, 2 == pending)? That is, how do you get to the following table:


The following SQL should do the trick:

SELECT g.Group, ok_count, fail_count, pending_count
FROM Group g
LEFT JOIN (
   SELECT group_id, COUNT(status) AS ok_count
   FROM Item
   WHERE status = 0
   GROUP BY status, group_id
) ok
ON g.group_id = ok.group_id
LEFT JOIN (
   SELECT group_id, COUNT(status) AS fail_count
   FROM Item
   WHERE status = 1
   GROUP BY status, group_id
) fail
ON g.group_id = fail.group_id
LEFT JOIN (
   SELECT group_id, COUNT(status) AS pending_count
   FROM Item
   WHERE status = 2
   GROUP BY status, group_id
) pending
ON g.group_id = pending.group_id

Current Mood: accomplished
Other entries
» Hibernate anecdote

On one project I was assigned to recently, I used Hibernate to manage data access. I even explained the whole deal to my technical manager and my project manager— with some level of excitement. (Okay, maybe with a lot of excitement— think a kid in a candy store).

Anyway, there was one time when we had to demonstrate the working system to the client, and we fielded some concerns they had. Part of the deal was that we had to transfer the code to them so they could make necessary modifications. Apparently, they were planning to change some parts of the database (and change it on their own), so they naturally inquired about the file(s) containing the SQL statements. These guys came from a PHP/scripting background, and they thought JSP was just a variant of PHP, for Java.

I definitely had a hard time explaining to them that the SQL was generated dynamically by Hibernate— I did point them to the mapping documents and the classes I used...

» CSS is the way to go

Was showing a cow orker some nifty CSS stuff yesterday. In fact, I was practically gushing over the subject. In particular, I was gushing over this site and the fact that all the designs are CSS-based, and are using the same HTML content.

I then realized that a lot of mainstream sites still aren't completely CSSified. The company I work for isn't using CSS extensively for site design; we currently don't have the right skillset for it. It pains me that there's a lot that can be accomplished with CSS (accessibility out of the box, leaner pages, easier-to-maintain site design, separation of concerns), but we're not taking advantage of it. All we're doing with CSS is specifying font sizes and colors. Bah.

It also makes me wonder— are most web development firms, at least locally, not CSS-ready? Is there a lack of CSS designers and professionals?

Bah. I'll stick to coding Java.

» (No Subject)
The one thing I like about Mozilla Firefox is that I can open several sites in tabs as part of my home page settings. <grin> To use it, simply separate the sites you want opened with the '|' (pipe) character in the home page Location: text box (Tools > Options, General section).
» The Economics of Open Source

Some links I found recently, after a long and lively debate I had with a co-developer about the economics of open source software. I was for open source, and my colleague was for closed source. He was basically arguing that open source fundamentally can't be sold (i.e., there is no business there, and open source will die out). I was arguing that there is a business model in there somewhere.

Looking back, after reading the links below, I admit that some of my arguments were flawed. We were both right, and both wrong. There is a business in open source software, but it's not necessarily in selling the end product.

It's about adding value. ESR points out that software is "largely a service industry operating under the persistent but unfounded delusion that it is a manufacturing industry," which I can agree with.


» Revisiting Java via C#
Lately, I've been working on a project in Microsoft Visual C#. The language closely resembles Java (especially in certain parts of the class library exposed by Microsoft .NET), but its heritage is more C++ than Java. Inheritance uses the same syntax as in C++, and both interfaces and classes are inherited this way, unlike in Java with its extends and implements keywords.

However, one thing that nags me about C# (and has caused me to rethink things in Java) is that all exceptions are unchecked. (Offhand, someone on the web mentioned this while comparing Java and C#; I can't remember the URL though.) This means that if you call a method that may throw an exception (for example, ADO.NET code may throw SqlException) without catching it anywhere, then somewhere down the line your application/system will die a violent death when the exception does occur. It also means you have no idea where exactly to put your try..catch blocks, and you end up catching Exception, just in case. Ugly.

This also means that you have to explicitly add documentation to your code to mention that you are throwing certain exceptions. (Reading through the code will not help, unless you intend to read several levels of files/classes deep).

And, grudgingly, I'll admit that checked exceptions are elegant and needed. Unchecked exceptions do remove some of the bother of having to write try..catch blocks, but they also mean that exceptions go undocumented. Certain exceptions are worth being unchecked, but not all.

Of course, since I am not yet completely a C# developer, I may have missed out some of the more obscure corners of the language.

(On a somewhat unrelated note, I found myself typing C$ instead of C# while writing this post. Interesting... hehehehe)