Posted on Jul 9, 2011

Type Malayalam on iOS

The KeyMagic folks are back with a bang again, making it work on jailbroken iOS devices. The Malayalam layouts are already part of the default KeyMagic installer, which means that you get the option to type in Malayalam if you just install KeyMagic.

Thant Thet has just come out with the iOS installer, and the Malayalam layout on iOS will have to be tested by someone who owns a jailbroken iPhone. Volunteers, please.

Android should be the next target, I guess. Enough complaints have already been raised that it is not possible to type Malayalam on Honeycomb. jKeyMagic, which should work on all JavaScript-supporting browsers, might provide some solace till a proper solution is in place (disclaimer: I have not tested jKeyMagic yet).

Posted on Mar 20, 2011

Type Malayalam on Mac OS X: Phonetically, Naturally

From the time I moved to the Mac, I have been rather upset about not having access to a good interface for typing Malayalam, though the reading problems were more or less solved by some excellent instructions. I really missed Mozhi Keyman, and Google transliteration was not a very good replacement, as text had to be copied from the browser every time.

So, I was excited to hear from Harold James of Workers Forum about the new transliteration service, KeyMagic, and the Malayalam phonetic keymap for KeyMagic created by Junaid. Junaid has created Malayalam keymaps for KeyMagic, covering both the phonetic scheme Mozhi and the native keyboard layout Inscript, and he has a packaged installer for Windows.

It was sufficient to add Junaid’s keymap files to the OS X IME package published by the KeyMagic team to make Malayalam typing possible on my Mac. The packaged app is available for download here.

To install, just follow the instructions below:

1. Copy the app to ~/Library/Input Methods.

2. Log out and log in again (or just restart).

3. Open System Preferences, go to Language and Text, and enable KeyMagic in the Input Sources tab.

4. Check the option Show input menu in the menu bar.
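If you prefer the Terminal for the copy step, it can be sketched as below. The bundle name “KeyMagic.app” and its location under ~/Downloads are my assumptions – use whatever the downloaded app is actually called:

```shell
# Hypothetical helper: copy an input-method bundle into the per-user
# Input Methods folder. Bundle name and source location are assumptions.
install_ime() {
  src="$1"                             # e.g. ~/Downloads/KeyMagic.app
  dest="$2/Library/Input Methods"      # per-user Input Methods folder
  mkdir -p "$dest"
  cp -R "$src" "$dest/"
  echo "Installed $(basename "$src") into $dest"
}
```

Run it as `install_ime ~/Downloads/KeyMagic.app "$HOME"`, then log out and back in as in step 2.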

To type in Malayalam:

1. On the menu bar, click on the input menu and select KeyMagic as the current input source.

2. The input menu on the menu bar now shows the options Malayalam-Mozhi and Malayalam-Inscript. Select Malayalam-Mozhi for phonetic typing, and Malayalam-Inscript for the Inscript keyboard layout (the one used by CDAC LEAP/GIST software).

I just tested this on Snow Leopard, but it should work on Leopard and Tiger as well, by following the installation instructions given along with the OS X installer on the KeyMagic site (download).

അപ്പോ, തൊടങ്ങ്വല്ലേ  മാക്കിൽ മലയാളം ടൈപ്പിംഗ്?

Posted on Jan 30, 2011

Re-enable right click and copy-paste on OnlineSBI

I have been thoroughly frustrated by not being able to copy-paste payee account numbers and addresses into the SBI online banking website. Having to type everything by hand takes a long time, especially for a product of the cut-paste generation like me. On top of that, entering account numbers by hand is error-prone.

While I curse the feature, I can understand when my good friend @srijitm tells me that these features prevent people from inadvertently copying passwords (and other sensitive information) to the clipboard which can be accessed by someone else using the PC or through a hack.

So I drop plans to create a BeatOnlineSBI extension to help the world beat the evil designs of the bank.

But I am still frustrated, so I download and install the Anti-Disabler Greasemonkey script (thanks to Mark Pilgrim), which re-enables the right-click functionality. This does not solve the cut-paste problem, though, as the fields have oncopy and onpaste handlers set to false. A couple of lines added to the Anti-Disabler script does the trick:

// clear the inline handlers that disable copy and paste on the field
if (e.getAttribute("oncopy") == "return false") e.removeAttribute("oncopy");
if (e.getAttribute("onpaste") == "return false") e.removeAttribute("onpaste");

Now I can freely copy-paste stuff in and out of the online banking site, as long as I use Firefox to access it. Making the script work on Safari and Chrome is for another day.

Posted on Jan 29, 2011

Why Twitter Not?

I am back on Twitter, more than 6 months after I made a conscious decision to keep away from it for a while.

I was one of Twitter’s earliest users, but I did not really feel the need for it until I left Cisco, its vast network of brilliant people, and the constant exchange of ideas and information. The move was multiple relocations rolled into one – the relocation from the swarming Cisco cloud to a decrepit 2-man office; from the confined yet anonymous, responsibility-free existence of a nuclear family in Bangalore to the satisfying, distraction-filled joint family existence back home; from the cutting edge of networking technologies to bleeding at the edge of biotechnology without formal training; from the comfort of concentrated technical work to the distractions of administering a start-up. This coincided with Twitter’s bursting upon the global scene, the network effect finally taking it past that critical velocity below which new web startups live constantly resisting crashes and burn-outs. Cut off from the IT network, I had Twitter as my saviour, my new link to my old world, which people kept calling the new world.

But Twitter was new territory. I was excited by the freedom of being able to write down thoughts at will, to be read by an actual audience that provides instant feedback. The 140-character limit was counter-intuitively liberating; the freedom from grammatical structure and increased opportunity to play with words brought in a freshness that was exhilarating. Twitter was the Liril of the Web.

The whole world stumbled on it, new opportunities and new pitfalls were discovered, sometimes by people of eminence who suddenly found themselves on slippery slopes. Though insulated by obscurity, I too discovered that Twitter was not the dawn of a whole new free, unselfish world. I learned that it was very easy to shoot my mouth off without assessing consequences. I learned that being frank on Twitter poses the same dangers brought about by naïve frankness in life. I learned that recording my every action and thought every day on an online platform opens me up as a target of analysis – Twitter is not really a diary or a tool for confessions. I learned that using Twitter with absolute frankness takes away my power to lie, an essential life-skill. I learned that my personal opinions on politics, society and culture, aired freely on Twitter, build a profile that might not fit the image that my business wants me to project as its representative.

Yet I could not write without being frank – my writing, however limited it might be, has to reflect my opinions and values. Yes, it has to, as long as I can clearly mark it as personal writing unconnected with other entities that I represent. As long as I can be sure that what I write is my opinion, what I would like to be recorded and published as my opinion, not just a fleeting thought aired in a conversation.

Twitter is a powerful communications and networking tool. It is a great business application. But unless I remain anonymous, lines drawn in life shall extend to online life as well. For Second Life is also Life.

I am back on Twitter.

Posted on Apr 8, 2006

PDF creation tools

I love PDF documents. They give excellent print reproduction and are very convenient to distribute.

I have been using a free PDF writing tool called PDF995 for converting documents to PDF. It installs as a printer and can be selected from the Print dialog. It then asks for the file name to be used for saving the PDF, converts the PostScript to PDF and writes the PDF file.

The sponsored version of PDF995 pops up a dialog box and a couple of web pages which connect to their web site to display promos. This is a little irritating, but I do not complain, the tool being free.

Yesterday, I came across another free tool that does the same without the pop-ups – CutePDF. It also needs GhostScript to be installed for PS to PDF conversion.

I did a quick test to compare PDF995 to CutePDF using a Powerpoint slide with some text and vector graphics. The size of the output documents was largely the same, PDF995 output being slightly larger. Output quality was also comparable – at smaller font sizes, PDF995 gave slightly better output, especially if the text was bold. For now, I am sticking with PDF995 – I create PDF docs only once in a while and I can bear with a couple of pop-ups for slightly better quality.

Posted on Mar 6, 2006

Ubuntizing my PC – Part 2

In my last post, I wrote about how I installed Ubuntu on my old PII 266. It had a DLink DWL510 PCI card installed on it, but the card was not detected by Ubuntu on installation.

This post is about my weekend project to get the WLAN card working, so that I could connect to my home WiFi and the Internet from the PC.


I sat down to get the card working on Saturday afternoon, and was hoping to get it working by evening.

1. The first step was to run ‘lspci -v’ to see if the card was detected. It was detected as a RaLink device, but was listed as ‘not recognized’. Most likely, the driver was missing. RaLink 2500 drivers are auto-detected in Ubuntu 5.10, so I wonder what went wrong. None of the searches I did turned up information suggesting that DLink used another RaLink chipset, and I assumed (Bummer 1) that my card had the same 2500 chipset. I could have quickly confirmed this by checking the Windows driver CD that came with the card, but I did not care.

2. The next logical step was to use NdisWrapper. This is a generic driver that implements a wrapper for the Windows NDIS (Network Driver Interface Specification) API over the Linux kernel, so that Windows drivers can run on Linux. I downloaded ndiswrapper v1.10 using my laptop and copied it over to the PC using a CD.

3. A search in the Ubuntu forum gave a good tutorial on setting up ndiswrapper. The first thing to do was to get the Linux kernel headers, gcc and make. The Synaptic Package Manager (SPM) that comes with Ubuntu is pretty good – it is very similar to the package manager that comes with Cygwin. A couple of clicks, and I had installed all the required components.
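For the record, the same components can be pulled in from the command line instead of Synaptic – a sketch, with package names assumed for Breezy:

```shell
# Sketch: fetch the toolchain and kernel headers needed to build
# ndiswrapper (package names assumed for Ubuntu 5.10).
install_build_deps() {
  sudo apt-get install build-essential "linux-headers-$(uname -r)"
}
```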

4. Ran ‘make install’ in the ndiswrapper source folder – and here comes the first error: gcc-3.4 not found. Checked SPM, but gcc 3.4 does not come with the Ubuntu install CD. What I had was gcc 4.0. Assuming that gcc-4.0 should not be too different from gcc-3.4, I did the following (Bummer 2):

cd /usr/bin
sudo ln -s gcc-4.0 gcc-3.4

Ran ‘make install’ again, and the error went away.

5. Next error: “error: loadndisdriver.c:error:stdlib.h: no such file or directory”. A quick Google search revealed that these headers are part of the glibc package. glibc source and the related headers are not available with SPM; only the binaries are there with libglib. So I downloaded the glibc package for Breezy from the Ubuntu site, and copied it to the PC.

6. Ran ./configure in the glibc source folder, and it failed with the error “compiler cannot create executables”. config.log gave the actual error as ‘crt1.o not available’. Another Google search – libc6-dev needs to be installed. Fortunately, SPM has this package.

7. Ran ./configure again, and it failed with the next error: ‘error: you must have gettext support in your C library or use the GNU gettext library’. SPM has gettext also, and configure succeeded after installing it.

8. Ran ‘make’ and ‘make install’ in glibc source folder, and they went quite well.

9. Ran ‘make install’ again in ndiswrapper source folder, and the driver built successfully.

10. Now, I had to install the Windows driver with ndiswrapper. I loaded the DLink driver CD in the CD drive and checked the driver, and it was showing up as rt61. I had not heard about this chipset, so I thought I would do a quick Google search on it. Surprise! I should have checked this earlier – RaLink provides a native open-source Linux driver for the rt61 chipset (also called rt2561). So I dropped ndiswrapper and set about getting the native driver to work.

11. I downloaded the rt61 drivers from RaLink, and copied them over to the PC. The ‘readme’ file in the source folder is pretty good and contains clear instructions on building and installing the driver – though not specific to Ubuntu.

12. I have a 2.6 kernel, so I used the appropriate Makefile, and ran ‘make all’ – the driver built successfully.

13. The next step was to run dos2unix on the driver config file (rt2561.dat), but I did not have dos2unix. It is part of the sysutils package, which does not come with SPM, so I downloaded it and copied it to the PC. sysutils is not a single package but contains multiple packages – tofrodos (fromdos, to be exact) is the one to use instead of dos2unix, and I ran that.

14. Then I inserted the driver using ‘insmod rt61.ko’, and boom came the error: ‘invalid module format’. This essentially means that the headers used for building the kernel and the driver are different, but I was not really sure what went wrong. Web searches did not give anything conclusive either.

15. But there is an unofficial version of the rt61 drivers available on the internet. I downloaded it, and copied it to the PC. The following steps went well:

sudo cp *.ko /lib/modules/`uname -r`/
sudo depmod -a
sudo modprobe rt2561

Again the same error – invalid module format. 🙁

16. This was unfortunate, and I decided to go back to ndiswrapper – after all, I was almost done on that.

cd /cdrom/drivers/driver/
sudo ndiswrapper -i rt61.inf
sudo ndiswrapper -l

Good – it shows that the driver is present and that the hardware is up and running.

sudo modprobe ndiswrapper

Damn! The same error – invalid module format.

17. There is something that I am missing here – so I ran ‘sudo dmesg’ (should have done that before), and examined the output. The cat was out of the bag – the module’s ‘version magic 2.6.12-9-386 386 gcc-4.0’ should be ‘version magic 2.6.12-9-386 386 gcc-3.4’ – the Breezy kernel is compiled with gcc 3.4. So I should have installed gcc 3.4 instead of linking it to gcc 4.0. And this is why even the native drivers did not work.

But this is crazy! gcc-3.4 should have been bundled with the install CD if drivers linked to the kernel needed to be compiled with it. Bundling just gcc-4.0 and then expecting people to download gcc-3.4 for compiling drivers is not a good idea.
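Had I known where to look, the mismatch could have been spotted before ever running insmod – a sketch:

```shell
# The running kernel records the compiler that built it; a module's
# "vermagic" string must match it, or insmod fails with
# "invalid module format".
cat /proc/version
# For a freshly built module (path assumed from the build step above):
# modinfo rt61.ko | grep vermagic
```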

18. Binaries for gcc-3.4 and other dependencies can be downloaded from the Ubuntu web site, but I did not want to get into another round of compatibility issues, so I decided to download the source and build it. ‘configure’ went well, but I did not select specific components to build (Bummer 3), so the build just went on and on, building Java libraries and a lot of other stuff that I am not sure I needed. At 2am on Sunday morning, I decided to call it quits for the day.


19. The gcc 3.4 build was over by 10am. Ran ‘make install’, and gcc-3.4 went into /usr/local/bin. It would be safer to have ‘gcc’ default to 3.4, since the kernel is built with 3.4.

20. Went back to the native rt61 driver, ran ‘make clean’, and repeated the steps. ‘insmod rt61.ko’, and the interface shows up in ‘ifconfig -a’. Great!

21. Now I got over-enthusiastic, and used the Network Configuration applet that comes with the Gnome Desktop without reading the driver ‘readme’ file (Bummer 4!). Configured the ssid, wep key, and dhcp – whoah! the machine just froze!

22. Restarted the machine.

sudo cp rt61.ko /lib/modules/`uname -r`/ (Bummer 5!)
sudo depmod -a
sudo insmod rt61.ko

The ra0 interface is available now. Used the network config applet to configure the ssid, wep key, and a static IP address – but ping to my wireless router fails. Then I configured dhcp again, and the machine froze again.

23. After this, it was real fun. Every attempt to re-boot would hang at the point where the network interfaces were being configured. A Ctrl-C would let the boot proceed, but all ‘sudo’ operations got stuck, so there was no chance to remove the faulty driver from start-up.

24. Tried booting with the Live CD (oh, the pain! :-(), but could not access the hard disk partitions. After two hours of re-boots and struggles to access the hard disk from the Live CD, I gave up and went down for lunch and watched the India-England cricket match fizzle out to a draw.

25. Then came the flash! There had to be a recovery mode in Ubuntu. I booted up the PC – and there was the Grub menu, which pops up if you press ESC on boot. There was a recovery mode entry – how come I missed it earlier! I booted into recovery mode, and I was in, with root access.

26. Removed the driver from the startup list, and checked the readme file for the driver. It mentions different ways of configuring the card, but the config applet is not among them. So I edit the config file, add the ssid, key etc., and re-insert the driver.

‘sudo iwlist ra0 scan’ does not display any AP. After juggling iwlist and iwconfig for almost an hour, I give up.

27. Removed the native driver, installed the Windows driver using ndiswrapper and tried – same problem – the AP is not seen. It was detected a couple of times in between, but the connection got dropped in a matter of seconds. So I moved back to the native driver.

28. I guess that the problem is a conflict between the driver config file and some existing config, so I check /etc/network/interfaces. There are some configs for ra0 – IP address, ssid, key etc. – left over from my earlier attempts with the network config applet. I remove them and re-insert the driver – and voila! the AP shows up in the iwlist scan. But for some reason, the ssid etc. are not set in iwconfig. I try to set the ssid and key explicitly using iwconfig, but no luck. Then I try to force-associate the card by giving ‘iwconfig ra0 ap _ap_mac_’ – it is associated. I am able to ping the router, so I add a DNS entry in ‘resolv.conf’ and ping by name. That works as well. I quickly pull up Firefox and type in a URL – no luck. Issue ‘iwconfig’ again, and the card is not associated. But the AP does show up in the iwlist scan. Repeated attempts to force the card to associate with the AP as before do not work.

So near, yet so far!

29. By now, I had given up on DHCP, so I was using static IP addresses after configuring my router for static DHCP (so that it would continue to work with my wife’s laptop and mine, which were configured for DHCP).

30. Probably, loading the card on boot-up would make it work. So I copied rt2561.ko to the start-up modules list, and added an entry to /etc/modprobe.d/ (this is different from the instruction in the driver readme file, which asks for this entry to be made in the modules.conf file), restarted the machine and crossed my fingers.

31. Issued ‘ifconfig -a’ on boot-up; the interface was there, and the IP was configured. ‘iwlist scan’ output showed the AP. But ‘iwconfig’ showed that the card was not associated. 🙁 I was at my wit’s end now and, like all such unfortunate folks, I began to hope for some magic or miracle. I re-inserted the driver many times, re-booted repeatedly, but to no avail.

32. It was 10.30 pm, it was the end of my weekend, Nisha was cross with me for a spoiled weekend though she was trying her best not to show it, I could not take Safdar out as I had planned, the train tickets that I had promised my father to book were still un-booked, and the wireless card was not working. Time to sleep, to forget, to promise myself and Nisha that I would not work on this during the weekdays – at least.


33. Went up to reserve train tickets for my father. The PC is sitting there, as the symbol of my defeat. I think that I should try one more thing – remove the driver from boot, write a script to insert the driver and configure it, and run the script explicitly during start-up. Let me try just this one step – and I booted the PC up.

ifconfig -a

Interface ra0 shows up, IP is configured.

sudo iwlist ra0 scan

AP shows up

iwconfig ra0

Surprise! The card is associated!

ping (router)

Ping works!

Start Firefox, type

Whoah! It works! Yahoo!

I browsed for 15 minutes, and re-booted the PC. It works well.

So what went wrong on Sunday? I do not know.

It just needs a good night’s sleep – almost always!

PS: I assume that the card had gone into some race condition during the repeated experiments with the multiple options in iwconfig. And the quick reboot using the restart button probably did not really power it down.
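PPS: The start-up script amounts to little more than the insmod/iwconfig/ifconfig steps above; a sketch, with placeholder SSID, key and addresses (not my real values):

```shell
#!/bin/sh
# Sketch of such a start-up script: insert the rt61 driver and configure
# the interface by hand, bypassing the network applet.
bring_up_ra0() {
  insmod "/lib/modules/$(uname -r)/rt61.ko"        # load the native driver
  iwconfig ra0 essid "myhomeap" key "0123456789"   # SSID and 40-bit WEP key
  ifconfig ra0 netmask up    # static IP, DHCP given up on
}
```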

Posted on Mar 4, 2006

CD Slide show generator for Windows XP

CD Slideshow Generator is a Windows XP Power Toy from Microsoft that helps to create a slide-show on your Photo CDs. The procedure is pretty convenient – copy the picture files to the CD using Windows Explorer, right click on the CD Drive icon and select ‘Write these files to CD’, and the wizard that pops up has an option where the slide show can be enabled. It creates an autorun file which runs the slide-show.

My neighbor wanted to transfer the pictures in her Digicam to a CD, and I used this tool to create the slide-show for her. It works well, and is simple to use.

One feature I missed is an option to rotate pictures while they are being viewed – I had to swivel my head frequently to see the pictures taken in vertical format.

Posted on Mar 2, 2006

editing in Microsoft Word is injurious to your blog

I just opened the Atom feed for this blog in my browser, and I was surprised to see this error:

“Reference to undeclared namespace prefix: ‘st1’. Error processing resource”

A quick search revealed that this was caused by editing my post in MS Word before pasting in the Blogger compose window.

Word added special XML indicating the country around the word ‘India’ and that caused the problem. See below:

If you are in <st1:country-region st="on"><st1:place
st="on"><st1:place><st1:country-region>India and want a copy,
drop a note in the comments, I will ship you one asap.

I edited the HTML for the post and removed the rogue tags, and my site feed is fine now.

Moral: Just use a simple text editor to edit your posts, or use the blogger editor itself. Never use MS Word – it is too ‘intelligent’ for human beings.

Posted on Mar 2, 2006

On page file defragmentation

My laptop was spending a lot of time reading its hard-disk while switching between tasks or invoking a new task. I suspected file fragmentation as the cause and ran the Windows Disk Defragmenter tool. Some big movie files were heavily fragmented, as I had copied them over when free space was running low (29,000 fragments for one 600MB file!), but all the commonly used files were intact. Then I noticed the pagefile (pagefile.sys) – it showed heavy fragmentation, but the Windows Disk Defragmenter cannot defrag pagefiles.

That is where PageDefrag helps. It runs at boot time and can defragment the page file, registry and hibernation file (where the memory snapshot is stored when the computer is hibernated). I ran PageDefrag and it reduced my pagefile from 6900 fragments (!) to 84 – some improvement! And the performance of my laptop has improved considerably.

Anyways, prevention is better than cure, and there is a way to prevent page file fragmentation from happening: create a separate partition and use it to store the pagefile, similar to the Unix swap space. There would be some performance improvement too if this partition is on a separate hard disk that does not contain frequently accessed files.

There is some good information on improving paging performance at:

Posted on Feb 28, 2006

Ubuntizing my PC – Part 1

In my last post, I wrote about Ubuntu.

Why did I order free CDs from Ubuntu?

1. I like the concept,

2. I want to see for myself what it has to offer in addition to what I could get from other distros,

3. I have an old PII 266 desktop PC running Windows 98 and pirated warez, infested with every freeware, virus and trojan out there. I want to clean it up, and make it usable without having to shell out truckloads of money (which I do not have) buying software for it.

4. I am just an amateur user of Linux, and I am strapped for time. So I want a Linux system that is easy to install and use, and will give me true GUI capability, while still permitting me to be adventurous in mucking around with it when I feel like it. Ubuntu seems to fit the bill pretty well.

So, I was pretty excited when I received my Ubuntu CDs, and I set about ubuntizing my PC right away. Here is a rough log.

  1. Ubuntu comes in two CDs – one is the Live CD and the other is the Install CD. The Live CD permits one to run Ubuntu without installing it – just make the CD-ROM the first boot device, insert the Live CD and start the PC, and Ubuntu loads right away. This lets you see what Ubuntu has to offer before you actually install it.
  2. So I loaded the Live CD. It took ages to load (I had expected it to be slow since it was running from the CD, but not this slow), and I ended up at the command prompt with some error messages (fatal server error: no screens found) indicating that it was unable to find the right display driver, though it had correctly identified the SiS card that I had.
  3. Some googling revealed that this could be solved by changing the display driver to a more generic one. This is done by editing the ‘Device’ section of /etc/X11/xorg.conf and changing the ‘Driver’ entry to ‘vesa’. I typed ‘startx &’, and got the Gnome Desktop. It looks good with the default Ubuntu theme.
  4. Next surprise: my Logitech 3-button serial mouse is not working – the mouse cursor is stuck in the middle of the screen. Again, Google (how would I live without it?) tells me that there are ways to make a serial mouse work on Linux, but I have had enough with the Live CD by now, and I think it will take less time to go ahead and install Ubuntu and move back to Windows if it does not work.
  5. Ubuntu 5.10 (Breezy Badger) comes in two variants – a desktop installation and a server installation. A quick check of the system requirements tells me that 128 MB is the minimum for a desktop installation. I have just that much. There is a risk that it might turn out to be like one of those games which never really run on the minimum configuration, but I decide to go ahead anyway.
  6. I marshal all my good data into one folder, install the drivers for my Genelink USB host-host cable on the PC, and connect my laptop to the PC using the USB cable. The folder has 11GB of data, so I issue a copy and go down to have tea. When I come back, I see that the copy has aborted with ‘Bulk write error’. My impatient mind assumes that the USB driver is unable to handle large amounts of data and curses it. So I take the HDD out, put it in a USB casing that I had bought for the DVD writer, and connect it to my laptop. Windows XP on the laptop detects it as a hard disk without my having to install any drivers – I like it. Then I issue the copy and sit to watch over it.
  7. The copy aborts – but this time, the cat is out of the bag. The folder contained some executables compiled from the source that I had written, and I included them assuming that they were virus free. But it seems that they are infested with the Spaces.1445 virus that appends itself to every executable that is invoked. My Symantec Antivirus was blocking the copy operation – and that is why it aborted.
  8. So I run the antivirus on all the files I want to back up – fortunately, there is nothing other than Spaces, which is cleaned by Symantec AV. So I copy the files over to my laptop.
  9. Then I insert the Install CD and boot the PC. After a couple of minutes, I land in the partition manager. The default values allocated by the partition manager do not suit me. After a couple of hiccups – I first forgot to create a root partition, and then a swap partition – I get it right. The hard disk is erased and partitioned.
  10. The rest of the installation is uneventful – I just have to create a username and password. One thing that is quite different in Ubuntu from the Slackware/Redhat/Cygwin distros that I have used is that the root user and password are not available. You log in with your username and use the ‘sudo’ command to run tasks that need root permission (e.g., sudo vi /etc/X11/xorg.conf) – you have to give your password every time for this. I am prone to the bad habit of logging in and working as root on unix machines for the sake of having access to everything, and I have deleted valuable stuff inadvertently in the past, so this restriction suits me quite well.
  11. I have to re-boot once before the installation completes, and then I land at the command prompt because of the display issue that I saw with the Live CD. I change the driver to ‘vesa’ as above, and the Gnome desktop is working.
  12. To make the mouse work, I edit the InputDevice section of /etc/X11/xorg.conf, change the ‘Device’ entry to /dev/ttys0 and the ‘Protocol’ entry to ‘Auto’. It does not work, and I find that it should be ‘/dev/ttyS0’. Still no luck; I check the COM ports, and find that the 9-pin port where the mouse is connected is COM1. So I change the ‘Device’ entry to ‘/dev/ttyS1’, and the mouse works now. Still, it takes some time for the mouse to start working after I get the Desktop, and when I exit X, I can see a message that it could not find the mouse protocol. So I change the ‘Protocol’ entry to ‘Microsoft’, and the delay is gone.
  13. Now, I have a good desktop OS running – Open Office, Evolution and Firefox are among the default applications. There are 4 workspaces, and the links are conveniently grouped into Applications, Places and System menus. The icons are placed conveniently, even a novice should be able to use it without much heartburn.
  14. There are some issues though. The application windows are always bigger than my viewing area, and I have to keep moving them around to get access to all parts. I need to check the desktop resolutions and fix this up. OpenOffice is slow but usable – should be okay given that it is a 266MHz processor.
  15. I have only a PCI wireless LAN card for network connectivity – and it is not detected. I need to figure this out too.
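For reference, the two xorg.conf edits from steps 3 and 12 amount to something like the fragment below. The Identifier names are whatever the installer generated; treat this as a sketch, not my exact file:

```
Section "Device"
        Identifier      "SiS Generic"
        Driver          "vesa"          # generic driver in place of the SiS one
EndSection

Section "InputDevice"
        Identifier      "Configured Mouse"
        Driver          "mouse"
        Option          "Device"        "/dev/ttyS1"    # the serial port that worked
        Option          "Protocol"      "Microsoft"     # avoids the 'Auto' probe delay
EndSection
```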

I have come to the end of Part 1, and I now have an ubuntized PC with a nice X Windows GUI, but without network connectivity. In the next part, I will get the wireless card up and working with WEP on my home wireless network so that I can access the Internet. Till then, good bye! 🙂

The following links were very helpful to me:

  1. Ubuntu Wiki:
  2. On display driver issues on Ubuntu:
  3. Getting a serial mouse to work: