Archive for the ‘General’ Category

Feed reading – Liferea, Google Reader

Liferea 1.2.10 actually rocks. It does duplicate detection in posts, which makes my feed-reading life a lot easier.


Liferea Duplicate Feed Detection at work

It also supports bookmarking with del.icio.us, which is nifty. It feels a lot faster than the previous version, too. There was an internal database change, and it did the honorable thing and kept a backup of the old one, in case I was planning on rolling back.

All my technology-related feeds have been imported into Liferea, and I’m a happy camper. In my quest to make NetNewsWire even more useless to me, I’ve moved all photography-related blog reading to Google Reader.

Now, that’s a nice piece of software. With access to the Internet, I can read my feeds and have them always “synced” – i.e. I’m never going to have to read an entry twice, or anything of that sort. The only caveat is that I need to actually be online.

So while it’s handy to read Google Reader feeds while I’m sitting in a shop waiting for my take-out, it’s also pretty darn expensive. I think I’m paying something like $4.95 per month for less than 5MB of traffic (or maybe it’s 10MB) with Optus.

Is there software to read Google Reader offline (Linux preferred, Mac OS X is OK, Windows tolerated)? Do Series 60 Nokia phones have such an ability? I ask because soon I’ll have not only the N73 but also an E61i (which has WiFi). If only Liferea read/synced with Google Reader, I could move all my feeds to a cool backend.


OS X inside some virtualized environment

Dear Interweb,

Is there a way for me to install an instance of OS X inside Parallels or VMware Fusion? I currently use VMs for QA work, and when I want to QA packages on OS X, I don’t want to have to reinstall a clean environment on a test Mac every time. And yes, you can find problems in clean-room environments – like MyODBC.

Has PearPC improved, in terms of speed? Or is it dead?

(No, I don’t care if Apple’s licensing is still ridiculous. This is to make my life easier.)


Interview with Bruce Momjian (founder, PostgreSQL)

I found the recent interview with Bruce Momjian (founder, lead architect, PostgreSQL) rather interesting. From it I took away:

  • PostgreSQL has stringent quality assurance. This is because there isn’t the “luxury of putting out a bad release”. He mentions that in the world of open source there is zero tolerance for things that don’t work; I, however, can find many examples contradicting this line of thinking. Release engineering is largely dependent on humans, and humans make mistakes.
  • “People are more confident with us than some of the commercial databases.” I believe this largely depends on how your company is run – tech-oriented or suit-oriented. It’s worse if you’re suit-oriented and largely public. Investors and upper management need someone to blame when things go wrong, and that’s why support services are so great in the open source world. Accountability is key. The ability to fix customers’ problems is key. Going off on a tangent, Michael Meeks, distinguished engineer at Novell and OpenOffice.org hacker extraordinaire, basically said:

    Ubuntu, claiming to ship and support OpenOffice.org, it’s a total joke – they have a part-time packager. At Mandriva, for example, the OpenOffice.org packager is a self-described ‘not a C++ programmer’. So how you can then go and say ‘we’ll support you’… Novell, at least, has people across the board working on the codebase, with a good understanding of lots of issues.

  • Bruce mentions the evolution – from stopping PostgreSQL from crashing, to performance tuning, to enterprise features. He reckons that 8.2 is enterprise-ready, and that 8.3 and onwards will include “revolutionary features that go beyond things you can’t do with other databases”.
  • My favourite quote:

    “If you look in the next five years, PostgreSQL will be a poster child for databases, period,” he said. “There is not really another database that’s enhancing at the speed of PostgreSQL, so what that would look like is hard to say.”

  • Pretty bold statement, eh? There are no roadmaps, as with most open source projects. I actually think that’s a plus point, because roadmaps suit suits but are completely inaccurate most of the time.


rdiff-backup is my backup tool of choice

I decided to actually get backups going. I know, laugh. But I bet that while you snicker, you may not have a great backup system in place either.

Picked up a 160GB 2.5″ disk and an external casing. After careful calculation, it seems like maybe I could have saved money buying a pre-packaged solution. Weird.

Anyways, the tool of choice – rdiff-backup. It’s simply dead easy to use. Just do: rdiff-backup <current> <backup-path>. The initial backup of my roughly 56GB home directory took just under 2 hours for the first-ever backup image.

Restore is something a lot of folk seem to forget. They make great backups, but never test restores. I went cold turkey – moved from Fedora to Ubuntu after a backup. It worked.

The only real problem is that in Fedora I had uid/gid 500, but in Ubuntu I was uid/gid 1000. That’s easily fixable; I restored (rdiff-backup -r), and I have my environment exactly as it was before the format (and distribution switch).

I probably should run these backups daily, and maybe even have more than one drive for backups, but this is me being very appreciative of rdiff-backup. Today I ran it again after over a week.

time rdiff-backup /home/byte/ /media/disk/byte/

real    288m40.943s
user    52m1.951s
sys     12m57.641s

rdiff-backup also works over the network. Those Dreamhost accounts are now starting to look very interesting for off-site storage. Read the documentation and examples if you want to get up to speed really quickly. It’s also OS X compatible. Now if only I could find a sensible Windows backup system…
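For the curious, the over-the-network form uses rdiff-backup’s host::path syntax over SSH. A sketch, with a placeholder host and paths (not my actual setup):

```shell
# Push the home directory to a remote machine over SSH.
rdiff-backup /home/byte/ byte@backup.example.com::/backups/byte/

# Pull a single file back as it was 10 days ago.
rdiff-backup -r 10D byte@backup.example.com::/backups/byte/notes.txt notes.txt

# Keep the remote quota in check by expiring increments older than a month.
rdiff-backup --remove-older-than 1M byte@backup.example.com::/backups/byte/
```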


bytebot.net gets plain jane html redesign

I figured it was time to redesign bytebot.net, which was essentially stuck in 2004. It’s 2007, so with the power of “badges”, I’ve made the site feel a lot more up to date. I’m feeding photos from my Flickr account, last-played music via Last.fm (I’m not really big on their widget, but I’m trying hard not to have to parse RSS myself), bookmarks via del.icio.us (with comments, so it’s also occasionally blog-like), and micro-blogging via Twitter.

The fact that I can do all this entirely in HTML (with great help from JavaScript and some help from CSS) still amazes me – who needs PHP! For constantly static text, Apache does server-side includes, something I’ll be using in due time, when I fix/template all the local content. I also have to update the talks page (so many Impress files sitting all around).

Anyways, if you’ve got a Twitter account, I’m bytebot there. Who else is twittering?

Dell collaborates with Microsoft/Novell – 2007 is definitely the year of desktop Linux

Last week, Dell was getting in bed with Ubuntu; this week, there’s a Microsoft/Novell deal. The terms are interesting: Dell is purchasing SLES (SUSE Linux Enterprise Server) certificates from Microsoft. Then there’s the existing Dell Linux customer base – this deal is meant to market to them and offer services to migrate them to SLES!

Two platforms of the future – Microsoft Windows and Linux. These are also the two platforms of today. Dell will focus on interoperability workshops and migration proofs of concept, and will provide migration services. I wonder how much of this will hit Dell or Novell partners offering similar services. Watch the video blog on Dell’s blog.

Questions that remain to be answered, and will be interesting to see unfold:

  • Where does this leave Red Hat? Is something interesting going to be announced at the Red Hat Summit happening May 9-11?
  • HP is a big supporter of Linux, with Debian. Where is the synergy with them to move the Ubuntu or even the Novell way?

A lot of folk have predicted, over many years past, that that year would be the year of desktop Linux adoption. I think 2007 is the year of desktop Linux adoption. More importantly, I think I’m right.


