Wednesday, July 4, 2007

Web Browser in C# and VB.NET

There are many questions on discussion forums about creating a web browser or using the existing web browser ActiveX control in C# or VB.NET. This article explains how to add the web browser ActiveX control to your project to develop your own customized web browser.

I wonder why Microsoft didn't add a class to provide browser functionality; if they did, I'm not aware of it. Anyway, in this article we'll use the existing Web Browser control.

Adding Web Browser ActiveX

Create a Windows application, right-click on the toolbox window, and select "Customize Toolbox". On the COM Components tab, you'll see the "Microsoft Web Browser" component; its DLL is "Shdocvw.dll".



Clicking the OK button adds the "Explorer" control to your toolbox. See the toolbox below.



Now drag the "Explorer" control onto your form. The default name of the control is "axWebBrowser1".

Designing GUI

Now I add a toolbar with a few buttons on it. See my toolbar tutorial for how to add toolbar buttons, load images, and write event handlers for them.

Besides the toolbar, I also add a URL text box and a button, and arrange my form so it looks like the figure below.



The Home, Previous, Next, Stop, and Refresh toolbar buttons are self-explanatory and provide the same functionality as any browser. The Go button loads the specified URL in the browser control.

Writing Code

Now I write the code for the Go button's click event and the toolbar buttons. In a moment, you'll see how you can build your own customized browser by writing a few lines of code. The Navigate method of the browser control loads a page in the viewer. The other methods are pretty simple and self-explanatory: GoHome, Stop, Refresh, GoBack, and GoForward.

Source Code: C#

private void button1_Click_1(object sender, System.EventArgs e)
{
    // The last four Navigate() arguments (flags, target frame, post data,
    // headers) are optional COM parameters, passed by reference.
    System.Object nullObject = 0;
    string str = "";
    System.Object nullObjStr = str;
    Cursor.Current = Cursors.WaitCursor;
    axWebBrowser1.Navigate(textBox1.Text,
        ref nullObject, ref nullObjStr, ref nullObjStr, ref nullObjStr);
    Cursor.Current = Cursors.Default;
}

Here is the code for the toolbar button clicks.

private void toolBar1_ButtonClick(object sender, System.Windows.Forms.ToolBarButtonClickEventArgs e)
{
    // tb1..tb5 are the Home, Refresh, Back, Forward and Stop buttons
    if (e.Button == tb1)
    {
        axWebBrowser1.GoHome();
    }
    else if (e.Button == tb2)
    {
        axWebBrowser1.Refresh();
    }
    else if (e.Button == tb3)
    {
        axWebBrowser1.GoBack();
    }
    else if (e.Button == tb4)
    {
        axWebBrowser1.GoForward();
    }
    else if (e.Button == tb5)
    {
        axWebBrowser1.Stop();
    }
}

Source Code: VB.NET

The VB.NET code is simply a conversion of the C# code. Here is the code that navigates to a URL using the Web Browser's Navigate method.

Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click
    Dim nullObject As System.Object = 0
    Dim str As String = ""
    Dim nullObjStr As System.Object = str
    Cursor.Current = Cursors.WaitCursor
    AxWebBrowser1.Navigate("http://www.microsoft.com", nullObject, nullObjStr, nullObjStr, nullObjStr)
    Cursor.Current = Cursors.Default
End Sub

You can call Stop, GoHome, Refresh, GoForward, and the other methods in the same way as in the C# code.
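For completeness, a VB.NET toolbar handler mirroring the C# version might look like the following (a sketch assuming the same five toolbar buttons, tb1 through tb5, and the default control names):

Private Sub ToolBar1_ButtonClick(ByVal sender As System.Object, ByVal e As System.Windows.Forms.ToolBarButtonClickEventArgs) Handles ToolBar1.ButtonClick
    ' tb1..tb5 map to Home, Refresh, Back, Forward and Stop, as in the C# code
    If e.Button Is tb1 Then
        AxWebBrowser1.GoHome()
    ElseIf e.Button Is tb2 Then
        AxWebBrowser1.Refresh()
    ElseIf e.Button Is tb3 Then
        AxWebBrowser1.GoBack()
    ElseIf e.Button Is tb4 Then
        AxWebBrowser1.GoForward()
    ElseIf e.Button Is tb5 Then
        AxWebBrowser1.Stop()
    End If
End Sub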

The Application

The GUI of the program looks like the following image. The Go button navigates to the URL, and the other buttons are pretty self-explanatory.

Chkdsk: CheckDisk Command

Creates and displays a status report for a disk based on the file system. Chkdsk also lists and corrects errors on the disk. Used without parameters, chkdsk displays the status of the disk in the current drive.

Syntax:

chkdsk [volume:][[Path] FileName] [/f] [/v] [/r] [/x] [/i] [/c] [/l[:size]]

Parameters:

volume:
Specifies the drive letter (followed by a colon), mount point, or volume name.

[Path] FileName
Specifies the location and name of a file or set of files that you want chkdsk to check for fragmentation. You can use wildcard characters (that is, * and ?) to specify multiple files.

/f
Fixes errors on the disk. The disk must be locked. If chkdsk cannot lock the drive, a message appears that asks you if you want to check the drive the next time you restart the computer.

/v
Displays the name of each file in every directory as the disk is checked.

/r
Locates bad sectors and recovers readable information. The disk must be locked.

/x
Use with NTFS only. Forces the volume to dismount first, if necessary. All open handles to the drive are invalidated. /x also includes the functionality of /f.

/i
Use with NTFS only. Performs a less vigorous check of index entries, reducing the amount of time needed to run chkdsk.

/c
Use with NTFS only. Skips the checking of cycles within the folder structure, reducing the amount of time needed to run chkdsk.

/l[:size]
Use with NTFS only. Changes the log file size to the size you type. If you omit the size parameter, /l displays the current size.

/?
Displays help at the command prompt.
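For example, here are a few typical invocations (the drive letters are just examples):

chkdsk
chkdsk c: /f
chkdsk d: /r

The first reports the status of the current drive without making changes, the second fixes errors on drive C (scheduling a check at the next restart if C is in use), and the third locates bad sectors on drive D and recovers readable information (/r implies /f).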
Remarks
• Running chkdsk
To run chkdsk on a fixed disk, you must be a member of the Administrators group.

• Checking a locked drive at restart
If you want chkdsk to correct disk errors, you cannot have open files on the drive. If files are open, the following error message appears:

Chkdsk cannot run because the volume is in use by another process. Would you like to schedule this volume to be checked the next time the system restarts? (Y/N)

If you choose to check the drive the next time you restart the computer, chkdsk checks the drive and corrects errors automatically when you restart the computer. If the drive partition is a boot partition, chkdsk automatically restarts the computer after it checks the drive.

• Reporting disk errors
Chkdsk examines disk space and disk use for the file allocation table (FAT) and NTFS file systems. Chkdsk provides information specific to each file system in a status report. The status report shows errors found in the file system. If you run chkdsk without the /f command-line option on an active partition, it might report spurious errors because it cannot lock the drive. You should use chkdsk occasionally on each disk to check for errors.

• Fixing disk errors
Chkdsk corrects disk errors only if you specify the /f command-line option. Chkdsk must be able to lock the drive to correct errors. Because repairs usually change a disk's file allocation table and sometimes cause a loss of data, chkdsk sends a confirmation message similar to the following:

10 lost allocation units found in 3 chains.
Convert lost chains to files?

If you press Y, Windows saves each lost chain in the root directory as a file with a name in the format Filennnn.chk. When chkdsk finishes, you can check these files to see if they contain any data you need. If you press N, Windows fixes the disk, but it does not save the contents of the lost allocation units.

If you do not use the /f command-line option, chkdsk sends a message if a file needs to be fixed, but it does not fix any errors.

If you use chkdsk /f on a very large disk (for example, 70 gigabytes) or a disk with a very large number of files (for example, millions of files), chkdsk might take a long time (for example, over several days) to complete. The computer is not available during this time because chkdsk does not relinquish control until it is finished.
• Checking a FAT disk
Windows displays chkdsk status reports for a FAT disk in the following format:
Volume Serial Number is B1AF-AFBF
72214528 bytes total disk space
73728 bytes in 3 hidden files
30720 bytes in 12 directories
11493376 bytes in 386 user files
61440 bytes in bad sectors
60555264 bytes available on disk
2048 bytes in each allocation unit
35261 total allocation units on disk
29568 available allocation units on disk
• Checking an NTFS disk
Windows displays chkdsk status reports for an NTFS disk in the following format:
The type of the file system is NTFS.
CHKDSK is verifying files...
File verification completed.
CHKDSK is verifying indexes...
Index verification completed.
CHKDSK is verifying security descriptors...
Security descriptor verification completed.
12372 kilobytes total disk space.
3 kilobytes in 1 user files.
2 kilobytes in 1 indexes.
4217 kilobytes in use by the system.
8150 kilobytes available on disk.
512 bytes in each allocation unit.
24745 total allocation units on disk.
16301 allocation units available on disk.

Although I don't intend to keep daily journals of my trials of various Linux distributions this summer, I will chronicle the first day I spend with each distro. The first day with any new operating system (or variant) is when first impressions are formed and, ultimately, judgment is made (I know, I know, one is not supposed to judge anything at first glance, but who doesn't form a bias for or against something after a first encounter?). And so, after a slight delay, I start my adventure into the wide (wild?) world of Linux distributions. First stop: Fedora 7.

The Install:

Installing Fedora was very straightforward. After choosing my default language and keyboard layout, I was met with some partitioning options. Opting for a "custom setup", I found that the partitioner the Fedora installer provides leaves little to be desired for a basic install. I was able to select which disk partitions I wanted to use, which of these I wanted to format, and where I wanted each partition to be mounted. I chose to reuse my home partition from my Ubuntu install, and everything appeared to work well.

During the install process I was also able to choose whether or not to install a boot loader. I chose yes, and was presented with options for adding other distros to the boot menu. By default, it detected Windows on my first hard drive but failed to notice Ubuntu. I added the root partition where Ubuntu was installed to the list, but upon boot, I did not see an entry for Ubuntu in the GRUB menu. This was not a huge problem, as I was easily able to manually edit the GRUB menu.lst file and add an entry for Ubuntu (see the sketch below). For first-timers to Linux, the most important point is that the installer detected Windows and allowed for an easy dual-boot setup.
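For reference, a menu.lst entry for Ubuntu looks something like the following (a sketch; the disk/partition numbers, kernel version, and root device are assumptions that vary from system to system):

title Ubuntu
root (hd0,1)
kernel /boot/vmlinuz-2.6.20-15-generic root=/dev/sda2 ro quiet splash
initrd /boot/initrd.img-2.6.20-15-generic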

As with all installs, I was asked which timezone I was in, after which I was asked to set the root password.

Moving on, I was offered the chance to customize my package selection. Choosing to do so, I was able to select or de-select large package groups, such as games, office productivity, and editors. This step also presented me with a choice of which desktop environment to install. I generally like to see a more detailed and customizable approach to package selection, such as openSUSE and other distros provide.

EDIT: Upon reviewing the installation process in a VMware virtual machine, I noticed that one can in fact choose exactly which packages to install. This can be done by clicking on "Optional Packages."

Including configuration, Fedora 7 took a little over half an hour to install.

Overall, the installer was very simple to use but also surprisingly powerful. Instructions were always readily available, and one could read the release notes at any time.

Initial Boot:

As I mentioned before, the installer did not manage to add Ubuntu to the GRUB menu; however, I was able to load Fedora without any problems.

While the OS was loading, I noticed that my display was misaligned: a good two or so inches ran off the edge of the screen, and adjusting my monitor did not help. Apparently my resolution had not been detected, and the nVidia drivers were not installed.

The next problem came when startup tried to activate my network connection, which it assumed was a wired Ethernet connection. It took forever to realize that it just wasn't going to get IP information from a nonexistent connection, and finally it just [FAILED].

The setup following installation held no surprises. I was asked if I wanted to configure a firewall and whether I wanted to enforce SELinux. After this I was asked to set the date and time. Next came a screen outlining my hardware profile, which I was asked to send in to Fedora to help with development. Since my Internet connection did not work at that point, I chose not to send the information. Then came user creation and finally a test of my sound card (it worked).

On attempting to log in, I was presented with a wonderful error saying that I didn't have permissions to my own home directory. This did not let me log in, and it even made X crash. Interesting error, considering I had just installed the operating system. I messed with a few permissions, but nothing worked. Then... it dawned on me: I shared this home partition with my Ubuntu install, and I have the same user name on both. So the installer had created a new "linnerd40" folder in the home partition over the existing "linnerd40" folder from my Ubuntu install, yet the folder was still only accessible to Ubuntu. Great. Since time was running rather short, I decided to go for another install, this time just letting the root and home partitions be the same (not the way I like to set things up). This worked.

Before going any further, I added Ubuntu to the GRUB menu.lst file so that in case of an emergency I had at least one stable operating system to boot into. I rebooted and tested going into Ubuntu. Everything worked, until login: I received the same error as when I tried to log into Fedora. Apparently, when tampering with the permissions in Fedora, I had broken access to my own home folder in Ubuntu. I messed with some more permissions and ended up fixing the problem (with some help from the Internets) using the following commands:

sudo chown -R linnerd40 /home/
sudo chmod 700 /home/

Yay for the command line! Long story short, Ubuntu and Fedora now work.

First Impressions:

After a successful login, I was greeted by a fairly decent looking Gnome desktop. The new "Flying High" theme is not going to be winning any awards, but it appeals more to me than Ubuntu's "Human" theme. First on my list of problems to fix was the screen resolution. After pulling the latest copy of nVidia's Linux driver from my flash drive, I killed X and went into run level three (run: /sbin/init 3) to install the driver. However, installation failed when it detected that gcc-devel was not installed. So, I got back into X and searched for an application for installing packages. I found an "Add or Remove Programs" entry in one of the menus and tried that. However, it gave me an error saying that package information could not be retrieved due to the lack of a network connection. I popped in the Fedora 7 DVD and tried installing packages from there. I found the .rpm file I needed in the FEDORA directory on the DVD, but upon trying to open the file to install it, I received the same error. This was extremely aggravating, as installing from a .rpm file present on my hard drive (I copied it from the DVD) should not require a network connection. So, I went with the command line method of:

rpm -ivh package.rpm
This worked... but immediately I found myself in dependency hell. To install gcc, I needed glibc, but I also needed glibc-devel, which needed glibc-headers, which needed the kernel-devel package. Perhaps that wasn't quite the order, but needless to say, I was searching for and installing packages for quite a while. RPM dependency hell was why I stopped using SUSE. Apt is a much more efficient method of package management, and I don't see why a distro wouldn't use it.

EDIT: Upon reevaluation of Fedora 7 in my VM (with a working Internet connection), I see that some of what I said above is unjust. Yes, RPMs do have a tendency to lead to dependency hell, as I experienced with SUSE and previous versions of Fedora. However, yum (the package manager used in Fedora) handles dependencies quite well, much better than I had remembered. A simple:

yum install gcc
fixed my problems. Still, I prefer apt/Debian-style package management over RPM any day.

After going through hell to get all the packages I needed, I was finally able to install the nVidia driver. I then set my screen resolution using the nvidia-xconfig tool and was well on my way to a more pleasant desktop experience.
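For anyone repeating this, the rough sequence was as follows (a sketch; the installer filename depends on the driver version downloaded):

/sbin/init 3                      # leave X for run level 3
sh NVIDIA-Linux-x86-*.run         # run the downloaded nVidia installer
nvidia-xconfig                    # write the driver into the X configuration
/sbin/init 5                      # return to the graphical login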

The next problem I wanted to tackle was wireless support. Although my card was detected (rt2500 chipset), it was impossible to configure it correctly. Using this guide I was able to get very close to success, but I continued to get errors when trying to activate the device. As of yet, I have not found a fix.

So far... :

So far, my experience with Fedora has been less than enjoyable. However, I hope that after spending a week with Fedora, I will change my mind. It seems like a very stable and well-thought-out distribution. The default package selection is excellent: Firefox for web browsing, GIMP for image manipulation, Pidgin for instant messaging, Rhythmbox for multimedia playback, and many other stable choices to fulfill the daily needs of the average computer user. The Fedora team has made a great effort to provide a usable, friendly installer while allowing for advanced configuration, and has done so superbly. Back when I first started with Linux, Fedora Core 4 was one of the first distros I tried to install. I had to give up on it since my wireless card was not detected, and at the time I did not know how to fix such problems. Fedora has definitely evolved since Core 4, and I am certain that once I get my wireless card working I will be able to see its true power.

More On Fedora 7: Wireless Woes and Second Opinions

After my second day using Fedora 7, I believe that enough of my opinions have changed to warrant a second post about the distro. Let's jump right in:

Wireless Woes:

Still no wireless Internet. This is becoming a rather vexing problem, as I have yet to find a solution to what may be the biggest problem I am experiencing with Fedora 7. After trying a multitude of drivers, both from the rt2x00 project (rt2x00.serialmonkey.com) and the official Ralink Linux drivers, I have yet to come upon a driver that works (some don't even compile) and is properly recognized. A quick Google search for "rt2500 fedora 7" shows that I am not the only one with this problem. The guide on the "Life With Linux" blog looked very promising; however, when I try to activate the wireless device I get the following error:

rt2500 device wlan0 does not seem to be present, delaying initialization


This error just won't go away, and seeing as I cannot accomplish much without a working Internet connection, I have had to resort to "Plan B" for now:

A New Testbed:

Since Fedora 7 stubbornly refuses to allow configuration and activation of my wireless card, I have gone ahead and installed Fedora 7 in a virtual machine using VMware Workstation. In doing so, I now have a working Internet connection. Until I get my wireless issues worked out on the physical install, I will be using the virtual machine from my Ubuntu install. Hopefully, this way I can review Fedora 7 more fairly.

Package Management:

My last post received a number of comments criticizing my criticism of the RPM package management method. I must admit that bad experiences with SUSE and RPM in the past have made for my biased view against RPM. My comments on the system were perhaps not fully justified, as I had yet to truly experiment with Fedora's "yum" system. This system, as a reader pointed out, is pretty much apt-get for RPM. After some experimenting in my virtual machine, I must say that yum is doing an excellent job of managing dependencies. However, I have yet to try to install applications I randomly grab from the Internet (these were the ones that often threw the weirdest dependencies at me in SUSE).

Another aspect of package management that many people fail to consider is the repository. Repositories are where your packages come from, so to speak. They are places where people have created huge compilations of applications, and (hopefully) their dependencies, for you (the user) to choose from (think of them as apple trees, and the packages are the apples). A good repository means a pleasant experience finding and installing packages. Ubuntu has a wealth of excellent repositories which encompass nearly every package available for the distribution. Rarely must I go out and find a dependency for a package I want to install. Say I am compiling from source and I need a specific library to compile the package properly; I have always been able to simply apt-get the library instead of having to search for it and compile it from source, as shown below. I am hoping that Fedora will be the same way.
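For example, grabbing a development library in Ubuntu is a one-liner (the package name here is just an illustration):

sudo apt-get install libsdl1.2-dev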

Tomorrow I will begin the journey to find the best repositories for Fedora 7. When I have found these, I will proceed to test the RPM system and uncover the true power of yum. I am hoping that I will end the week with a more informed opinion of RPM and Fedora 7.

One More Annoyance:

One truly annoying error I keep getting when using my physical install of Fedora is the inability to use a GUI for installing RPMs, even straight off the Fedora DVD. The error is apparently linked to my nonexistent Internet/network connection.



I receive this error even when selecting an RPM that I have right in front of me, as in... on the Fedora DVD. Perhaps this again is a case of a poorly configured repository (maybe it doesn't realize that the DVD is there to be used). I will see about fixing this tomorrow if I can find where the repositories are configured in Fedora (something like apt's sources.list file?). Still, one would think that such a situation would be accounted for automatically.
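(It turns out yum's equivalent of apt's sources.list is the /etc/yum.repos.d/ directory, plus /etc/yum.conf, where each repository gets its own .repo file. A typical entry looks roughly like this; the URL here is illustrative:

[livna]
name=Livna for Fedora Core $releasever - $basearch
baseurl=http://rpm.livna.org/fedora/$releasever/$basearch/
enabled=1
gpgcheck=1
)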

What I'm Liking:

Fedora feels... nice. Not sure how to describe it, but it feels elegant. Not overdone, but with noticeable attention paid to detail. Fonts are clear and crisp, colors are appealing to the senses, and even the "Flying High" / bluish theme is really growing on me (I have, however, changed the desktop wallpaper). Also, performance is noticeably snappier than Ubuntu. Applications open quickly and respond smoothly and instantaneously. Windows being dragged around update their position at once, leaving no trail behind them (a problem I have recently been experiencing in Ubuntu). Overall, the distro's look and feel is very professional but light enough to fit in any environment.

More on Fedora 7 in later posts!

Yay for yum and yumex!

It seems as though my postings on Fedora 7 have become a daily occurrence now. There is much to say, and the more time I spend with Fedora 7, the more I like it!

I believe my bias against RPM is beginning to leave me, and I am beginning to see that RPM is a very viable package management system. The reason for this sudden support of RPM is yum. Yum is awesome. I could leave it at that... or continue. Let's continue, with a bit of history to start things off.

Part of the reason I like the Debian method of package management is apt. Apt makes installing and updating so incredibly easy; I never have to worry about dependencies or anything of the sort. I just "sudo apt-get install package" and it's done.

When I was working with SUSE back in the 9.0/10.0/10.1 days, YaST was the only viable method I had for installing packages. Needless to say, it often didn't work out too well. Its lack of support for GPG keys at the time and rather poor mirror/repository management made finding myself in dependency hell a commonplace occurrence. After moving to Ubuntu, I didn't think I'd ever try RPM-style package management again. Until now.

Like I said, yum is awesome. Yum and Fedora 7 have really made me reconsider RPM-based distros. Not only is yum extremely easy to use, but it also handles dependencies excellently. This probably has to do with the repositories too, but so far I have not come across a package that I couldn't install due to dependency conflicts. The Fedora 7 package installer is also excellent, although a better application for managing your packages is Yum Extender:

Yum Extender, or yumex for short, is a great extension to yum. Just as Synaptic is a GUI for apt, Yum Extender is a GUI for yum. It is a very powerful GUI which lets you easily select which repositories to use (and not to use); install, update, and remove packages from lists of available packages; and quickly search through all packages. If you don't feel quite confident with CLI yum, but want more features than the standard Fedora package installer, yumex is the answer.
Installing yumex is just about as easy as managing packages with it! Simply yum it:

su
yum install yumex

Although a simple screenshot doesn't do it justice, here is a hint of what yumex has to offer:



UPDATE: There is one downside of yumex that I failed to mention before. The fact is, yumex is slow. It just will not deliver top-notch performance. This is perhaps its only downside, but one with fairly major implications if you want instant gratification. Still, yumex is an excellent GUI for Linux newcomers and is great for looking up that occasional obscure package or getting information about available updates.

As for repositories, I have found rpm.livna.org to be excellent. Anything that isn't included in the default Fedora repositories can be found here. That means that through Livna you can find packages enabling MP3 and DVD playback, along with the new NTFS driver for read/write support of your NTFS/FAT32 disks (a HOW-TO on enabling these features in a later post).

I'm liking Fedora 7 more and more now that I have it fully working in my VM, and I continue to customize it. Although my wireless problems remain unresolved in my physical install, I must say that I could have done more research on the topic before installing. My bad, I guess, although full wireless support right out of the box would have been nice :)

Fedora is shaping up to be an ever more excellent distro. I would definitely recommend it so far, although perhaps not to complete beginners with Linux, as there is still a bit of tweaking that goes into getting everything just right. But, as far as that goes... there is nothing that can't be fixed with community help :-)

Open-Source R500 Driver Released


The very first (and very rudimentary) open-source Xorg driver for the ATI Radeon X1000 "R500" series has been released! Be aware before downloading it that this driver only contains code to initialize and set video modes on the Radeon X1300 to X1600 graphics cards. RandR 1.2 support for the R500 driver is being worked on and may surface shortly. The current road-map is to get the Radeon X1600 to X1900 series to initialize using this driver, add the RandR 1.2 support, add simple 2D acceleration, work on R500 3D reverse engineering, and implement TTM DRM for memory management. Today's first open-source driver release for the R500 series is available through git on FreeDesktop.org. As this driver progresses we will provide additional information and ultimately benchmarks. The release announcement can be read on the Xorg list.
Great to hear that something is being done about the horrible state of ATI Linux drivers! Although the driver won't be bringing you the latest and greatest 3D acceleration, this is a very important step towards full ATI card support in Linux. As it stands, cards from the X1300 series up to the X1600 work:
The code released today is able to initialise and set video modes on rv515 and rv530 (X1300 up to X1600); we still lack proper initialisation for r520 & r580 (X1800 and above, some X1600) because of lack of time and hardware.
On the roadmap:
  • Find out missing bits for r520 and r580 hardware initialisation
  • RandR 1.2 support with a dumb memory allocator
  • Simple 2D acceleration (we will put more focus on 3D acceleration as now Xorg provides infrastructure to best utilise 3D drivers to display the desktop, thanks to the Glucose interface)
  • 3D reverse engineering: We believe that this engine is very similar to the r300 3D engine which has already mostly been reverse engineered
  • TTM DRM driver for proper memory management
  • and likely port the driver to new DRM modesetting work.
Sounds good! I can't wait until ATI cards are once again viable options when using Linux. I had been eyeing the X1650 PRO for a while, as it often delivers superior performance to nVidia's 7900GS. However, I guess I'll wait for an 8800GTS :-)

Once again, read the official release announcement on the Xorg list.

So Close, Yet So Far...

Today I once again tried to get my wireless card working in Fedora 7. Still no success, but I believe that I am very near to a solution.
The Linux drivers I was using for the card just weren't working... so why not try the Windows drivers? Using ndiswrapper, I successfully installed the Windows drivers for my wireless card, which I got off the driver CD. This was actually extremely simple. After installing ndiswrapper, I found the necessary .inf and .sys files on the Windows driver CD. To get the driver installed, I merely issued the following command in the directory containing the .inf and .sys files:

# /usr/sbin/ndiswrapper -i rt2500.inf
After that, I ran

# /sbin/modprobe ndiswrapper

to make sure that the driver was loaded. After this I opened up the /etc/modprobe.conf file and added the new line:

alias ra0 ndiswrapper
I then proceeded to configure the card through the Network configuration tool. The card was properly recognized as ra0. After configuring the card, I hit activate and crossed my fingers...
Well, it failed. BUT, this time it didn't give me the error saying that the card wasn't present; it just wasn't able to retrieve any IP information.
I am really hoping that this has brought me closer to solving my problem (which I think it has), but it has also brought me to a sort of roadblock. It seems as though I have everything configured properly; apparently the card is detected and it is configurable. So what is missing? What is going wrong? Here are some screenshots of my current situation:




If anybody has any help to offer, I would appreciate it :-) See my thread @ the Fedora Forums.

Enable Complete Media Playback in Fedora 7

As we all know, Fedora 7 ships without support for playing MP3s, DVDs, and many other media types that we are exposed to every day. The default repositories don't offer much help with this problem, but luckily it is an easy one to fix.

First, we must add the Livna repository. This can be done through the following command issued as root:

rpm -Uhv http://rpm.livna.org/livna-release-VERSION.rpm
The Livna repository provides an excellent array of packages to satisfy almost all your needs.

To install all the packages necessary to enable MP3, DVD, and other media playback, issue the following command:

yum -y install totem-xine totem-xine-plparser rhythmbox mplayerplug-in mplayer mplayer-gui xine-lib-extras-nonfree libdvdcss libdvdread libdvdplay libdvdnav lsdvd libdvbpsi compat-libstdc++-33
This method was found through an excellent guide on the Fedora Forums. PLEASE READ THROUGH THIS GUIDE. I could reproduce it here, but it would simply be a waste of time as it works splendidly as it is, and will answer all your questions. Check it out to satisfy all your media cravings!

Fedora 7: A Final Look

The time has come to say goodbye to Fedora 7. Over a week has gone by since I installed the OS on my hard drive and, later, on a virtual machine. Let's take a short look and see just how Fedora 7 fares as a desktop distribution.

Installation:

Installation of Fedora 7 was a very nice experience. The installer is simple enough for almost anyone to use, but still provides enough power for even advanced users to be satisfied. Although the partitioner offered in the installer is not quite as "pretty" as the one Ubuntu offers, it does have a few more advanced features and certainly does its job very well.

Unlike the Ubuntu installer, the Fedora 7 installer allows for customized package selection. This is a very important feature considering that after installation, if an Internet connection isn't present, software installation (through the package manager) is not possible. In my opinion, this option for customization alone puts the Fedora installer above Ubuntu's.

Overall, installation is a simple procedure that shouldn't take much longer than about 45 minutes, though that depends on your package selection.

Hardware Detection:

I never managed to get my RT2500-based wireless card working in Fedora 7. I tried nearly every driver available and still did not get a connection. The card was always detected, but I was never able to activate the device. I know that, out of the box, it is a known bug that Fedora 7 will not allow activation or proper configuration of rt2500-based cards. However, it surprised me that none of the drivers I tried worked... I'm not really sure if it was something I was doing wrong or just stubbornness on Fedora's part. In any case, I am sure the issue will be resolved soon (hopefully in a future update).

Aside from my wireless card not working, Fedora 7 properly recognized all my hardware without any problems. Still, Ubuntu recognized all my hardware, including my wireless card, without flaw, and I didn't have to do any tweaking to get it to work (I just had to fill in my network information under the Network Manager to get a working Internet connection, right out of the box). Wireless support is essential for me, so I have to hand it to Ubuntu for giving me the best experience in this category.

Installation of the nVidia driver is incredibly easy on both distros, although Ubuntu has a slight upper hand with its "Restricted Drivers Manager". Fedora 7 actually works best with a custom nVidia driver from the Livna repository (follow the link for more information).

Since reviewing a distribution without an internet connection is rather pointless, I went ahead and installed Fedora 7 on a virtual machine through VMware Workstation. All my "virtual" hardware was detected, and I finally had a working Internet connection.

Look and Feel:

Out of the box, Fedora 7 looks much, much better than Ubuntu. The "Flying High" theme is elegant and very appealing, unlike Ubuntu's dreadful "Human" theme. Both the KDE and Gnome desktop environments are available through the installer, and either one can be easily installed after the other. Fonts too look excellent.
For the greater part of the week, I have been using KDE as my primary desktop environment. KDE is great because it allows me to use my beloved SuperKaramba app for awesome desktop widgets! I never really like the default KDE look in any distro, and Fedora 7 is no different, so I made ample customizations to suit my taste.
As with any Linux distribution, customization is endless, so if you don't like something... CHANGE IT!

Package Management:

Before I used Fedora 7, I had a downright horrible opinion of RPM-style package management, mainly attributable to horrible experiences with SUSE. But after spending just a week with Fedora 7 and yum, my opinion has made a full turn in the other direction. Yum, together with Yumex and the Livna repository, made installation of packages incredibly simple. Never once did I experience RPM hell, even when installing rather obscure or random apps from the Internet. I really must say that Fedora 7, contrary to my initial beliefs, has proved to be excellent at managing packages.

General Thoughts:

Working with Fedora 7 has been a great experience, rivaling that of Ubuntu. However, although this is an excellent distribution, I feel that there is nothing really special about Fedora. There isn't much that sets it apart from other distributions. It isn't really hard to set up, but it isn't quite as easy as Ubuntu, and once it's set up, there's not much to do that I couldn't do with other distributions. Perhaps I have not dug deep enough into Fedora 7, or I may just not have enough know-how to tell when something is spectacular in a subtle way, so I may very well be wrong. Perhaps Fedora 7 shines in areas other than the desktop (maybe it's great for servers, or for corporate solutions), which I was not able to explore. Then again, maybe Fedora 7 is just a great blank slate for you to build an ultimate desktop install, just as you see fit, free from any obstructions. If you have anything to share about what makes Fedora 7 great for you, by all means do so (just comment)!

Recommended?

Sure, why not. Really, there is no reason that you shouldn't use Fedora 7, although there really isn't any reason you should. Setup is easy enough, and all packages are up to date, if not quite bleeding edge. Still, I really do urge you to give Fedora 7 a try, as I believe it holds great potential.

Rating:

Let's say I had to give Fedora 7 a rating in the form of a number from 1 to 10 (1 being the lowest, 10 the highest). I would have to say that Fedora 7 is a 7. The only reasons it lost points were that my wireless card, although detected, could not be configured or activated (which may very well be different for other people) and that the distro lacked that special "something" that would make it really stand out. *Keep in mind this score is very subjective and only reflects what I feel after using the distro for a week.*

Here's a quick screen shot of my final Fedora 7 desktop:

Gutsy Feature Plan

Now that the set of feature goals planned for Ubuntu 7.10 ("Gutsy
Gibbon") has been largely finalised, it seems like an appropriate point
to announce the plan to the world.

While this is based on the approved blueprints for gutsy[0], which are
expected to be implemented in time, we do release according to a
time-based schedule[1] rather than a feature-based one. It is not
unusual for some planned features to be delayed to later releases;
happily it is also not unusual for our developers to introduce neat
features we weren't expecting either.

-> https://blueprints.launchpad.net/ubuntu/gutsy/
-> https://wiki.ubuntu.com/GutsyReleaseSchedule
This is shaping up to be another great release! The best features so far seem to be Xorg 7.3 and the newly merged Compiz and Beryl projects (compcomm/OpenCompositing) as the default window manager. Regardless of what gets done, there are some really good ideas in this post, so read up!

Read more @ the Ubuntu mailing list (see links above for even more info).

Amarok 1.4.6 Released!

Simply put, Amarok is the best media player available for Linux. Its team of developers has put much work into the latest release, 1.4.6, which is now available for download! From the release announcement on the Amarok website:
Your very own Amarok team announces the immediate availability of the latest 1.4 series release, 1.4.6.
So, what's new?
  • Funky new icon set, featuring KDE4 Oxygen colors by Landy DeField; for 2.0 he will be working to ensure that Amarok has a complete Oxygen icon set.
  • Default database backend is a lot faster due to a new SQLite version.
  • A gigantic load of bug fixes, the main focus of this release.
  • Introducing rockbox support for iPod.
  • Performance tuning.
  • More wockas per square inch.
  • A miracle in software engineering - we added less people to an early software project and made it later, disproving the Mythical Man-Month.
  • Packaged with FUKITOL.
Looks like another superb release with many improvements on an already magnificent piece of open source software!

Read more @ the Amarok website. Downloads for multiple distributions can be found here.

Let's all give the Amarok team a big hand for creating one of the best media players in the world! ::claps::

Google Desktop for Linux!

Google has finally released a long-awaited native Linux application: Google Desktop for Linux. As with the already shipping OS X and Windows versions, Google Desktop enables Linux users to search for text inside documents, local email messages, their Web history, and their Gmail accounts.

This first beta version doesn't offer the sidebar and gadgets, which are found in other versions of the application. Those will come later, according to a Google representative, who stated, "We focused most of our efforts on desktop search. Gadgets and sidebar are not supported, but will probably be added in the future."
Well, no pretty sidebar with widgets yet, but that is soon to come! It is great to know that Google is making such an effort to bring their excellent products to all platforms! Download Google Desktop for Linux at the Google Desktop download page. Read more @ DesktopLinux.com.

Security Concerns In Linux

Part of the reason I am switching to Linux is that, from what I have been told, it is superior to Microsoft Windows in the area of security. I think the reason is at least twofold.

1. People are out to get Microsoft, i.e., with trojans, viruses, and spyware. Of course, considering the effects of micro-evolution, this only means one thing for Microsoft: it WILL become a better OS. It is inevitable; if Microsoft wants to continue to be a viable, secure OS for home and especially business use, Windows will have to continue to improve (evolve) or it will fail. Failure does not make money; therefore, Microsoft will spend money to make a better piece of software, bottom line. Moreover, the reverse implication for Linux is true. People are NOT out to get Linux. There are no trojans, viruses, or spyware to speak of in the Linux world.

2. The second reason I believe Linux is superior to Microsoft Windows in the area of security comes down to two things. First, there are so many distributions available that it is difficult for someone with malicious intent to target a large populace, because the user base is spread over different types of Linux OSs. Second, Linux is open source. You would have to have many (many) people involved, from different backgrounds, cultures, values, countries, and languages, to "hide" a security hole in Linux. Even the most paranoid conspiracy theorist would have a hard time developing a theory about "those behind the Linux MACHINE".

As I contemplated these strengths in Linux, I realized something: these strengths are due to the environment in which Linux exists and not something necessarily inherent in the actual operating system itself. In other words, if the situation were reversed, if Linux were the major operating system everyone was after, would it stand up to malicious users as well as Microsoft Windows? I think this is a question worth a serious answer. It is a question on which I cannot even venture a guess, since I am still brand new to Linux. (Anyone... Anyone... Bueller... Bueller...)

Some would argue that the built-in firewall should be examined when comparing any Linux distribution to Microsoft Windows. I would agree that the default firewall in Microsoft Windows is a poor excuse for a firewall compared to IPTables, BUT third-party firewalls, from what I see, are BETTER than IPTables. Here is why.

Application Control.

I am a user of Outpost Firewall. It is a third-party software firewall developed specifically for Microsoft Windows. I am not here to pitch this software, but I believe in it; that's why I bought it. When you install Outpost, there are a few ways you can set it up, and I configured it in the most paranoid way possible. ;P This piece of software monitors ALL the network activity coming from my computer, and it allows NOTHING to access even my router unless I say OK. There are automatic settings, but I configure everything manually. I can even block Outpost itself from accessing the Internet (which does not affect its operation except for updates). I keep Windows XP Pro locked down pretty tight. SVCHOST does not report back to Microsoft because I locked it down to only talk to my router and deal with my DNS. (BTW, if you didn't know, Microsoft has been taking "anonymous" stats from your computer since your first installation of XP.)

All that said, I want application control on Linux. I will be honest; I do not trust anyone I do not know personally. I like Ubuntu, and from what I can tell the organization is an honorable group. However, I do not know the internal workings of the company, and because of my lack of knowledge, I would prefer to have a little MORE knowledge of what my OS is doing: when it accesses the Internet, why, how, the duration of the contact, and so on and so forth.

I am still learning what IPTables can do. Perhaps packet filtering in the hands of a knowledgeable person would put my application-control-based firewall to shame, but I don't know. I like that I can watch what my computer is doing through Outpost. Honestly, Outpost is the ONLY reason I still use my Windows partition. (Well, that and the multitude of games I have.) Maybe someone who reads this article could point me in the right direction. I have read up on IPTables to a degree, and tried Firestarter and Guarddog, but in the end uninstalled them. I'm happy behind my stealthy Linksys router without any firewall configured, for now.
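From what I have read so far, iptables filters packets rather than applications, so it cannot fully replicate Outpost-style per-application prompts; but a minimal deny-by-default outbound policy with logging might look like this (a sketch; run as root, and the allowed ports are assumptions to adjust):

iptables -A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A OUTPUT -p udp --dport 53 -j ACCEPT                      # DNS
iptables -A OUTPUT -p tcp --dport 80 -j ACCEPT                      # HTTP
iptables -A OUTPUT -m state --state NEW -j LOG --log-prefix "OUT: " # log anything else
iptables -A OUTPUT -j DROP                                          # then drop it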

Malawian teenage Windmill maker dominates TED Talks


Ory Okolloh refers to William Kamkwamba at TEDGlobal 2007

At TEDGlobal 2007, participants were privileged to listen to great speakers give their well-prepared talks in Arusha, Tanzania. And indeed, you'll agree with me when the videos get uploaded on the web that they were great talks.

The sessions included both 18-minute talks and short 3-minute presentations by a diverse group of speakers, including business leaders, artists, activists, engineers, inventors, and musicians.

Among these was a 3-minute question-and-answer session where the curator, Chris Anderson, asked William Kamkwamba about the windmill he created for his home in Kasungu, Malawi at the age of 14. Through the questions, with photos on the slides, William told his story, which made people shed tears and later give him a big round of applause and a standing ovation.

In the sessions that followed, William's story became the most cited talk of the conference. Speakers like Ory Okolloh of Kenyan Pundit, journalist Dele Olojede, and Noah A. Samara of WorldSpace referred to William Kamkwamba's tale of invention in their sessions.

The picture above shows Ory Okolloh, who included a photo of William's Q&A session with Chris Anderson from the previous day in her slides.

Look out for the TED Talk videos from TEDGlobal 2007 when they're uploaded on TED's website.

Are we ready for shockbots?

Last week, iRobot - maker of the Roomba robot vacuum cleaner - announced that it was working with stungun maker Taser to mount the latter’s controversial 'neuromuscular incapacitation' weapons on iRobot’s military droid: the PackBot.

The PackBot is already used in Iraq and Afghanistan to defuse roadside bombs, and recently these robots have been fitted with lethal weapons like machine guns and shotguns. But, until now, weaponised robots have been for military use only. The iRobot/Taser collaboration changes this, as it is geared towards making robots capable of shocking people available to law enforcement as well as the military.

Taser says the technology will let officers use a robot to 'engage, incapacitate, and control dangerous suspects without exposing those personnel, the suspect, or bystanders to unnecessary risks.'

But I spoke to Neil Davison, head of non-lethal weapons research in the peace studies department at Bradford University in the UK, and he sees some potential risks.

'The victim would have to receive shocks for longer, or repeatedly, to give police time to reach the scene and restrain them, which carries greater risk to their health,' he said. 'All you are really doing is further removing the process of human interaction.'

Then there's the possibility that such robots could someday be autonomous, deciding for themselves whether a target represents a threat. It might seem far-fetched, but it's something iRobot has in mind, and it's a possibility that has some researchers worried.

"If someone is severely punished by an autonomous robot, who are you going to take to the tribunal? The robot won't talk," says Steve Wright of Leeds Metropolitan University in the UK. "

I'm wondering what kind of smart anti-shockbot technologies the likes of G8 protestors might come up with. Any ideas?

Real-world transformers

The new Transformers movie is science fiction at its most fantastic, featuring robots that shift shape as they fight, transforming, for example, from an aircraft into a ground vehicle.

Meanwhile, the Pentagon has set its sights on developing machines with similar capabilities, although they are clearly still a few decades away from matching Optimus Prime.

The military already has uncrewed air vehicles (UAVs) like the Predator, and uncrewed ground vehicles (UGVs) such as the Talon, both of which have been used extensively in Iraq. Now they'd like something that can combine the abilities of both.

This document describes a contract for "Transforming [aerial] vehicles that land, transforming into UGVs capable of inspecting caves and/or buildings to find people" awarded to Thorpe Seeop, Arizona, US, by the US Army (scroll down to "Small Scale Unmanned Air Vehicle").

The company's starting point was its Spinwing UAV, an unusual craft that transforms mid-flight from an airplane into a helicopter.

This idea was then taken further by Brian Yamauchi of iRobot (the company behind the widely used ground vehicle called PackBot). Yamauchi came up with Griffon, a hybrid designed to "combine the speed and range of a UAV with the precise ground mobility of a UGV."

Griffon has a powered parafoil wing that attaches to the PackBot chassis. Software enhancements include "semi-autonomous launch and landing software [that] will assist the operator in transitioning from ground to air modes and back."

The Griffon prototype took off with an 11-metre parafoil wing and, although it could hardly be described as graceful, the concept proved viable as this video shows. Things did not always go quite so smoothly, however, as another clip reveals. Oops.

Interestingly, the US Air Force Research Laboratory has its own plans. Fred Davis of the Assessment and Demonstrations Division recently told me that they are looking at small UAVs that can morph or otherwise change their shape, and switch between different modes. These might include flying, perching, and hopping or crawling. This would extend their endurance, as well as allowing them to go into buildings.

Morphing wings have been investigated for some time, but the new craft would be more capable, for example, with wings which converted to legs. "We call them Transformers," said Davis.

Open Source and Free Software

Every once in a while, somebody suggests that the only sane thing to do with Delphi is to open-source it. Other times someone says that we should all be using Linux because it is "free". And then there are those luminaries who suggest that it is "rude" to offer any criticism of a piece of software that is free. Finally, how many times have you seen somebody end an evaluation of a piece of software by saying "Best of all, it is free!"?

There seems to be a lot of mythology built up around open source and freeware, and most of this mythology reflects poorly on the rational abilities of a great number of programmers. Don't get me wrong; in the paragraphs that follow I will have a few apparently harsh things to say about open source and free software, but I am not "rabidly opposed" to their use, as one misguided person recently opined in a newsgroup. I believe in being realistic about the software one uses and creates, so I have little patience with religious issues in software. Microsoft versus Borland? Java versus .NET? I think you can take such artificial dichotomies and stick them up your--Hang on Jake! This is a family forum!

Before continuing, it might perhaps be prudent to mention that I use open source and free software quite extensively in my newsreader project, as well as several commercial libraries and packages. I am a contributor to, and member of the core team of, one of the most successful open source projects in the Delphi world. So if you think you are about to read the ravings of a religious fanatic, you need to realign your perspective. I am interested in debunking a couple of myths that surround open source software and get programmers into a lot of trouble with their PHBs, who for all their foibles and foolishness seem less clueless on these types of issues than the programmers who mock them behind their backs. So let's look at the myths by which many programmers are hoisting themselves upon their own petards...

1) "Free software costs less than commercial software, afterall it is free. What is cheaper than free?" There is no such thing as a free lunch, and this really goes double for software. The true cost of using software is the value of ALL the resources used up in order to use it.

This involves time, as well as additional costs intrinsic to the use of the software itself. Suppose a piece of freeware takes you 5 hours to learn. If you bill at $50 an hour, that software has just cost somebody (either you or your client) $250 right off the bat. Poor or non-existent documentation is the largest cause of a steep learning curve, and open source is notoriously poor in this regard. Look at the most successful open source projects, and even they seem to have minimal documentation or unintuitive designs that increase the learning curve.

In addition to the value of time, there are also the other costs of change. Suppose a firm decides to drop Windows XP and use Linux instead, a dream come true for many techies hiding in the server rooms of many a firm. They now have to retrain all their employees to use the new operating system and the different applications that run on it, and endure the temporarily lower productivity that plagues every learning curve. For even a moderately sized firm, this cost alone is prohibitive.

2) "If XXX company open sources product YYY, then very quickly there will be no serious bugs because the whole universe will be looking at the code and finding and fixing the bugs." I've seen this argument made on behalf of open source several times and I have to laugh every time I see it. We have both empirical fact to disprove this overly-optimistic statement, as well as reason and rationality.

The empirical fact is obvious. There are plenty of open source projects that have bugs, some of them quite serious. One of the most notorious for years was the Netscape incarnation of Mozilla. Of course, Lazarus has been trying to rise from the grave for several years now, to save the Delphi and Object Pascal world from its Original Sin. (Hey, I can mix my metaphors more than I mix drinks!) Back when I took Linux seriously, I installed several incarnations of that operating system. Despite the assurances by enthusiasts that Linux was impossible to crash, I managed to hang Linux several times. (The problem there wasn't that Linux had some bugs; it was the outright lie from nearly everyone that it had none.) That was the part of my life where I decided to reject religion in programming and software matters. Aside from bugs, we also have the example of the TurboPower libraries that, except for Abbrevia, fell stillborn from TurboPower without any further development, despite the excited statements from many a programmer that TurboPower was doing a great and magnificent thing for all mankind by open-sourcing its libraries, and that the community could now take it upon itself to continue these libraries from the number one third-party vendor in the Delphi community. If the "whole universe" was looking at and fixing the bugs in the TP libraries, it is sure news to me. Apparently what happened was that everybody thought everybody else was going to donate their time and effort so they could get free software. Only Abbrevia has had ANY development since then, and that is solely due to the unique efforts of one unique individual, Robert "Look at this shirt I found under Brion's chair!" Love.

The reason and rationality are less obvious but just as potentially provocative. Human beings are not altruistic by nature, despite the claptrap of socialist intellectuals to the contrary. The only reason people are going to work on software is that they expect to get something back: either money, fame, or the enjoyment that comes from programming for its own sake. Indy is a good example of an open source project that proves this point. Most of the principal contributors to Indy use it in their jobs or sell products based on it. They have a genuine financial interest in the continuation of Indy. Others in the project have less of a financial interest in Indy and more of a passion for programming. They get a lot of enjoyment out of programming itself, and they find Indy intellectually rewarding. This latter type is rare in any project. But the same problems that afflict most open source projects also afflict Indy. The biggest is that creating user manuals and demos is not as exciting as programming itself, and so Indy is perpetually under-documented and under-demoed. In fact, right now the biggest thing holding Indy 10 back from a full bona fide release is the problem of documentation and demos. Most open source projects seem to die in the final 20% of the project, that region infested by writing manuals and demos, debugging those final hard-to-fix but doggedly persistent bugs, and doing mind-numbing QA. Indy 10 isn't dead, and it will make it to release eventually, but it is obvious that if it were a commercial product, the docs and demos would most likely be done by now.

Another example of how the lack of altruism affects freeware is Xananews. Its author, Colin Wilson, wrote it in the wake of his divorce, and it matured into a relatively decent newsreader. Colin makes the source available, and anyone who wants to add to it is perfectly free to do so. Yet almost nothing is contributed by others, despite the fact that Xananews is most popular among Delphi programmers, of all people. Now that Colin is about to become very busy in his personal life, it is likely that Xananews will remain relatively unchanged for a while.

3) There is another aspect of free software that is problematic: its negative effect on commercial software and programmer salaries. It is the height of irony that many of the same programmers who complain that overseas programmers are undercutting their salaries then spend their nights and weekends giving away their skills, even on software that undercuts the sales of the commercial software that pays those salaries. As I see it, this is a problem for end-user software, but not necessarily for intermediate libraries and software development tools. It is arguable that by providing free libraries, one group of programmers enables another to affordably write the software that pays the bills.

Delphi 2006 Pro: A Review

Delphi 2006, aka BDS 4.0, is now shipping, and several of the Delphi faithful, including your humble narrator, have installed and used it. The Borland Developer Studio, as Delphi is now called, supports Delphi, C#, C++, Visual Basic and markup languages to varying degrees of completeness. The Delphi language has the widest support, covering Win32, .NET and Web Services. Visual Basic is afforded the least, as there is no Win32 or designer support for that language. This release marks a milestone for Delphi, a release that definitively and unambiguously declares that "Borland is Back!", to quote an ill-fated advertising campaign of a few years back.

Sometimes I think Borland should change its name to Phoenix, because of its remarkable ability to repeatedly rise from the ashes of its own funeral pyre. As it was losing revenue and market share in the C++ Windows market, Delphi came along to fill the gap. When Delphi sales were flagging to near-fatal levels, JBuilder stepped into the arena to save the day. Now that the Java IDE market has been utterly Eclipsed, it falls to the Borland Developer Studio and ALM to save the day once more. Is this latest release up to the challenge?

Probably, but only time will tell. Despite the foolish conceits of fortune tellers and hedge fund managers, the future is a dark, opaque mist totally concealing the fate of men. What I can tell you is what I have concluded while using Delphi 2006 Pro to develop a Win32 desktop application I've been working on as a hobby in my spare time. That means this review is very slanted: it only considers things of interest to someone using D2006 Pro to write a Win32 application in the Delphi language. ECO and the other high-level Architect features are not reviewed here.

My verdict on Delphi 2006 Pro? I think this is the best release of Delphi I have ever seen. Here are the reasons for that conclusion, which I think collectively support my claim:

1) Code compiled with D2006 is more robust and up to 40% faster than code compiled with Delphi 7. Thanks to the incorporation of several improvements authored by the guys from the FastCode project, the memory manager and several of the RTL routines are much faster than in prior versions of Delphi. This is especially true for multi-threaded applications, where the stock Delphi memory manager had been rather weak. By replacing their previous memory manager with FastMM, the guys at Borland were able to speed up the IDE, and because FastMM is also compiled into your applications, the savings are passed on to you. Apps built with the new memory manager are faster, scale and multi-thread much better, load more quickly, and suffer far less memory fragmentation over time. In short, applications compiled with D2006 will be faster and more robust than the same code compiled with Delphi 7, without needing anything more than a recompile (e.g. http://groups.google.com/group/borland.public.delphi.thirdpartytools.general/msg/cee80e04980a75f4 ). In addition, Delphi 2006, like Delphi 2005, lets you speed up your code by inlining functions, just as you can in C++.
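As a quick illustration, here is a minimal sketch of function inlining; the program and the Square function are my own invention, purely for illustration:

program InlineDemo;
{$APPTYPE CONSOLE}

// The 'inline' directive (supported since Delphi 2005) asks the compiler
// to expand the call at each call site, avoiding call overhead.
function Square(const X: Integer): Integer; inline;
begin
  Result := X * X;
end;

var
  I, Sum: Integer;
begin
  Sum := 0;
  for I := 1 to 1000 do
    Inc(Sum, Square(I)); // a prime candidate for inlining in a hot loop
  Writeln('Sum of squares: ', Sum);
end.

Keep in mind that inline is a request, not a command; the compiler is free to decline it in contexts where inlining is not possible.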

2) The IDE is robust. Recent Delphi releases had been rushed and perpetually unfinished: Delphi 8 was both incomplete and fragile, and Delphi 2005 took a couple of updates to really become usable as a development tool. Delphi 2006, by contrast, has proven quite stable right out of the gate. Dissenters on the newsgroups are usually blaming the IDE for bugs in their own code, in third-party code, or in hardware drivers (e.g. http://groups.google.com/group/borland.public.delphi.non-technical/msg/761289bfe87080bc , which was eventually followed by this mea culpa: http://groups.google.com/group/borland.public.delphi.non-technical/msg/929db2e509432df1 ), or they have found some of the few genuine bugs, which so far almost always seem to have a workaround.

3) The integrated debugger is improved. One of the benefits of switching to the FastMM memory manager is that FastMM keeps track of and reports on memory overwrites and similar errors (for example, I recently got this message logged to my event log: Debug Output: HEAP: Free Heap block 1b7110 modified at 1bae6c after it was freed Process JSN.exe (2248) ). In addition, the debugger now shows your data structures in tree form at breakpoints, and lets you see the local variables at each level of the stack trace. Debug browsing hints are fast and responsive. You can now detach from processes, not just attach to them. And you can "Set Next Statement", which lets you move the instruction pointer forward or backward to any active line during integrated debugging; this is quite useful if you want to stop, change a variable, and run a section of code again with the different value.
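To make that class of error concrete, here is a deliberately buggy sketch (the program is mine, for illustration only) of the kind of write-after-free that FastMM is designed to catch, particularly when its full debug mode is enabled:

program HeapBugDemo;
{$APPTYPE CONSOLE}

var
  P: PInteger;
begin
  New(P);      // allocate an Integer on the heap
  P^ := 123;
  Dispose(P);  // free it...
  P^ := 456;   // ...then write through the dangling pointer.
               // The old stock memory manager would typically let this
               // corrupt the heap silently; FastMM can report the block
               // as modified after it was freed.
end.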

4) The IDE now has live code templates. Code templates are scriptable macros, and Delphi 2006 comes with a few dozen right out of the box. They are saved in XML format for easy editing and copying. Someone at Borland has clearly spent a great deal of time designing and fleshing out this feature, which greatly surpasses all the third-party macro tools you could find before. There is a lot of power built into the scripting available for these templates, as exemplified by the template that completes a case statement by enumerating all the values of an enumeration variable as case values (see the sketch below). As news of this feature gets around, we should expect a nice third-party virtual bazaar to materialize where people share their templates. Those of you who still yearn for tools like CodeRush to return to the Delphi realm should take note of this feature. Properly used, it could greatly improve programmer productivity, once you get used to it.
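To show what that particular template does, here is roughly the code it generates when invoked on an enumerated selector; the type and its values below are my own, purely for illustration:

program CaseTemplateDemo;
{$APPTYPE CONSOLE}

type
  TConnectionState = (csDisconnected, csConnecting, csConnected);

var
  State: TConnectionState;
begin
  State := csConnecting;
  // The template reads the selector's type and generates one branch
  // per enumeration value; you then fill in the statements yourself.
  case State of
    csDisconnected: Writeln('Disconnected');
    csConnecting:   Writeln('Connecting...');
    csConnected:    Writeln('Connected');
  end;
end.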

5) Delphi 2006 has extensive refactoring and extended Find functionality. Both were introduced in D2005, and both are much more robust and complete in D2006, though still not perfect. The ability to find, and optionally change, all instances of a particular symbol or method is invaluable, especially when it spans all units in a project. (Tools like grep let you search for strings, but they do nothing to keep you from matching same-named symbols and methods in other classes.) The set of refactorings available in D2006 is surprisingly large and complete, though a few of the more elaborate ones depend on the integrated Together functionality, which is still a bit slow on all but the very fastest machines. The refactoring in D2006 almost rivals that provided by add-ins like CodeExplorer, though I still find myself using CodeExplorer on a continual basis.

6) Error Insight is one of my favorite features. It is like the grammar checking in Word, except for code. Once I figured out how to keep it from erroneously marking correct code (see http://groups.google.com/group/borland.public.delphi.ide.general/msg/ca0f361e9f8c2b64 ), I found it very useful. It is one of those things that becomes such an integral part of your coding that you only notice it when you have to drop back to an older version of Delphi for a client.

Well, that is my list for now. There are plenty of other features and nice things about Delphi 2006 that I did not touch on, but these are the six I like best. Personally, I see no reason for a Delphi enthusiast not to embrace this release. Borland has achieved something that has eluded it for several years: a Delphi release that is both solid and useful at the same time.

C#, .NET, VB.NET, Vista, etc. So What is the Point?


What great user requirements are being met by the software and tech world's insistence on constant change? Were there scads of software users who were unable to get their tasks done on Win32 machines using native Win32 code? I don't recall ever coming across a single user who said, "I can't get anything done on computers until they implement software that needs a separate 20MB download in order to work, or a total reinstallation of a new operating system, or is written in a 'new' language that is just a rehashing of old language elements already found in C++, Object Pascal or Java."

Maybe if the software development world were smarter, it would spend its time perfecting what it has, instead of cranking out more complicated, buggy crap that nobody needed in the first place. Given the speed with which software developers leap headlong into supposedly "new" technologies and languages, you'd think they had already mastered the pre-existing ones. Wake-up call: most existing software is crap. And not just crap, but really bad, incredibly aromatic, pungent, decadent, befouling crap that oozes out into the world to putrefy everything it touches. Those who dare to release such stuff into the world have no right to arrogantly demand that the world wait while they rewrite their crap in the latest and greatest "new" language or framework, or produce even fouler crap while they start from scratch learning yet another language or framework in which to write crap.

If this keeps up, at some point the rest of the world is going to say to hell with us and stop upgrading. Already, many businesses have adopted just such a stance, insisting on running their Windows 95 boxes until they can't run anymore. Yes, you read that right. There ARE people who ran pre-2000 software right through the supposed Y2K Armageddon without anything happening to them. They learned as a result that the IT world is full of crap about this supposed need to continually upgrade.

In fact, I will go so far as to say that this continual upgrading is one of the biggest rip-offs in the history of mankind. And here is the part that almost every software developer reading this will hate: if you write new software against the latest technology to come out of Redmond or Silicon Valley instead of fixing pre-existing software, you are an accessory to this crime. Why? Because it is the forced obsolescence of software that drags the rest of the world into this upgrade debacle.

I'll give you a perfect example; in fact, it is the situation that got me annoyed enough to write this post. For years I've been successfully using Quicken to manage my budget and pay my bills online. It's been working fine for this purpose, and there is no technical reason why it could not continue to fill this role for years to come, except for one thing. Even though I paid for it, and was not told at the time that it was time-limited software, it will now expire in April. Not because the hardware on which it depends is now obsolete or non-operative. Not because of any upgrade to my PC or the operating system on which it depends. It is solely because the company that sells Quicken wants to sell me and every other user a new copy. This is amazing, but perfectly par for the course in a software world where instantaneous obsolescence is so much a part of the culture that it is now seen as a seller's right.

I've been thinking a lot lately about my newsreader. This is a software project I've been working on over weekends and nights for a few years. Lately I've had to admit to myself that it is a hobby, because quite frankly, I don't think enough people would pay money for a newsreader to make it worth the effort of turning a functioning piece of software into a finished product you can sell. Don't get me wrong: this newsreader works very well. It is intelligently multi-threaded, fast, and very solid. But the icons are just what I could find for free, several screens lack professional polish and consistency, some messages are not user-friendly, and the user feedback from the worker threads is not as complete or informative as I'd like. In addition, I never did write a help file or manual for it, nor choose a real product name. And so on.

The newsreader was written in Delphi, and it now comprises a lot of units; with all third-party components included, a full build runs to over a million lines of code. This is a significant code base. If my goal were still a product to sell, I would keep it in Delphi and finish it up, no doubt about it. D2006 is far and away the best tool for developing 32-bit Windows software. But the time I spend finishing it up in Win32 does me little good when I go looking for a job. Most current Delphi jobs are either end-of-life maintenance or conversions to C#, neither of which portends much promise for long-term Delphi usage.

So I've been trying to figure out what language to learn, or review, in order to keep the paychecks coming in over the long term. I've had constant employment doing Delphi only since 2001, and Delphi mixed with C++ before that. But the future of Delphi jobs is uncertain. I need to find an alternative language, and perhaps write a significant piece of software in it to prove it is not just a checkmark on a resume. I'm looking, therefore, at rewriting the newsreader in either C++ or C#. The advantage of C++/CLI would be that I could use native code for performance-sensitive work and managed code for the GUI and other parts, and I could directly reuse the many existing C++ code samples that are so easy to find online. I was doing C++ for years before Delphi came out, and for several years afterward as well, so it probably would not be too difficult to get back into it. Plus, with my background in economics and my work experience in the financial industry, I'd be well placed to get a good job in the financial sector in Chicago, which is primarily C++ and Java country. On the other hand, a combination of Delphi and C# skills may be just the ticket to ride, what with all the Delphi-to-C# conversions currently under way, or destined to occur over the next few years as existing code bases drift into perceived obsolescence. I've not yet decided, so I've been looking into what it would take to write a newsreader in .NET 2.0 using C++ or C#.

Well, here is where it gets depressing. .NET 2.0 is definitely a step backward from Delphi when it comes to the extent and quality of the components that come in the box. A lot of .NET components don't support owner-draw (see the sketch after this paragraph for what I mean), and there are no .NET equivalents for several useful VCL components. In addition, there are literally thousands of useful free components for Delphi; no such ecosystem exists for .NET. So I either have to spend a lot of money, or make do with the components that come with .NET 2.0. Moreover, there is not a single feature a newsreader needs that cannot be written in native Win32 Delphi code, and .NET 2.0 offers nothing in the user experience that was not just as easy, or easier, to build with Win32 Delphi. Sure, a newsreader that I write in .NET 2.0 will probably eventually be better than the current Win32 version, simply because I've learned a few very useful things along the way about how to architect a good multi-threaded app. But it is inevitable that the first few versions will be worse, because they will not yet have been debugged through several iterations the way the current Win32 version has. If I had end users, I would be doing them a huge disservice, arguably almost to the point of fraud, by rewriting the app in .NET 2.0, because of the inevitable bugs that are part of any software construction project. The first few versions of a complete rewrite can never be as solid as the next iteration of a proven architecture.
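For readers who have not run into the term, here is a minimal VCL owner-draw sketch, assuming a form TForm1 with a TListBox named ListBox1 whose Style property has been set to lbOwnerDrawFixed, and with this handler assigned to its OnDrawItem event (all the names here are illustrative):

// Requires Windows, Graphics, Controls and StdCtrls in the unit's uses clause.
procedure TForm1.ListBox1DrawItem(Control: TWinControl; Index: Integer;
  Rect: TRect; State: TOwnerDrawState);
var
  LB: TListBox;
begin
  LB := Control as TListBox;
  // Paint the row background, highlighting the selected item.
  if odSelected in State then
    LB.Canvas.Brush.Color := clHighlight
  else
    LB.Canvas.Brush.Color := clWindow;
  LB.Canvas.FillRect(Rect);
  // Draw the item text; a real handler might add icons or per-item colors.
  LB.Canvas.TextOut(Rect.Left + 2, Rect.Top + 1, LB.Items[Index]);
end;

The point is that the component hands you the canvas and the rectangle, and you paint whatever you like; it is exactly this kind of hook that is missing from many of the stock .NET 2.0 controls.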

Read that last paragraph again, and think about the full implications for all the rewrites going on across the software development world in reaction to the introduction of .NET, and the massive disservice thereby being done to end users. The first few .NET versions of software will be WORSE than the existing Win32 versions, even if they are architecturally superior, because of the inevitable bugs. It is inevitable, and anyone who doesn't think so is fooling themselves.

In conclusion, I will once again pose the question in the subject line. .NET and C# are nice and all, but what is the point? Are they really solving anything?