

Linux: The Revolution OS

On 25 August 1991, a Finnish student named Linus Torvalds posted a message to an Internet newsgroup about a hobby software project he was working on. He had no idea how wrong that message would turn out to be. It started with a disclaimer and went on to stress the hobby nature of the project:

“I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones…

…and it probably never will support anything other than AT-harddisks, as that’s all I have :-(” [1]

The then 22-year-old Linus honestly believed that his project would be a fun project and nothing more. He was considering the name Freax for his Operating System (OS). However, the person responsible for allocating storage for Linus’s files didn’t like the name and created the space under the name Linux [2]. Today, roughly 18 years later, we all know that piece of software as Linux. Linus was utterly wrong in the predictions about his own OS.

Image 1: Linus Torvalds

Image 2: A Typical Modern Linux Desktop
Image 3: Tux – The Linux Mascot

Today Linux powers computers ranging from our common desktop and laptop PCs, to the tiniest embedded computers, to the fastest of supercomputers. In fact, as of this writing, the world’s fastest supercomputer, named Roadrunner, runs Linux [3]. The article you are reading is being written on a laptop computer running Linux. It is common knowledge that Linux is the OS which runs on the most diverse collection of platforms, including mobile phones, cameras, netbooks, gaming consoles, telecommunication devices, networking equipment, set-top boxes, web servers, etc.


The Other OS, the free one

In short, Linux has grown into one of the most widely used operating systems ever, and that without even a company controlling it. Linux is in fact an operating system developed by people, for people. It is one of the best examples that can be given of Free and Open Source Software (FOSS). This means that, unlike most of the operating systems you might have used or heard of, Linux gives you full freedom with the software. This freedom includes the freedom to use, modify and redistribute it, all of which is 100% legal.

As you might already know, most of the software you download or copy freely from friends, or buy cheaply from Unity Plaza, are pirated copies and are illegal. However, using, modifying, copying and redistributing FOSS such as Linux is perfectly legal. While Linux (sometimes called GNU/Linux) provides an alternative to proprietary operating systems, many other FOSS applications provide alternatives to different types of proprietary software. Chances are you have already heard of or experienced them. For example, Firefox and Konqueror (web browsers), Evolution and Thunderbird (e-mail), MPlayer, VLC and Amarok (media playback), Pidgin (IM) and OpenOffice.org (office productivity) are some applications which are FOSS and available under Linux. What FOSS is, and its background, shall be discussed in detail elsewhere in Digit.


Image 4: A Linux system with an E17 Desktop Environment
Image 5: A Linux system with a KDE Desktop Environment
Game Plan

After reading this far, you might be wondering what to expect from this Linux series. Here is what we are going to do.

Linux itself is a powerful, secure and stable alternative to common desktop operating systems. There are thousands upon thousands of users who use Linux as their desktop (day-to-day) OS, and the number is always growing. Starting from today in Digit, we shall initially focus on getting you familiar with Linux basics, allowing you to move around and get things done on a Linux system in comfort. Finally, we hope to make you proficient in the art of mixing the Graphical User Interface (GUI) and the Command Line Interface (CLI).

The Linux basics series shall continue as tutorials. It would be most beneficial if you practice what appears in Digit issues. The best way, and the only way, to properly learn Linux is to try it, so you are highly encouraged to try the examples and explore on your own. Let us now set up the work environment for the Linux sessions.


Image 6: A Linux system with a GNOME Desktop Environment
Image 7: MPlayer media player playing a video

A quick way to try Linux is to boot (start) your computer with a live disk. A live CD/DVD is a disk which can be used to boot a computer into a working environment. The hard disk is left untouched while you work within that live environment. When you are done, you can remove the disk from the drive and reboot (restart) the computer. It should be noted that the changes you make to the live system are not persistent (i.e. the changes are lost when you reboot), which makes it a nice sandbox to play in.

I recommend Fedora, Mint or Ubuntu live disks, as they can be used for installations too. You can download the ISO images from the respective web sites and burn them. If you have speed, bandwidth or Internet connectivity problems, you can always borrow a disk from a friend and make a copy. It’s perfectly legal and even encouraged. If you can still access the Internet, you can order a free Ubuntu CD through the Ubuntu ShipIt service. It is totally free, and a CD will be posted to your home. You will need a Launchpad account (free registration) for the ordering process.
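If you do download an ISO image, it is worth verifying it before burning, since a corrupt download is a common cause of mysterious live-disk failures. The sketch below uses a hypothetical file name and assumes the download page publishes a SHA256SUMS file; wodim is one common command-line burner, though your distribution may ship cdrecord or a GUI tool instead:

```shell
# Hypothetical ISO name -- substitute the file you actually downloaded.
ISO=ubuntu-8.10-desktop-i386.iso

# Print the image's SHA-256 digest so you can compare it by eye
# against the value published on the download page.
sha256sum "$ISO"

# Or, if the site provides a SHA256SUMS file, verify automatically;
# the line for a good image ends in ": OK".
sha256sum -c SHA256SUMS

# Burn the verified image to a blank CD/DVD (device name varies).
wodim -v dev=/dev/cdrw "$ISO"
```

If the checksum does not match, download the image again before wasting a blank disk.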

If you wish to install Linux for long-term use, you can find online guides for Fedora 9 / 10 and Ubuntu 8.04 / 8.10 installations. If your Linux distribution version is different, follow the above links for general guidelines. For your own convenience, please avoid installing outdated versions (e.g. Fedora versions before 9, Ubuntu versions before 8.04, Mint versions before 5, and all older Red Hat versions such as 8 and 9).


In March

That is it for the introduction. I hope that you will have looked into Linux systems, and will probably be looking for more, by the time the Digit March issue comes out. There I hope to discuss a bit of Linux history and why there are different versions of Linux systems (e.g. Fedora, Ubuntu, Mint, SuSE, Debian, Gentoo, etc.), then log you into the system using the GUI/CLI and try a few Linux commands.

[1] http://groups.google.co.uk/group/comp.os.minix/browse_thread/thread/76536d1fb451ac60/b813d52cbc5a044b

[2] Torvalds, Linus and David Diamond, Just for Fun: The Story of an Accidental Revolutionary, 2001, ISBN 0-06-662072-4

[3] http://www.top500.org/system/9707



In the world of IT, convenience is a major factor that drives progress. Content Management Systems (CMS) such as Drupal, Joomla, WordPress or MODx are prime examples of this. Throughout this article series we will be looking at how to use the all-purpose CMS Drupal to our advantage.


Logos from WordPress, Joomla and Drupal


By definition, a CMS is a computer application used to create, edit and manage various kinds of digital media and text. There are many variations among CMSs, but some common characteristics, such as login and user management facilities, can be identified as well. Most of the time, CMSs are created with a particular audience or set of tasks in mind. For example, the main audience of WordPress is the blogosphere crowd, and it has become the de facto CMS for a simple (or maybe not that simple, with recent upgrades) blog. But a few CMSs, such as Drupal and Joomla, are designed in a way that suits all purposes and all audiences. CMSs can also be further categorized based on their commercial background. Drupal, Joomla and WordPress are among several open source content management systems currently in wide use, while there are also many popular commercial CMS platforms, such as ExpressionEngine, that provide their services at a cost.

As with most successful open source projects, the vast community around Drupal is what gives it the power to be a leading CMS in the open source arena. As of recent stats, it has a member base of more than 350,000 and a developer base of more than 2,000. According to Drupal download stats, from May 2007 to April 2008 alone it recorded more than 1.4 million downloads, a 125% growth over the previous year. This exponential growth also means the rapid growth of the community as a whole. Among the notable sites that have used Drupal as their CMS platform are NASA, FedEx, MTV UK and Ubuntu (many other popular Drupal-based sites are listed at http://buytaert.net/tag/drupal-sites). Drupal also won the awards for best PHP-based CMS and best overall open source CMS, which were the most tightly contested categories at the 2008 CMS Awards.


Mementos from CMS Awards ’08

The Drupal project started as a message board by Dries Buytaert and became an open source project under the GNU General Public License (GPL). Dries first wanted to name the site ‘Dorp’, which in Dutch means ‘village’, referring to the community around it, but it was later changed to Drupal because that sounded better. Drupal is an English rendering of the Dutch word ‘druppel’, meaning ‘drop’, and the current Drupal logo depicts it as a water drop.

Now that we have taken the first steps toward Drupal, let’s get down to real work.

The ways of Drupal

Drupal is not just a Content Management System, but also a modular framework. The versatility of Drupal, which makes it a possible solution for all situations, comes from the way it is organized. This modular framework allows it to be extended with new features through modules, or to have its complete look and feel changed through themes, without a drop of sweat.

When we consider the structure of a Drupal site, we can break it into two major parts:

  1. Modules
  2. Themes

So now let’s skim through the modules and themes of Drupal, with the intent of diving deeply into them at a later time.


The power of modules

Most of the functionality of Drupal is rendered through its module set. There is a set of core modules that comes with Drupal and handles all the basic tasks of the CMS, such as login and user management, error logging and other essential system tasks. Beyond these, there is also a set of optional core modules that comes with the installation for various tasks, such as RSS aggregation, handling user profiles and managing user comments, though unlike the core modules they are not mandatory for running the system. For extra functionality, users can download suitable modules freely from the vast module repository at http://drupal.org/project/Modules or create their own modules to achieve the desired functionality (which we will look at thoroughly at a later time).

Hooks can be considered the most distinctive characteristic of the Drupal module architecture. The concept of hooks is mainly to allow modules to interact with the Drupal core. To extend Drupal, a module simply needs to implement the appropriate hook from the defined set of hooks (for the full list of hooks for Drupal 5, see http://api.drupal.org/api/group/hooks/5). When Drupal wishes to allow intervention from modules, it determines which modules implement a given hook and calls that hook in every enabled module that implements it.

Blocks are another major component of modules worth taking a peek at. Even though it is not mandatory for a module to have a block, most modules that represent something visually have one. Blocks are separate, customizable areas of the web site that can be moved to various positions on the site with just a few clicks from the back end, which in turn makes the site unbelievably flexible and customizable. For example, the login box in Drupal is a block implemented in the core User module, which provides basic login functionality. Even though by default it sits in the sidebar of a page, with just a small change to the block’s position from the back end it can be moved anywhere on the page (or anywhere a region, a defined boundary in the site where a block can be displayed, is defined). With each module capable of creating as many blocks as necessary, modules become even more flexible and powerful.


Drupal module administration page


The beauty of Drupal themes

A Drupal ‘theme’ is a collection of files that makes up the presentation layer of the site, or in other words defines the “look and feel” of the site. It contains the underlying hierarchical page structure, which decides the order in which a page should be overlapped and rendered, the cascading style sheets (CSS) files that control the presentation, and even the JavaScript files that are used for dynamic page content.

The best and most noticeable features of a Drupal theme are its abstraction and hierarchical order. A theme is usually organized in a bottom-up, incremental way, such that it can control the presentation of almost everything on the site. For example, the default presentation of a button defined in a block can easily be changed by overriding the appropriate default element template in your theme. If you want to change the look of that block, it can be done by defining a block template for that particular block. If you then need to change the look of all blocks as a whole, you can override the default block template. This hierarchy is not limited to visible regions of the site such as blocks; it can also separately customize the different content types of the site. You can likewise define one template for the front page and another for the repeating, mundane middle pages. Now you might see what I meant: you can change almost everything on a site with the help of the Drupal theming hierarchy.

As a side note, a basic knowledge of PHP can help with some tasks, but even without it you will still be able to build a good theme, provided your theme is not too complicated and does not override the default behaviour. Drupal comes with a few built-in themes, and for more you can use the repository at http://drupal.org/project/Themes to look for a theme matching your requirements, or at least a close resemblance that can be customized without building from scratch.



A word about versioning

A new user to Drupal can get a bit dazzled by the various Drupal versions out there. There are still four versions going around, ranging from Drupal 4 to Drupal 7. Even though it has been over a year since Drupal 6 was released, Drupal 5 still seems to be at the forefront in terms of module compatibility. Further, as of now, Drupal 4 is on its way to extinction, while Drupal 7 is still too unstable and too poorly supported by other modules for any practical purpose.

To get any of the above Drupal versions, use the URL http://drupal.org/project/Drupal+project.


Enough Chit-Chat

Now that we know the basics and the background of Drupal, let’s set up a Drupal site on your localhost. For this you only require a web server (Apache, IIS, etc.) with PHP enabled, and a database management system such as MySQL or PostgreSQL installed.

Getting a Drupal site set up is a piece of cake. Here are the simple steps you have to follow.

  1. Decide which Drupal version you are going to use and download it from the address above.
  2. Unzip the downloaded package and put it in your web server’s document root.
  3. Create a new database (you can use a tool like phpMyAdmin, or just the terminal, for this) and enter the database name and database credentials on the Drupal installation page.
  4. Click next and Drupal will set itself up.
  5. Log into the Drupal administrator account for the first time, add the credentials for your super administrator account and submit.
  6. That’s all, you are ready. It wasn’t hard, was it?
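On a typical Linux host running Apache and MySQL, steps 1 to 3 above can be sketched on the command line roughly as follows. Note that the Drupal version, the paths, and the database name and credentials here are illustrative assumptions only; substitute your own:

```shell
# Steps 1-2: download Drupal and unpack it into the document root
# (the version number and paths are examples only).
cd /var/www/html
wget http://ftp.drupal.org/files/projects/drupal-5.15.tar.gz
tar -xzf drupal-5.15.tar.gz
mv drupal-5.15 drupal

# Step 3: create an empty database and a dedicated user for the
# site (database name, user and password are made up here).
mysql -u root -p <<'SQL'
CREATE DATABASE drupal;
GRANT ALL PRIVILEGES ON drupal.* TO 'drupaluser'@'localhost'
    IDENTIFIED BY 'secret';
FLUSH PRIVILEGES;
SQL

# Then point a browser at http://localhost/drupal/ and give the
# installer the database name and credentials created above.
```

On Windows with IIS the unpacking step is the same; only the document root path changes.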


Like to give a hand?

The Drupal community has created and maintained this wonderful and unbelievably useful CMS through its hard work. So, while you are using this amazing CMS, do you feel like giving a hand? You don’t have to be able to code; you can contribute by writing some documentation, or by testing and reporting a few bugs in the plethora of modules available. You can also join the live discussions on IRC, in #drupal, #drupal-themes, #drupal-dev or #drupal-support on Freenode, for deeper subjects. All your support will improve at least a tiny bit of the project, which in turn will help many others in the long run.

Despite her love for the boy, his mother abandoned him because of social pressures, and he was desperate to find her again. Think for a minute: am I describing a real scenario that happened between a human son and a mother, or can you recall some story which bears the same resemblance? Yes, you are right. I was just giving a short description of Steven Spielberg’s film “Artificial Intelligence”. David, its protagonist, was actually a humanoid robot boy with emotions such as love, as well as a certain level of intelligence fed into his systems, enabling him to appear quite similar to a human boy. So what is Artificial Intelligence (AI)? It is a question which has been answered in many different ways, depending on the emphasis of the era being considered. At first the term described machines which act as humans. There were other definitions as well, such as machines ‘acting rationally’ (doing the right thing to suit the situation and the problem at hand), ‘thinking rationally’ and ‘thinking humanly’.


History of Artificial Intelligence

The first research that can be considered in line with AI was done by Warren McCulloch and Walter Pitts in 1943. They proposed an artificial model of neurons (like the ones in a human brain) to represent neural properties. They also suggested that many such neurons, combined into a network, could be used to model logical connectives such as AND, OR and NOT. Another important point they suggested was that, given the necessary data, such neural networks could learn; this was one of the earliest learning techniques to be proposed, and it later grew into neural network based learning on a greater scale. In 1949 Donald Hebb succeeded in introducing a simple updating rule by which the connections between neighbouring neurons in a neural network could be updated. It became known as the “Hebbian rule”, and is still used in neural network learning at the simplest level today. Marvin Minsky and Dean Edmonds, graduate students at Princeton University, started working on a neural network computer called SNARC in 1951. It is said that, although the research mentioned above bore resemblances to Artificial Intelligence, Alan Turing was the first to introduce the whole concept of AI, with an article named “Computing Machinery and Intelligence” in which he introduced the famous Turing test and other AI concepts, such as machine learning, genetic algorithms and reinforcement learning, to the world.

The name “Artificial Intelligence” was coined at the Dartmouth conference held in 1956, which brought together many US researchers of the era, such as Allen Newell, Herbert Simon, John McCarthy and Marvin Minsky. It was John McCarthy who proposed naming the field of machines being able to simulate or act with intelligence “Artificial Intelligence”, and it has been called by that name ever since, irrespective of whether the term precisely depicts the area in concern.

In the period after the Dartmouth conference, many researchers came up with computer programs to address aspects of AI, within the limited computational power and tools available at the time. In 1957 Newell and Simon created a computer program called the “General Problem Solver”, which was intended to act as a universal solver for problems that could be formulated in symbolic representations. But it could not handle real-world problems, only well-defined ones such as chess, theorem proving, the Towers of Hanoi, and so on. Many such problem-solving programs followed, among them the ‘Geometry Theorem Prover’ by Herbert Gelernter (1959) and Samuel’s checkers-playing program (1952-1956). LISP, a high-level programming language catering specifically to the AI domain, was introduced by McCarthy in 1958, a great breakthrough for the future of Artificial Intelligence. The next major development was the concept of knowledge-based systems. Researchers had started thinking along the lines of how humans gain intelligence by learning from the knowledge they gather. This gave rise to the development of knowledge-based systems, one of the first being the DENDRAL program by Buchanan and fellow researchers in 1969, built to solve the problem of inferring molecular structures. Another system, called MYCIN, was developed for diagnosing blood infections and was quite successful in its task, sometimes even outperforming human experts, showing that the area of research was promising. AI-based expert systems and knowledge-based systems were used for industrial purposes as well for some time, but those systems lacked long-term prospects. AI developed and continued as a science, with new areas of research coming onto the scene. Neural networks were given renewed importance, and research on them has been carried further since 1986. Speech recognition, linguistics, data mining, machine learning, pattern classification, clustering and many more areas became interlinked with AI, and the field started to boom.


An unbroken bond

AI is a vast subject in which many other disciplines are intermingled. Mathematics, logic, medicine, genetics, philosophy, economics, psychology, cognitive science, computer science, computer engineering, robotics and linguistics are some of the other subject areas which foster research in AI and which, in turn, benefit from AI’s contributions. For example, most AI programs are based on foundations in mathematics and logic. Areas such as medical diagnosis and surgical robots have been developed to assist human medical practitioners in providing a better service to patients. Projects such as identifying human genomes and classifying DNA have benefited from AI techniques such as pattern classification and clustering. This shows that AI has strong bonds with many other disciplines, which gives the subject a vast number of lines of research to pursue. It also makes the evolution of AI broader but slower compared to other scientific subjects. Nevertheless, as the various contributions across all these domains help make the life of mankind better and easier, the bond among these interlinked subjects remains, while researchers find means of strengthening it further in the future with many other findings.


Artificial Intelligence in practice

What would AI mean to an ordinary person? He would not care whether it is an advanced, interrelated subject or not; you and I need some benefit before we believe in and recognize the importance of AI. Therefore, let’s look at how AI has entered our lives in practice. Have you noticed, when browsing the Internet to buy something, that some sites recommend products based on your past buying trends and likings? Do you know that you can even delegate the task of ordering your weekly groceries online to a computer agent? Yes, you can. All this is enabled by intelligent software agent systems which can act rationally and perform the task given, and which are a part of AI. Further, have you heard of the robot pets used in Japan to behave like actual pets, giving love and care to elderly people? There are medical testing devices which enable lab technicians and medical practitioners to easily diagnose a patient’s illness and accurately prescribe the required drugs or treatments, with the help of AI. Nowadays there are even programs which can automatically generate and compose music, and programs which can select music to play based on the rhythm of your walking (enabled by wireless communication between your music player and a device placed in your shoe).

For sure, you must have heard about the world-renowned chess master Garry Kasparov being defeated in a chess match by his opponent “Deep Blue”, an intelligent chess-playing machine developed by IBM, in 1997. Gaming is another area where AI-related concepts can be applied, building computers that can play different games effectively against human or other computer players. The above are only a very few of the applications of AI which a general user can witness at present.

There is yet more to tell, yet more to find, yet more to explore in this vast and exciting field of Artificial Intelligence. The specific details and technical aspects of the various areas of AI will be covered in the articles to follow in the months to come, giving you a better feel for Artificial Intelligence and creating wonder, interest and enthusiasm in this field.