Virtualization Concerns

In two recent posts (Sustaining the Unsustainable, Towards a New Paradigm) I laid out the case for a new networking paradigm based on ‘virtualizing’ as much of our technology infrastructure as possible.

As I pointed out previously, a simplistic explanation of ‘virtualization’ is that we remove the software applications that now reside on individual hard drives and install them on centralized servers. When a student or teacher uses a piece of software, it is not running on their individual workstation; it is running on a server, and thus their workstation is a ‘virtual’ one.

There are many benefits to ‘virtualizing’ workstations (see Towards a New Paradigm), but today I want to focus on the challenges we face when setting out to ‘virtualize’ our networks.

1. More servers…
In this new paradigm, we run our software on servers rather than on individual hard drives, so we need more servers. In the current environment the file server is primarily a storage device, and one server can service more than 150 workstations. In the new environment we may assign a server to every 30-50 workstations. Because these servers are actually using processing power and RAM to run educational software applications, we want to be careful not to oversubscribe them; doing so will hurt application performance. A 500-workstation environment, for example, might require approximately ten application servers.
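
To make that arithmetic concrete, here is a minimal Python sketch of the sizing rule of thumb above (one server per 30-50 workstations); real capacity depends on the applications and the server hardware:

```python
import math

def app_servers_needed(workstations: int, per_server: int) -> int:
    """Round up so no server is oversubscribed."""
    return math.ceil(workstations / per_server)

# The rule of thumb from above: one application server per 30-50 workstations.
for per_server in (30, 50):
    print(per_server, "per server ->", app_servers_needed(500, per_server), "servers")
# 30 per server -> 17 servers; 50 per server -> 10 servers
```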

2. Reliable networking infrastructure…
In the current environment most of the action takes place at the local workstation, and other than Internet use, the network itself is primarily used to store or retrieve files. In the new environment the network is used constantly, because the software running on the servers is communicating with the local workstation; in fact, every mouse click and keystroke is sent over the network. To create a seamless experience for the user, the network needs to be sound and reliable, and the greater the network speed, the better. Typically, we’re talking about 100 Mbps to the desktop and a gigabit backbone.
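
As a rough back-of-the-envelope illustration of why those speeds matter, the sketch below estimates how many remote-display sessions a link can carry. The ~150 kbps per-session figure and the 50% headroom reserve are assumptions for illustration, not measurements from any particular deployment:

```python
def sessions_supported(link_mbps: float, per_session_kbps: float, headroom: float = 0.5) -> int:
    """Sessions that fit on a link, reserving a fraction of capacity for other traffic."""
    usable_kbps = link_mbps * 1000 * headroom
    return int(usable_kbps // per_session_kbps)

# Illustrative figures only: assume a screen-update session averages ~150 kbps.
print(sessions_supported(1000, 150))  # gigabit backbone -> ~3333 sessions
print(sessions_supported(100, 150))   # 100 Mbps desktop link -> ~333 sessions
```

In practice the backbone carries many sessions at once while each desktop link carries one, which is why the asymmetry (100 Mbps at the edge, a gigabit in the core) makes sense.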

3. High-speed and reliable WAN infrastructure…
If we decide to gather the application servers into a centralized server farm, then the Wide Area Network needs to be robust and reliable.

Heidi Has Gable comments:

“In a virtualized world, you rely heavily on your WAN connection. Now, if your Internet is slow or even down, you can’t do anything with the computers! If your apps are running locally, at least you could work on the local machine until the Internet circuit gets fixed! I could write in Word, do mind-maps in Inspiration, etc…”

What Heidi observes is partially correct. In a virtualized world, when your connection to the server farm is gone, so is your ability to use the software on those servers. But this has nothing to do with the Internet connection: if the Internet is down, you are still able to work, as long as your local connection to the server farm is up.

Some schools mitigate the chances of losing their connection to the application servers by abandoning the “server farm” and deploying their application servers locally, in the buildings with the workstations they serve. By eliminating the WAN they eliminate a potential point of failure. There is no doubt, however, that the most efficient and cost-effective deployment is to locate all application servers in a single, centrally located server farm somewhere in the district.

Another strategy to combat the worst-case scenario of losing the connection to the application server farm is to put a single application server in each building as an emergency backup. 98% of the time the workstations in the building use the ‘application server farm’; but if the connection to the ‘farm’ goes down, the building can temporarily log on to the local application server and work until the ‘server farm’ is back online, as sketched below.
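
Here is a minimal Python sketch of what that fallback logic might look like. The hostnames are hypothetical, and the port assumes an RDP-style session service; the post doesn’t specify a protocol:

```python
import socket

# Hypothetical names for illustration; real ones would come from the district's DNS.
FARM = "apps.district.example"
LOCAL_FALLBACK = "apps-local.building.example"

def reachable(host: str, port: int = 3389, timeout: float = 2.0) -> bool:
    """Return True if we can open a TCP connection to the server's session port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Prefer the central farm; fall back to the in-building server only when the WAN is down.
server = FARM if reachable(FARM) else LOCAL_FALLBACK
print("Logging on to", server)
```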

4. Hybrid environments…
One of the most frequently asked questions is: how does a ‘virtualized’ network handle large multimedia files, Photoshop, AutoCAD, movie making, digital storytelling, and high-end Adobe applications?

The key thing to remember is that we can create hybrid environments. Virtualization is not an ‘all or nothing’ proposition. If we have labs or workstations that run ‘high-end’, processor-heavy applications, then it makes sense to load and run that specific software locally, as we do today.

But a great advantage of software ‘virtualization’ is that even if the Photoshop software is stored and run locally, we can also install a number of copies on the application servers in the ‘farm’, so that students can continue working in the application when they log in remotely from home.

Hybrid environments can also apply to the hardware employed in the virtualization paradigm. It’s not uncommon for folks to equate virtualization with thin client technology, because it lends itself so well to a diskless hardware environment. Realistically, though, the most common and effective way to virtualize a network infrastructure is to create a hybrid: older, existing computers with hard drives; thin clients where they make sense; laptops; and smaller devices like the Asus Eee.

I will examine several other concerns and the costs of virtualization in a future post.

Asus unveils a 9″ Eee PC

Asus has just unveiled its successor to the 7″ Eee PC, which in the past couple of months has met with considerable success in schools for their 1:1 programs. The new 9″ model was showcased at CeBIT in Germany; a release date is expected mid-2008, and it is to offer larger storage capacity with a 12GB SSD and an option for a smaller 8GB.

The Eee PC 900 (as many are calling it) will feature a 9″ display with a higher resolution of 1,024 x 600 pixels, compared to the 800 x 480 of the existing 7″ version. Battery life will be about the same as the 7″, at around 2.5 to 3 hours.

As of now, pricing is unavailable, but this does offer a great option for many older students and teachers looking for the ideal device.