Understanding Hyper-V Drivers

One of the more challenging aspects of getting up to speed on Microsoft’s Hyper-V technology is understanding enlightenments and integration components (ICs).  The terms are new, as is the underlying technology.  A recent post on the MSDN Blogs site helps explain the concepts.  The post is titled Hyper-V: Integration Components and Enlightenments, and it does what it promises.  A few block diagrams and images would be helpful. 

Hyper-V is currently available in beta form and will be supported on Windows Server 2008 later this year, but it’s never too early to start learning about its architecture.  You can expect more information to become available as the product gets closer to its final stages.

Managing Multiple Monitors on Windows Vista

Among my many gripes about Windows Vista (see My Struggles with Windows Vista) is the lack of truly useful window management shortcuts.  Multiple-monitor configurations are becoming increasingly common, and the Windows desktop simply hasn’t kept pace.  Sure, if I’m willing to click on numerous UI elements, I can reliably move a maximized window from one monitor to another and resize it to my liking.  My current setup includes a widescreen 22" LCD and a 19" LCD that’s rotated for a portrait view (it’s great for editing documents and reading web pages).  Overall, the common task of managing windows on multiple monitors shouldn’t be an ordeal.

Fortunately, there are several third-party software products (some free) that help make the process easier.  I have evaluated a couple of them and thought I’d share my findings:

  • UltraMon is a commercial product that provides features for managing multiple monitors.  It allows you to span wallpapers across multiple disparate displays.  Most importantly (for me), it lets me create simple keyboard shortcuts for moving and resizing windows between monitors.  It’s a bit pricey for the functionality, but it really does help.  Unfortunately, I started having display driver issues with my Nvidia GeForce 8300 GS after I installed the latest beta.  Hopefully a final release version will address that.
  • DisplayFusion: DisplayFusion looks like it was originally designed for managing wallpaper settings for multiple monitors.  However, it offers a simplified configuration UI that allows you to create hotkey shortcuts for moving and resizing windows.  Currently, this is my favorite as it hasn’t broken Windows Vista and you can’t beat the price (it’s free, but donations are accepted).  This one gets my recommendation, at least for now.
  • GoScreen: GoScreen is designed for use on Ultra-Mobile PCs (UMPCs), such as tablet computers or portables that have touchscreens.  It provides features for more easily managing windows.  I haven’t yet tested the product, but it does seem to have a large number of useful features.

Overall, these utilities effectively fill some gaps in Windows Vista and make me far more productive (I also couldn’t turn down the shot at alliteration in this post’s title). 

Managing Virtualization with System Center Virtual Machine Manager

If you have used the Microsoft Virtual Server 2005 platform, there’s a good chance that you find its default web-based management tools to be lacking.  If you’re running one or a few virtualization host servers, the admin tools can certainly get the job done.  But what if you’re deploying dozens or hundreds of VMs every month?  In order to manage these systems, you’ll need to invest in some virtualization-aware software.  Microsoft’s System Center Virtual Machine Manager (SCVMM) is one such product.

If you have heard of the product, you might be wondering about its capabilities, its architecture, and how you can get started with it.  The January 2008 issue of Microsoft TechNet Magazine includes an article titled Real Control with Virtual Machine Manager 2007.  From the article’s introduction:

System Center Virtual Machine Manager 2007 is a new solution that provides a consolidated interface for managing your entire virtual infrastructure. Virtual Machine Manager (VMM) can manage existing Microsoft® Virtual Server 2005 installations, and it can also install Virtual Server 2005 R2 SP1 on new virtual machine (VM) hosts. With VMM, the traditional Virtual Server 2005 administrative tasks can now be performed far more efficiently through a centralized interface, with management access across multiple Virtual Server installations.

In addition, VMM brings new capabilities to Virtual Server, including Physical-to-Virtual (P2V) conversions, Virtual-to-Virtual (V2V) conversion of VMware Virtual Machine Disk Format (VMDK) disks to Virtual Server Virtual Hard Disks (VHDs), and rapid VM deployments from templates and pre-configured VHDs via a centralized library of virtual infrastructure objects.

In the following pages, I’ll explore VMM and the powerful set of features it provides to IT administrators. I will then look at the requirements and steps for creating a VMM installation. Finally, I’ll take a deeper dive into a handful of the more exciting features of VMM and leave you with some helpful tips on getting started.

Microsoft is fairly ambitious with the SCVMM product.  In addition to its current features, future updates will be able to manage VMware and Microsoft’s Hyper-V technology (the new virtualization layer that will be included with Windows Server 2008).  See the article and Microsoft’s site for more details.

My Struggles with Windows Vista

As the author of a book on Windows Vista (see details) as well as a beta tester since the early days of the product, I have had a lot of experience with Windows Vista.  Unfortunately, much of my overall impression of Microsoft’s latest operating system is negative.  While there’s no shortage of bad press about Windows Vista, I have found that complaints tend to be illogical, irrational, and atypical.  Rather than relying on specific examples and facts about issues, writers seem to bash Microsoft and Windows Vista for the sake of doing so.  The purpose of this post is for me to (hopefully) offer some constructive criticism of the OS and to detail my experiences with it.

About the Author

Let’s start this off with a little background: I generally like Microsoft and its products.  Especially when compared with other enterprise software companies, I think Microsoft does many things well.  As an IT pro, I have based a large part of my career on their products and technology.  I periodically reevaluate that focus (generally every few years), but I have always found Microsoft’s development platform, client and server platforms, and other products to be very good.  I certainly have no more animosity towards Microsoft than towards any other corporation, and I genuinely think that the company will address and resolve the issues below in the future.  OK, with that out of the way…

Testing Vista (and my patience)…

I have run Windows Vista on several computers, including my primary work machines.  Originally, I upgraded a Dell Dimension 9100 computer from Windows XP to Windows Vista.  I have since purchased a Dell Inspiron 530 desktop machine as my primary computer and a Dell Inspiron 640m notebook for traveling.  Both of these machines shipped with Windows Vista.  I am running the Windows Vista Service Pack 1 Release Candidate on both desktop computers. 

I have very few startup programs and have tested most of my issues on "clean" installations of the OS.  I have also done extensive troubleshooting to isolate the causes of driver and software compatibility issues.

Some Evidence: Consistently Unreliable

My experiences with Windows Vista’s reliability (or lack thereof) have been extremely poor.  Two tools help highlight this fact.  The Reliability and Performance Monitor provides details related to OS crashes and other major events such as application installations, driver updates, etc.  On both of my Windows Vista desktop computers, the overall index has been extremely low (see the screen shots below).  In fact, the only way I can seem to get the reliability index to increase is to keep the machine powered off (a "solution" I have decided to use for one of the Vista machines).

[Screenshots: Reliability Monitor results from both Windows Vista desktop computers]

More Evidence: Errors

I have experienced literally hundreds of errors on my Windows Vista operating systems.  Granted, some of these can be chalked up to application issues.  But the number and frequency of issues are just unacceptable.  And the lack of relevant or useful responses to these issues just adds insult to injury.  I mean, I get it: I need to download updated versions of drivers and applications.  Unfortunately, that simplistic advice rarely alleviates the pain.

[Screenshots: Windows Vista error dialogs]

The List of Issues

OK, so the stage is set.  Following is a list of the current issues I have with Windows Vista, along with details. 

  • General Performance: Overall, Windows Vista is sluggish.  I’m currently running the OS on an Intel Core2 Duo E6550 chip (which has 4MB of L2 cache), and even routine tasks take far too long.  Examples include moving and copying files (either locally or over the network).  File enumeration, transfer time estimates, and just plain UI sluggishness are unacceptable.  Windows Vista SP1 makes some improvements here, but every time I use a Windows XP machine, I reminisce about how well things used to perform.
  • Power Management / Sleep Mode: I have been unable to use Sleep mode on my desktop Windows Vista computers for over a year.  In some cases, the systems fail to enter Sleep mode.  In other cases, they’ll enter Sleep mode and either randomly wake up or fail to return to working status.  These problems are consistent, and even after hours of troubleshooting, I have decided I have to leave my computers running all day and night for reliability.
  • Troubleshooting tools: It should be taken for granted that any complex technology will have potential glitches.  Software as complex as a Windows OS is certainly no exception.  The key, therefore, is to make it easy to diagnose, identify, and resolve potential problems.  Windows Vista takes a few steps forward in this area by segregating event logs based on specific OS and application areas.  Some tools, like the Reliability and Performance Monitor, can also be somewhat helpful.  Overall, however, troubleshooting in Windows Vista is a poor experience.  It’s really difficult to track down the root cause of system instability.  As there are numerous driver and software incompatibilities with the OS, much more robust and in-depth troubleshooting tools are a must-have "feature".
  • Folder Views: It’s really surprising to me how a feature that is designed to assist users by detecting the types of files (music, video, pictures, etc.) seemingly always guesses incorrectly.  Regardless of Registry hacking, file system changes, and various UI features, I find myself constantly changing the default view for my data folders.  And, the process takes numerous clicks.  I either have to add the relevant columns to the display manually or change the view settings for the folder.  And, there’s a good chance that I’ll have to repeat this process the next time I use it.  This "feature" is broken, and a quick "fix" would be to remove or disable Windows Explorer’s folder view features.
  • Switching users: When working on software development and testing, I occasionally create a second user account.  That account will have its own profile which I can modify programmatically or manually to test some behavior.  The idea is to keep from modifying my "real" settings.  Apart from being extremely slow (compared to Windows XP), the chore of switching active users seems to be really buggy.  I sometimes hear sounds that seem to emanate from the other user’s profile (e.g., receiving an e-mail message in Microsoft Outlook).  And, when I log back on to an existing user profile, the video display fails to initialize.  This occurs with numerous versions of Nvidia graphics drivers.  Again, I don’t have a way to effectively troubleshoot the problem.
  • Startup Times: One of the key selling points of Windows XP was its quick startup time.  Even on relatively old hardware, I can cold boot a machine and be up and running in around a minute or so.  Windows Vista is a different story.  On my desktop computers, I often have to wait over five minutes before the system is usable.  That means that, unless the computer has already booted, I can’t even load a web page or open Microsoft Outlook.  This, clearly, is not progress.  One of the main culprits appears to be the Windows Media Player sharing functionality (I have a large collection of local music and video files that I stream to my Xbox 360).  The rest seems to come down to an inefficient and overly bloated OS.
  • Spontaneous Reboots: On several occasions, I have experienced spontaneous reboots of the entire OS.  It’s almost like a power fluctuation – there’s no warning, no blue screen, and no diagnostic information.  Rebooting seems to provide some stability, but this problem can be downright infuriating.
  • General UI Issues: While I can appreciate the time and effort Microsoft put into usability studies for the Windows Vista UI, much of the new organization makes managing the OS far more difficult and clumsy.  And this is well over a year after having time to "adapt" to the new UI features.  Now, I certainly recognize that I’m not part of Windows Vista’s core audience.  I am quite technical and often need to do things to the OS that the typical user won’t.  Still, the challenge of viewing IP address settings or managing Control Panel items is painful.  Combined with the sluggishness of the OS in general, tasks that were quick and easy in Windows XP are a chore in Windows Vista.  Microsoft could (and hopefully will) do much better in the future. 
  • Windows Vista Ultimate Extras: Apart from including a full set of OS features, Microsoft promised enhancements and upgrades to users of Windows Vista Ultimate Edition.  So far, this area has been extremely lacking.  Well over a year after the OS shipped, users are limited to just a few pieces of downloadable content (see the details at the Windows Vista Ultimate Blog).  And even those aren’t very compelling.  I can survive without new and exciting features, but I think this really highlights Microsoft’s lack of commitment to its user base.
  • User Account Control (UAC): We seem to live in a society where just the mere mention of safety or security gives people carte blanche to do whatever they please.  Computer and IT professionals have a long history of doing annoying things to users in the name of "protection".  Often, these things have marginal value (think of airport security lines), but we do them anyway, since it makes us feel like we’re combating a real problem.  UAC is a great example.  By constantly nagging the user to approve certain actions, it provides questionable benefits.  The real goal, here, is to force software developers to finally follow Microsoft’s security standards.  Though many users will disable UAC, the fact that some might leave it enabled forces vendors to finally follow some best practices.  Still, it puts an unnecessary burden on users and will likely be remembered in the same way as Microsoft Office’s Office Assistant, Clippy.
  • Keyboard shortcuts: Gaining true efficiency with a desktop OS involves the use of keyboard shortcuts.  Windows XP did quite well, as far as consistency goes.  I could easily create folders, move files, switch between applications, and work with applications without much trouble.  Several of Windows Vista’s keyboard shortcuts work inconsistently.  And the slow performance of Windows Explorer often leads to "race conditions" that rename the wrong folder or delete an incorrect item.  Shortcuts should be improved upon in future versions of Windows, and many more should be added.
  • Product Activation / Windows Genuine Advantage (WGA): I often have the need to install, reinstall, and move operating systems.  I have called Microsoft’s WGA hotline on numerous occasions because Internet activation would fail.  My crime?  Often the change was as simple as upgrading to a larger hard drive (yes, just that seemed to trigger it).  In other cases, I need to run Windows Vista in a VM to test functionality or to take screen shots for books and articles.  Activation prevents me from easily performing those tasks, and even with an MSDN subscription, I find myself spending significant time worrying about license activations.  Also, I wish Microsoft would stop trying to claim that reducing piracy somehow improves security or protects users.  This is marketing at its finest – harm the user and tell them it’s for their own good.  Microsoft prevents unlicensed copies of the OS from being updated, which causes more security problems.  And the real goal here is to increase revenue – not to help the user.  Let’s admit that and see these "improvements" for what they are.
  • Network issues: Occasionally, my network adapter will stop receiving connections for various services.  I rely upon my Windows Vista desktop machine to serve up audio and video content to devices throughout the rest of my house.  Periodically, the audio and video sharing features will stop working and I’m sometimes unable to connect to the computer using UNC shares.  The issues are typically accompanied with little to no help about the reason.  Rebooting the Vista machine will "solve" the problem, but that’s quite painful when I have a lot of applications open.  Network functionality should be taken for granted – this is no longer a luxury.  I’ll take reliability over performance any day, but this doesn’t seem to be an option in Windows Vista.
  • Lack of compelling features: This is, perhaps, the biggest issue to me.  Most of Windows Vista’s other shortcomings could perhaps be overlooked or accepted if the OS provided significant usability, performance, and reliability benefits.  It might be worth the pain to run new applications and use productivity-enhancing features.  Sadly, I just don’t see this in Microsoft’s latest OS.  Developers are barely starting to take advantage of features in Windows Vista, leaving little reason to upgrade.  And we have given this quite some time: Windows Vista was finalized over 12 months ago, and industry support is far from perfect.
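As a footnote to the Folder Views item above: a workaround that circulates widely in the Windows community (not an official fix – back up your registry first, and note that these key paths are as commonly reported for Windows Vista) is to delete Explorer’s saved per-folder view settings so that they are rebuilt with defaults:

```bat
:: Reset Windows Explorer's saved folder view settings (run from cmd.exe).
:: Explorer recreates these keys with default views as folders are reopened.
reg delete "HKCU\Software\Classes\Local Settings\Software\Microsoft\Windows\Shell\Bags" /f
reg delete "HKCU\Software\Classes\Local Settings\Software\Microsoft\Windows\Shell\BagMRU" /f
```

This resets all saved views rather than fixing the detection logic, so it is a blunt instrument – but it beats re-clicking through view settings one folder at a time.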

Conclusion

Admittedly, the purpose of this post is to point out the flaws in Windows Vista.  I do feel that there are numerous excellent features in the OS (and even a few that are keeping me from considering the move back to Windows XP).  Areas such as the network stack have been enhanced significantly and I find myself using integrated search features all the time.  Environments that use Windows Vista (or later) with Windows Server 2008 will see significant benefits.  Still, this situation is far from perfect.

So, in conclusion, I think Microsoft has done a fairly poor job with the quality and features of its latest operating system release.  However, there might be a bright side to all this.  Microsoft does its best work when it needs to catch up or recover from problems, and I’m hoping that the next version of Windows will address these issues.  Those users that haven’t lost faith in the platform (yes, I’m one of them) will hopefully be rewarded.  Unfortunately, a final version of "Windows 7" is years away, and it looks like the wait is going to be a long and painful one…

Update: It seems that I’m hardly alone in my issues with Windows Vista.  Microsplot has a post that quotes numerous industry outlets on the topic.  See Anything but Speechless: 100 Things People Are Really Saying About Windows Vista for details.

Virtual Strategy Magazine: Comparing Virtualization Approaches

Virtual Strategy Magazine has published my latest article: Comparing Virtualization Approaches. The article examines the various approaches to virtualization, including presentation-, application-, and server/hardware-level virtualization.  The following diagram provides a brief overview of the approaches and their details.

[Diagram: overview of presentation-, application-, and server/hardware-level virtualization approaches]

The overall idea is that organizations have a wide array of choices in deciding how to isolate and consolidate their workloads.  The challenge is picking the right tool for the job.

Consortio Services TechCasts

For those of you that are interested in keeping up to date with IT goings-on, be sure to check out the Consortio Services TechCasts.  The brief intro blurb from the site reads:

CS TechCast is a podcast series released every Wednesday and is hosted by the Consortio Services experts Eric Johnson, Eric Beehler, and Josh Jones. Each week they discuss the latest IT trends, news, and bring you interviews with key members of the IT Community. Frontline information technology professionals should find the information both helpful and relevant to their careers.

I had the honor of being interviewed for the first episode.  Just click below for details, or access the Podcasts page directly.

CS TechCast

Virtualization Trends: Predictions

SearchServerVirtualization.com has a new post that offers an interesting and thought-provoking topic: What does the future hold for virtualization?  The post, Thoughts on the ‘top five’ trends in virtualization, includes editor Hannah Drake’s take on the subject.  I chimed in with my responses:

It’s always fun to make predictions about the future. I’ll join in with a few of mine:

1) Desktop Virtualization/VDI deployments remain limited: Like “thin-client” computing before it, the idea of virtualizing entire desktop environments will fail to gain traction. Certainly, companies are doing this now. But, I think the potential drawbacks won’t be addressed quickly enough (if ever), and other solutions will help address security and manageability issues. Most importantly, though: What does everyone else think?

2) Other forms of virtualization gain traction: Presentation- and application-level virtualization will become much more common, and IT organizations will find that they have many different ways to address potential management issues.

3) Server Virtualization technology will start to become commoditized: Already, numerous companies provide useful Hypervisors and virtualization layers. It’s a cool technology, but many vendors have figured out how to do it. Moving forward, the real challenge will be in managing VM deployments, implementing backups and DR, HA, and dealing with storage issues. The virtualization layer will be considered the “foundation”, whereas management tools will receive the focus.

4) Virtualization Knowledge: For most IT people, managing basic virtualization functions will become a standard job function (like performing backups). There’s nothing shocking there. As virtual platforms get easier to manage, most organizations will need only a few “experts” (such as those that have the VCP certification) to work on design and troubleshooting. The rest of the IT crowd will adapt on their own. This might not be ideal, but I don’t see the VCP certification being as popular as the MCSE c. 1996 – 2000.

Some of this might be going against conventional “wisdom” (and aggressive marketing), but these wouldn’t be very useful predictions if I stayed with the safe bets. It will certainly be interesting to see how things pan out.

I have certainly been in IT long enough to see many fads come and go.  I have also seen many genuinely good ideas become part of standard IT best practices.  It’s probably safe to say that server virtualization fits in the latter camp.  But, there’s still a lot of hype out there, and it’s good to keep things in perspective.

There’s probably a lot more to predict, so I’d be interested in hearing readers’ opinions: What are some other predictions, and what do you think I got wrong?

Embotics White Paper: Controlling VM Sprawl

I recently wrote a technical best practices White Paper for Embotics, Inc.  It’s titled Controlling VM Sprawl: Best Practices for Maintaining Control of Virtualized Infrastructures, and is available for free download (registration is required).   The content defines and addresses the issue of "VM Sprawl" – the rapid proliferation of virtual machines that many environments are experiencing.  While virtualization technology can provide numerous benefits in just about all areas of an IT organization’s operations, many people have let issues like security, policies, and processes slide.  Here’s an excerpt from the introduction:

Many IT solutions tend to solve important business and technical problems in ways that can create management-related concerns. Virtualization is no exception. While organizations and their IT staff have quickly realized the many benefits of implementing virtualization, the challenge of controlling virtual infrastructures is one that is often overlooked.

Often, the benefits of virtualization start to become overshadowed by issues of security, administration, and configuration management. The primary cause is often referred to as “VM Sprawl” – the proliferation of virtual machines without adequate IT control. Organizations must recognize that virtual machines are different from their physical ones and the systems and controls that are in place to manage their physical environment may not work well in the virtual one.

In this White Paper, I will discuss the sources of VM sprawl, the dangers inherent in it and present best practices to address these issues. Finally, I will discuss the importance of automated virtualization management solutions. The goal of this white paper is to allow organizations to realize the many benefits of virtualization technology while still maintaining control of their environment.

Download the White Paper and feel free to leave me some feedback!

Information Week Article: Addressing the Challenge of VM Sprawl

I was recently interviewed by Charles Babcock from Information Week for his article, Virtual Machine Sprawl Will Challenge IT Management Skills.  The interview was based on the content of a White Paper I wrote for Embotics, Inc., a provider of virtualization management solutions.  The paper is titled Controlling VM Sprawl: Best Practices for Maintaining Control of Virtualized Infrastructures, and is available for free download (registration required).  I’ll post more about the White Paper in another blog entry.  From the Information Week article:

Many IT managers don’t know how many virtual machines they’re running and whether they’re secure, says virtualization expert Anil Desai.

Software developers like to use virtual machines because they can cheaply mimic a target environment.

Testers like virtual machines because they can test more combinations of new software with parts of the infrastructure in virtual machines.

Department heads like virtual appliances — applications teamed up with an operating system in virtual machine-ready file format — because they can be downloaded off the Internet, tried out, and pressed into service immediately, without the usual delays.

And each of these examples illustrates how virtualizing the enterprise leads to uncontrolled, virtual machine sprawl, with IT managers not knowing how many virtual machines they’re running, where they’re running, whether they’re offline and stored away, or whether they are secure.

The article raises awareness of the problem of "VM sprawl" – the rapid proliferation of virtual machines, often with little or no IT oversight.  The article and the White Paper provide some best practices for gaining (or regaining) control of virtual machines through policies and processes.  Feel free to leave comments about your own VM management horror stories (and, better yet, solutions)!

Microsoft Assessment and Planning (MAP) Solution Accelerator (Beta)

The goal of the Microsoft Solution Accelerator team is to ease the design and deployment of infrastructures based on Microsoft products.  Earlier this year, I authored guides in their Infrastructure Planning and Design Series (see Microsoft Infrastructure Planning and Design (IPD) Guides Available for details).

In keeping with the same goal, a new beta version of the Microsoft Assessment and Planning Solution Accelerator is available for download from Microsoft Connect.  The description from the download site:

The Microsoft Assessment and Planning (MAP) Solution Accelerator is an integrated platform with tools and guidance that make it easier for you to assess your current IT infrastructure and determine the right Microsoft technologies for your IT needs. It offers easy inventory, powerful assessment and actionable recommendations for Windows Server 2008, Windows Server Hyper-V, Virtual Server 2005 R2, Terminal Services, SoftGrid, System Center Virtual Machine Manager, Windows Vista, and 2007 Microsoft Office. The popular Windows Vista Hardware Assessment readiness tool will be integrated into this platform.

Target Audience

  • Customers: IT Architects, Infrastructure Specialists and Desktop/Application Administrators.
  • Partners: System Integrators, Value-Added Partners and IT Consultants in the Enterprise and Midmarket

Key Benefits

  • Quick Assessment of Your Existing infrastructure and assets
  • Adaptive Guidance and Actionable Proposals that provide specific recommendations that will help simplify your planning and deployment of Microsoft technologies
  • One-Stop Shop for All Your Planning (or Pre-Sales) Needs

The good news is that this is a completely agent-less method of automatically analyzing your entire environment.  The product generates detailed reports that would be tedious and error-prone to create manually.

Overall, the idea is to help organizations determine how best to deploy Microsoft’s virtualization technologies.  If you’re currently considering an expanded virtualization deployment, this tool can help you make better decisions about your infrastructure needs.  Give it a shot, and send feedback to the development team to improve the final version!

Remote Administration in Windows Server 2008

Perhaps one of the most-used features in the Windows Server platform is the Remote Desktop Protocol (RDP).  Just about every administrator relies on it to perform configuration changes, add software, and make other related changes to the system.  By default, new remote connections created with the Remote Desktop Connection (RDC) application are created as additional user logons.  This usually meets administrators’ needs, as up to two remote connections are allowed to be active at a time. 

However, a somewhat common requirement is to log on to the "console" session – the same session you would see when logging on at the physical console (session 0, on versions of Windows prior to Windows Vista).  This type of connection is useful for troubleshooting application installation issues and for particularly picky applications that don’t behave as expected in a standard remote session.  Effectively, this is the same thing as logging on to the computer at the physical console.  The bottom line is that it’s like "being there".  In "current" versions of the RDC application, you can use the /console switch to connect to the console session using RDP.

This behavior is actually deprecated (i.e., retired from future use) in Windows Vista SP1 and Windows Server 2008.  The Terminal Services Team Blog describes the changes (and the reasons for them) in an in-depth posting titled Changes to Remote Administration in Windows Server 2008.  Without getting too deep into the details, the following portion of the article describes the new /admin switch:

Behavior of the /admin switch

You can start the RDC 6.1 client (mstsc.exe) with the /admin switch to remotely administer a Windows Server 2008-based server (with or without Terminal Server installed). However, if you are connecting to remotely administer a Windows Server 2008-based server that does not have the Terminal Server role service installed, you do not have to specify the /admin switch. (In this case, the same connection behavior occurs with or without the /admin switch.) At any point in time, there can be two active remote administration sessions. To start a remote administration session, you must be a member of the Administrators group on the server to which you are connecting.
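To make the change concrete, here’s a quick command-line sketch (the server name is a placeholder; the switches are as described in the post above):

```bat
:: RDC 6.0 and earlier: connect to the console session of a remote server
mstsc /console /v:server01

:: RDC 6.1 (Windows Vista SP1 / Windows Server 2008): /console is deprecated;
:: use /admin to start a remote administration session instead
mstsc /admin /v:server01
```

On Windows Server 2008 without the Terminal Server role, the /admin switch is optional, since the same connection behavior occurs either way.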

Overall, for most routine administration, this won’t make a huge difference.  But, eliminating (or at least reducing) the need to connect to the console session is a big step forward.

The Hyper-V Beta is Now Available

  • Update (03/27/2019): Though I generally don’t update old posts, I recently found a useful guide for those who might be interested in more up-to-date information on Windows Server and Hyper-V.  Please see Stephen Cooper’s “Ultimate Guide to Windows Server” as an alternate resource.

The Windows Server Division Blog announces the availability of a beta version of Microsoft’s Hyper-V technology.  Hyper-V was formerly known as “Viridian” and later “Windows Server Virtualization (WSv)”.  Previous releases were known as Consumer Technology Preview (CTP) versions; generally, a beta release indicates a higher level of quality and feature-completeness than a CTP.  To get the preview version, you’ll need to download the appropriate version of Windows Server 2008 RC1 from the Windows Server Evaluation site.  Note that you can only install the product on Windows Server 2008 Enterprise running on an x64 hardware platform.  Here are some useful links:

Overall, you can expect to see a lot more information on the product (some from me) in the near future.

Understanding Microsoft’s Hyper-V Architecture

This article was first published on SearchServerVirtualization.com

I recently wrote an introductory article focusing on the architecture of Microsoft’s Hyper-V technology.  From the article’s introduction:

Many IT people like to live on the cutting edge, even if it means we might need to purchase some bandages for the datacenter. Advances in virtualization are commonplace, and it’s generally worthwhile to find out what’s coming out in the not-too-distant future. Microsoft’s upcoming virtualization product – now called Hyper-V – features a completely new virtualization architecture. Hyper-V (formerly known as “Viridian” and Windows Server Virtualization (WSv)) will be made available as a component of the Windows Server 2008 platform.

You can access the entire article on SearchServerVirtualization.com.

Note: Some of this content has been superseded with the release of Microsoft’s Hyper-V Beta (see more recent postings in the Hyper-V category for details).

Commodore 64: Love Always

OK, so perhaps "love" is too strong a word.  A friend just sent me a link to an article that really jogged my memory.  I got my start in computers with the Commodore 64, and I have never really forgotten it.  This thing had a built-in BASIC interpreter.  For those who don’t know, many of the "old school" people used a cassette drive to store programs.  It would take quite a while to load even 100KB.  The CPU ran at a smoking 1.0MHz, which always seemed to be plenty.

You would generally connect this thing to a television set and then proceed to geek out.  In the later days, floppy disk drives became commonplace.  Specialized monitors were also made, so you didn’t have to sit in front of a 25" Magnavox tube TV.  The C64 "scene" was also hopping, with the most popular bulletin board systems (BBS’s) boasting a whopping 40 megabytes of storage space (yes, that’s megabytes).

But, the graphics and sound capabilities of this machine were amazing for the time.  That’s especially true if you compare it to the IBM CGA machines that could only bleep like sheep and display four rather nasty colors (black, white, cyan, and magenta).  And, the IBM boxes cost thousands of dollars, whereas you could get a C64 for quite a bit less.  Playing games and typing in code listings from magazines was a great pastime.

The article from CNN is entitled Commodore 64 still loved after all these years.  It certainly was a popular machine:

Often overshadowed by the Apple II and Atari 800, the Commodore 64 rose to great heights in the 1980s. From 1982-1993, 17 million C64s were sold. The Guinness Book of World Records lists the Commodore 64 as the best-selling single computer model.

And I definitely can relate to the quotes at the end of the article:

"Computer nostalgia is something that runs pretty deep these days. The memories that people have of this machine are incredible," McCracken said.

Twenty-five years ago computers were an individual experience; today they are just a commodity, he said.

"I don’t think there are many computers today that we use that people will be talking about fondly 25 years from now.

If you’re interested in emulators and more nostalgia-inducing material, see the C64.com web site.  Wikipedia also has some interesting information on the Commodore 64 (also be sure to check out the External Links section).  Just looking at all that now-ancient plastic and a screenshot of the startup screen really takes me back.  I’m wondering: Am I the only one that remembers what the following commands do (and, yes, this is from my personal memory)?

POKE 53280,16

POKE 53281,4

SYS 64738

In short (OK, I admit it’s too late for that), I think this is the best computer ever created.  I’ll try to post some more about the C64 in the future.

Consortio Services TechCast

I was recently interviewed by Consortio Services as the first interviewee in their new podcast series.  The team of Eric Johnson, Eric Beehler, and Josh Jones asked me numerous questions related to virtualization.  Here’s a quick introduction to the topics:

This week our guest is Anil Desai, who we talk with about virtualization best practices. In the news, detecting wireless intruders, HP buys up more companies, Quicktime exploit, Exchange Server 2007 SP1, and how to keep your IT staff happy. Plus, “The Worst Tech Move of the Week”, “IT Pet Peeve”, and “The Tech Tip of the Week”.

The topics ranged from what’s important for IT people to learn to a comparison of the available technology approaches.  You can download or listen to the free podcast at CS TechCast 1: Going Virtual.