Virtual Desktop Savings Are Not in the Hardware

Here’s a great article that I was recently referred to:

Citrix: Virtual desktops about to become cheaper than physical ones

In it, Citrix is helping to develop yet another low-cost zero-client device.  These are literally just small boxes about the size of a deck of cards, with ports for a monitor, Ethernet and USB.  No moving parts, very slim, reliable and inexpensive.  However, I still maintain that the savings of Virtual Desktop Infrastructure (VDI) are not in the hardware.

I think the real savings are in staffing and time.  Unfortunately these are much harder to measure but well worth the move in my opinion.

You can already use the hardware you have for VDI, so new zero-clients only save you money on new installations or when you are replacing a desktop that is completely dead.

You still need a keyboard, mouse and monitor to hook up to zero-clients, plus Windows/Office and CAL licenses.  If you subtract the cost of the licenses and the keyboard/mouse/monitor, these days you can purchase a reliable desktop for probably about $500, and even less if you plan to purchase refurbished units and replace them more often.

So, initially you probably do save $300-$400.  However, you must move these virtual desktops to a server you never needed before in the data center.  More than likely you are moving them to a high-capacity set of servers, much like the Cisco UCS system we are rolling out at my district.  This is very expensive.

It doesn’t stop there.  My network manager and I attended a VDI training and learned that you have to make sure your network can withstand the high traffic all the way from the data center to the user’s “desktop”.  It was fine if a user had to wait a few seconds to open a Word doc, but waiting a few seconds for a mouse click to register is unacceptable.  Suddenly you may have to upgrade much of your network infrastructure.  This is much the same issue you encounter when you first migrate to IP phones.

Oh, and what about disaster recovery?  With Virtual Desktop you have effectively moved the user’s “desktop” to a data center.  Right now at our district, if a desktop dies we advise the user to log onto another one nearby and schedule a tech to replace the dead machine within 24 hours.  With Virtual Desktop, if the connection to the data center goes down there is no nearby desktop to go to.  So you have to have a disaster recovery data center (in our case, at our District Office).

This starts sounding really expensive really quickly.  Fortunately VMware (et al.) have great tools that let you get the most out of your equipment.  A classic disaster recovery site sits unused most of the year and only kicks in when you have an emergency; from a business cost perspective it is a loss until those few minutes it is in use.  VMware lets you cluster your data centers together, effectively using them in tandem for full production load balancing.  If one were to go down, the other would simply take the entire load.  You still have to purchase double the equipment, but it is used much more efficiently.

I would also imagine that the economics improve as you scale.  I would think that implementing VDI on 90% of the computers at a site is much more economical than, say, 50%.

Like I said above, I think the real savings are in staff and time.  If you can reduce the amount of equipment you have to service while increasing its reliability, you don’t have to hire additional techs as you grow your desktop base, and your current techs can take on higher-level issues.  The same goes for your network services staff at the data center.  In addition, your users as a whole experience far less downtime and a far more reliable working environment.

How do you measure those benefits?  Theoretically you can measure technology growth vs. payroll, but you really can’t measure downtime for the users unless you have a very sophisticated ticket system that lets you somehow quantify and compare user downtime.

IIS7 Won’t Respond Over SSL if No Certificate is Selected

Ugh.  While the title of this post doesn’t sound like much of an issue, chalk this one up to an experience I hope someone else doesn’t have to deal with.

We had an issue with our A/C unit in our data center last night and several of our servers were shut down due to excessive temperatures.  We are slowly bringing them up as we manage the cooling, and for the most part they are coming back up normally.  Some of the older servers reported errors with batteries or failed drives, but these are fairly routine.

However, one of our brand new servers started up with seemingly no issues, but IIS 7 was not responding to web requests.

This server serves up only two web applications over SSL and has worked fine for the last month that it has been in service.

The IIS logs, Event Viewer and every other diagnostic tool we could think of reported no errors at all…and no connection attempts either.  Connecting from the local host produced no error messages.  Connecting through Fiddler2 only showed the cryptic message, “the server has actively closed the connection”.

Finally, after restarting the entire server, the IIS service, the web site and the app pools, we were grasping at straws.  Bindings were correct, permissions were correct, and a “netstat -an” revealed the server was indeed listening on port 443.

In the end, here is what solved it: in the site’s binding settings, the Certificate dropdown showed None.  I selected the self-signed server certificate and the whole thing suddenly came alive.  I attempted to set the SSL certificate back to None, which wasn’t an option anymore.
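For anyone else chasing a similar ghost, one check that might have surfaced this sooner is asking HTTP.sys directly which certificate (if any) is bound to port 443, outside of IIS Manager.  A quick sketch from an elevated command prompt (I can’t promise this would have flagged our exact case, but it does show the SSL binding state):

    rem List the SSL certificate bindings HTTP.sys knows about
    netsh http show sslcert

If nothing is listed for 0.0.0.0:443 (or your site’s specific IP and port), then IIS has no certificate to hand to incoming SSL connections.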

Of course that makes sense.  How can you serve up SSL traffic when there is no certificate to authenticate the connection with?  However, why a server restart caused the certificate to become deselected is beyond me.

And why did IIS never throw an event or log an error that said, “Hey, you’re trying to serve SSL but no certificate is selected!”?

Anyway, hopefully this will show up in a Google search for someone else.  Cheers. :)

Microsoft Surface 2 with PixelSense

This is just amazing stuff.  When Microsoft Surface first came out you knew it was opening the door to the future.  All those flashy futuristic movies, where interactive video screens are everywhere (tables, walls, etc.), are now becoming reality.

With Surface 2 they have taken a huge leap forward.  They are now using LCD screens with what they call PixelSense technology.  Alongside the red, green and blue pixels there is now a fourth pixel that senses in the infrared range, so the LCD screen itself can detect the objects on the table.

This replaces the older hardware, with its complex projectors, cameras and sensors, with a sleek tabletop design.

Just to give you an idea here is what the original Surface typically looked like:

The large enclosed portion below the glass is actually hiding the set of projectors and cameras along with the computer hardware.

Now, with most of the complex sensing technology actually built into the LCD they can now produce tables that look like this:

I’ve seen Microsoft Surface products built into the wall at fancy hotels or casinos in Las Vegas.  Now we’ll start seeing a lot more of these pop up everywhere.

It’s an amazing time to be alive! :)

The Real Debate About Gaming in the Cloud and the Future of Technology

Tom’s Hardware has a lot of great tech news every day.  This article caught my eye: a gaming cloud vendor talking about where gaming is going.  Here’s a hint: he thinks it’s going to the cloud. :)

I’m really surprised at a lot of the comments about the article.  In fact, I was hard-pressed to find a single optimistic one.

A good percentage of the arguments seem to be that "games" should be physical media that you can buy in a store and touch. That’s the same argument music providers and purchasers were making 10 years ago. Now if you want to actually visit a store to browse and purchase a CD, it’s more because you’re a purist or you want to take a walk in "simpler times". If I hear a song I like and actually want to buy it, I don’t think twice about going to my favorite online provider and downloading it to my phone. In fact I love the convenience, the speed and knowing that the purchase is mine no matter what happens to my phone. There is no physical medium to break or lose.

Other arguments assume that bandwidth is too slow/inaccessible/expensive/capped/etc. and set in stone. Bandwidth will always get faster and more accessible. I know that right now there are arguments about cost, ISP caps and geographical limitations, but these have always been the same issues in one form or another. About 15 years ago (I’m 34) I was downloading at 2400 bps. 10 years ago ISPs were having wars over which 56K technology would be used. DSL and cable then started fighting it out, and the government was lobbied over whether telco companies should be allowed in the entertainment medium (because the Internet was seen as serving video and other "TV killers") and whether cable companies should be allowed in the telecommunications medium (because the Internet was seen as communication and a "telco killer"). Meanwhile computers have been getting faster and mobile devices smaller and more capable. In the end, 10 years from now communication methods will be much faster and potentially very different, but I’m sure there will be similar "debates" going on.

Another set of arguments holds that real games only fit on consoles or high-end desktops. I really don’t know why the gaming genre constantly has to be fit into a small box. Gaming already spans a variety of platforms in a variety of forms, from simple little text games on old cell phones to Crysis II on a $4,000 gaming rig. The gaming platform as a whole is already incredibly broad and it won’t be getting any smaller.

The real argument in this article is where high-end games are going.  Again, I don’t think it matters what one guy thinks about where gaming is going (he obviously wants to promote his company, and that’s what marketing is; don’t be surprised or offended).

In my personal opinion (because hopefully much of the above was objective 🙂 ) I have no problem with another company attempting to push gaming into the cloud with an alternative publishing platform. If I don’t like it I don’t have to use it. No big deal.

However, the potential is actually quite amazing and simply mirrors what other industries have done (i.e. music purchases/distribution and now movie subscription/distribution). I’ll just use the Microsoft XNA platform as an example. Potentially (not yet, but potentially) the XNA platform can run on the Xbox, the PC, Silverlight and a Windows Phone 7. If Microsoft can take this to its ultimate end, then why can’t I subscribe to or purchase a game online and play it on my Xbox when I’m at home on my 50" LCD, play it on my laptop or enthusiast PC when I want, play it online within my hardware-accelerated Silverlight browser, and finally pop in for a few minutes on my dual-core (or whatever the future holds) WP7 phone? I purchased or subscribed to the game once and have four platforms. I think the argument in the future would be, "What do you mean you’re going to sell me a game on a single DVD and not let me play it on any device I want?"

Right now if I buy a song I expect to be able to play it on any of my devices in any location, and it would infringe on my rights as a customer to be told I can’t play it on my phone, on my stereo at home, in my car or on my computer at work. We only allow gaming companies to do this because current technology doesn’t allow me to move Crysis to my phone as easily as a song. Technology will one day make that possible, and I fully expect to one day buy or subscribe to a game once and play it on any device I choose, because it’s my game/subscription and my devices.

Heck, I expect that one day bandwidth will be fast enough, and $500 PCs will be fast enough, that I can travel to another country, take photos, go into an Internet cafe (or just use my phone), upload photos to my online account, retouch them and edit video, make them available to family and friends, and even play WoW (or whatever) for a little bit, all without having to carry around a laptop.

That’s where I think we’re going.

Windows Phone 7, on its way!

We had a great time at the Inland Empire launch of the new Windows Phone 7.  I got to present on developing for the phone along with Dustin Davis and Oscar Azmitia.  James Johnson and the Inland Empire .NET Users Group hosted the event at DeVry University.  Fun was had by all.

I recorded the sessions so I will have them up soon for all to see how we did.

Take care and happy developing!

We’re all still kids, just with bigger toys

I just saw this article:

http://www.tomshardware.com/news/SFF-RAID-HDD,10319.html

It’s an amazing feat by Will Urbina.  He has an impressive tool shop and the knowledge to use it.  He custom-built his own Small Form Factor case that holds eight 2TB drives, for a total of 14TB in RAID.  Pretty amazing.

I would count this as a great version 1.0 product.  The two changes I would make for v2.0 would be:

1) Accessibility: I would mount each drive in a removable tray. Changing the drives out is going to be a bear; removable trays would not only make this a snap but also allow for hot-swapping, a real help when dealing with RAID failures. If you can get trays with built-in heat sinks for the drives, that leads nicely into my next recommendation.

2) Heat: The heat issue should be easily solved by putting heat sinks on the drives (or using the heat-sink trays mentioned above). Then simply seal the box, put vents on the front left side and channel the air from the front left, across the drives to the right and out through the back. The drives will need more spacing to accommodate the trays and airflow, but maybe you could switch to 2.5" drives. This will definitely take some engineering, especially to work around removable drive trays, but proper sealing and a good fan in the back would give good airflow. You may need to increase fan speed or add additional fans to the front intake.


Awesome idea.  I have a few tools in my tool shop, some of which were gifts, others bought new for projects, and others picked up on Craigslist or at garage sales.  With only a two-car garage and three young kids I have no workshop. :)  But I have a dream of slowly adding to my tools each year and someday building a sizable work shed in the back.  I would love to be one of those dads who has tools and builds things with his kids.  I always envied my friends who had this, with their dad’s shop at their disposal for inspired ideas or school projects.

Ideas like this keep me hoping that this will be a reality someday.

Great job Will!

Microsoft Intune – The Beginning of Small Business IT Management in the Cloud

Microsoft just released information regarding their new cloud management service for small organizations, Microsoft Intune.  You can read about it on their blog post here.

It’s geared towards smaller companies, those with between 25 and 2,500 PCs, that may not be able to afford a standard IT infrastructure and server deployment.  Honestly, with some of my clients using SBS 2003 with a decent IT consultant (me :)), companies with as few as 15 machines can easily make use of the standard Microsoft infrastructure.  And if you’re beyond 100 PCs, I don’t know how you would ever manage effectively without Windows Server, Active Directory and management tools such as WSUS and a managed virus/malware setup.  But that’s beside the point.

What is Microsoft Intune and what does it do for you?  Here are the basics:

  • Manage PCs through web-based console: Windows Intune provides a web-based console for IT to administer their PCs. Administrators can manage PCs from anywhere.
  • Manage updates: Administrators can centrally manage the deployment of Microsoft updates and service packs to all PCs.
  • Protection from malware: Windows Intune helps protect PCs from the latest threats with malware protection built on the Microsoft Malware Protection Engine that you can manage through the Web-based console.
  • Proactively monitor PCs: Receive alerts on updates and threats so that you can proactively identify and resolve problems with your PCs before they impact end users and your business.
  • Provide remote assistance: Resolve PC issues, regardless of where you or your users are located, with remote assistance.
  • Track hardware and software inventory: Track hardware and software assets used in your business to efficiently manage your assets, licenses, and compliance.
  • Set security policies: Centrally manage update, firewall, and malware protection policies, even on remote machines outside the corporate network.
  • Upgrade licensing: Windows Intune includes licensing to upgrade all of your PCs to Windows 7 Enterprise, along with all applicable upgrades to the latest Windows (as well as downgrade rights) while you are under the subscription.

Intune is only in beta at the moment.  You can sign up here until May 16th.  It isn’t scheduled for production release until next year.  At that point it will be a subscription-based service, most likely on a per-PC basis.

A few things of note:

  • The tracking of hardware and software would be nice.  I don’t know if this only tracks PCs or if it also covers hardware like printers and network appliances, and I’m not sure if it tracks non-Microsoft software.  We’ll have to wait and see how thorough their system is.
  • The setting of security policies seems to be limited to templates that affect settings like Windows Firewall, updates, etc.  It doesn’t appear to be a full-fledged Active Directory Group Policy infrastructure.
  • Allowing the upgrade of all of your PCs to Windows 7 Enterprise is a pretty great deal.

Not a replacement for Small Business Server

I don’t see this as a replacement for SBS.  Honestly, I don’t really see anything here that can’t already be accomplished by a decent network setup from an IT consultant, and without a monthly fee.  You still need someone knowledgeable (or your IT consultant) to handle the setup and monitoring of Intune, so you aren’t getting rid of your IT guy, just adding a management layer on top of your current network.

What does SBS do that Intune doesn’t?  Pretty much everything else.  It gives you a full-fledged AD infrastructure, user/group/hardware authentication and authorization, shared resources such as folders and printers, Exchange, SQL Server, IAS, etc.

Microsoft already offers Exchange as a subscription-based service, though I don’t know if that technically runs in the Microsoft Azure cloud yet.  Azure is also starting to handle the SQL space.

I think Intune will really be able to fill the small business space when I can have an SBS server locally to handle shared resources and local caching of my AD/DNS, and then offload everything else to the cloud: licensing management of all my MS products (Windows, Office, etc.), AD management, GPO management, intranet and so on.  Then this might really be a full-on solution that I could see businesses shelling out $50 per computer annually for.

So, am I signing up for the beta?  Yeah, why not.  I’d really like to see how this works out and where it’s headed.  One of my clients is due to renew the annual license with their antivirus vendor, and we haven’t been too happy with the product lately.  So this will give us a chance to try out the Microsoft offering at little (if any) cost and see if it really lets me manage the network better.  Having remote access through Silverlight will be nice; that way I don’t have to remote into the server and then remote again from there.  Until I see actual licensing estimates, though, I will be hesitant to upgrade the PCs to Windows 7.

What’s the deal with widescreen monitors?

I understand that everything is going widescreen these days for media.  That’s great when I want to watch HD movies or the Super Bowl in HD.  However, I still have no idea why this is the current trend for computer monitors.

Maybe it would make sense if I were a college kid living in a dorm room whose laptop/desktop was his main display for watching TV and movies, but it’s not.  It’s my main display for actually doing office work and development.  I have two 20" Dell monitors (non-widescreen) with a maximum resolution of 1200×1600.  I love working on them.  I can display full-page Word documents with ease, and developing is wonderful.  Having ample pixels to work with in height is just as important to me as a high-performance machine.

Our department just ordered a new set of laptops for our users.  I’m not in IT, so we have them give us their recommendations and order from those.  It turns out they don’t have the time to nitpick every order, so Dell has set up a new service where their current business-model machines are configured and ready to go and our IT department just clicks the order button.

What kind of monitor did we get with our new business-class laptop and docking station?  A widescreen 19" Dell E1910H.  It has a maximum resolution of 1280×720.  720 pixels in height?  Ouch.  That’s less than a standard 15" monitor with a resolution of 1024×768.  Today’s standard (in my opinion) for office machines would be 1280×1024 or more.

Here is a picture of the monitor with a Word document at 100%.  You can barely see half the page:

[Image: the E1910H showing a Word document at 100% zoom]

Here is a picture of the monitor with a Word document at full screen.  It’s hard to tell in the picture, but you can’t read the text.  This might be nice for an Excel user, but for documents it’s terrible.

[Image: the E1910H showing a Word document scaled to fit the full screen]

Oh, and notice the stand too.  Do you like the cardboard box underneath?  The stand only allows a slight tilt up or down; it has no height adjustment or rotation.  In my opinion the Dell P190S is a much better office-class monitor.  It has a maximum resolution of 1280×1024 and a stand that tilts, pivots and adjusts in height, along with the usual extras like multiple inputs and USB ports.

In the end I can say it was my fault for not verifying the details on every single line item (there are around 40 on a Dell order), but isn’t that the point of having Dell offer their recommended office equipment, so that my IT department and I don’t have to spend the time doing this?

My Planned VHD Organization

In my last post I talked about wanting to move my entire computing environment over to VHDs.  Not just a development environment or test environments, but everything.  This would include a general work VHD for my wife and me, a video production VHD for all my video work, a production development VHD, various server VHDs (mostly for use during development with Virtual PC) and whatever test VHDs I want.  I would no longer boot into a standard operating system installation, as we have been doing since the beginning of personal computers.

Some of you might be thinking, “Why in the world do this?”  Well, the last time I had to rebuild a computer it took approximately two days.  This included installing the OS, MS Office, developer tools, all my utilities and plug-ins, and all the updates.  This is wasted time.  Plus, there are often times I’ve loaded a tool I wanted to check out only to find it has hosed something on my system.  Maybe it’s nothing critical, but it’s enough to force me to spend a few hours trying to weed out which files or settings got changed.  If I could test out tools in an exact and isolated environment just by copying my current VHD, how incredible would that be?  Plus, assuming I back up my VHDs regularly, let’s say I got a really nasty virus or something that wiped out my system.  No big deal: I just delete that VHD and restore it from the backup.  Since we’re talking about an entire VHD, that’s just a file copy, not a restore that takes hours.  Talk about system restore. :)

So, as usual, before I started researching the details of how things actually work, my wheels started spinning faster than a hamster running from the cat.

I heard about differencing disks and started looking into them.  Differencing disks let you create a “parent” VHD and one or more “child” VHDs.  For instance, a common example is testing how your website looks in different versions of IE, which cannot be installed simultaneously.  You create a parent VHD with the operating system and whatever other software you want in the base.  You then create several child differencing VHDs that reference the parent, each with a different version of IE: one with IE 6, one with IE 7 and one with IE 8.  The child VHDs are called differencing disks because they only contain the information that is “different” from the parent.  You can also use a child differencing disk as the parent of another differencing disk, thus chaining them.
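If you want to experiment with differencing disks yourself, diskpart can create both the parent and the children.  Here’s a minimal sketch; the file names and the 40GB maximum are just hypothetical examples:

    rem Run these inside diskpart from an elevated prompt
    rem Create the expandable parent VHD (maximum size is in MB)
    create vdisk file="C:\VHDs\Win7-Base.vhd" maximum=40960 type=expandable
    rem Create a child that stores only its differences from the parent
    create vdisk file="C:\VHDs\IE6-Child.vhd" parent="C:\VHDs\Win7-Base.vhd"

Note that the child takes no size of its own; it inherits the parent’s geometry and only grows as it diverges.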

This really got my heart going.  I instantly thought of a grand hierarchy design like the following:

[Diagram: my original planned VHD layout, a base Win7 parent with chained differencing disks for each environment]

The base Win7 VHD is the only true parent.  All the others are child differencing disks.  I could apply whatever updates Microsoft threw out to the base Windows 7 VHD and all the other VHDs would inherit them.  All bottom-level child VHDs (video, production, test, etc.) would have Office, Windows Live, etc.  It seemed like a perfect environment.

Here’s the catch: a parent VHD cannot be changed.  If you change it, the child differencing disk is corrupted (because the recorded differences are no longer valid) and you lose the child VHD and all the data in it.  Microsoft even recommends that you set parent VHDs to read-only to protect against inadvertent changes.  Wow, that’s a real bummer.  I believe some of the enterprise-level virtualization vendors let you do scenarios like this across servers, making it easy to deploy updates and such, but that doesn’t help me.  I want to run native VHDs on my Win7 machine.  Oh well.  Scratch that.

I think technically this could be set up just for the sheer exercise, though I won’t bother to take the time to try it.

So, now that reality has stepped in, how do I plan to set this up?  I’ll have a separate standard VHD per environment, with no differencing disks.  A great tip from Stephen Rose, in his Virtualization 101 for Developers presentation, was to make copies of your base VHDs, mark them read-only and put them in a backup folder.  That way, if you ever want to create a new one, you don’t have to start from scratch.

So, I plan to create my base Win7 image and put a read-only copy of it into a VHD backup folder.  I’ll then install Office, Windows Live, etc. onto the image and put a read-only copy of that into the backup folder as well.  At this state, this is my general-use VHD.  I’ll make a copy of that, rename it as my video production VHD and install my video production software on it.  I’ll make a read-only backup of that too, in case it gets hosed or I want to try out some fancy new video software in the future without harming my current VHD.  I’ll make another copy of my general-use VHD, rename it to my production development VHD and install all my development software.  You get the idea.
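The backup step itself is nothing fancy, just a file copy plus the read-only attribute.  Something like this from a command prompt (the paths are made up for illustration):

    rem Stash a pristine copy of the general-use image and mark it read-only
    copy C:\VHDs\Win7-General.vhd C:\VHDs\Backup\Win7-General.vhd
    attrib +R C:\VHDs\Backup\Win7-General.vhd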

When you boot off a VHD you still have access to the physical C drive (it just shows up under a different drive letter).  So, as far as document storage goes, there are details I’ll have to work out.  I don’t know how security is handled, but I doubt I’ll be able to access documents in my original My Documents folder.  I’ll have to see what the best practice is out there.  I’ll probably keep all of my documents in a generic data storage folder on the physical drive.  I only have one drive in my laptop, so I can’t throw everything onto a D drive like on my desktop.

Just a few tips to leave you with:

  • Backups are important!  VHDs can get quite large, and so are my video files.  While most other documents are small, and I can use some sort of cloud storage for my documents or a versioning system for my code, that won’t save me the time and hassle of losing my VHDs.  I have yet to see an online backup provider like Mozy or Carbonite with the upload bandwidth I would require for something like this.  I plan on having a couple of eSATA drives that I can back up to on a regular basis.
  • You can’t boot from a VHD on an external drive.  That’s just the way Microsoft’s VHD miniport driver works.  They probably did this to make sure you can’t set your laptop to boot from something that might not be there.  Plus, if your external drive got disconnected for any reason, such as a loss of power or a USB cable falling out, it might instantly corrupt your VHD.
  • If there is not enough drive space to expand your entire VHD, you can’t boot from it.  For instance, say you have 250GB available on your hard drive and you create VHDs that can expand to 150GB.  If over time you reach a point where a VHD doesn’t have at least 150GB available to it (including its current physical size), you can’t boot into it.  Until you free up some space, that VHD is effectively unbootable.  This is probably a good reason to keep your existing OS on the system rather than running a pure VHD setup.

I’ll keep posting updates on how it goes.

Converting my entire production environment to VHDs

Ever since virtual PC systems came out, the dream has been to run any type of operating system you want, multiple operating systems at the same time, all working independently of each other.

However, only in the last year or two has this really become a reality, with the latest hardware supporting virtual environments natively.  Now Windows 7 lets you boot off a VHD as if it were a standard hard drive, making full use of your actual hardware devices.
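For the curious, the hookup for booting from a VHD lives in the boot configuration data.  Roughly, from an elevated prompt; the entry name and path here are hypothetical, and {guid} stands for the identifier of the new entry that the first command prints:

    rem Clone the current boot entry and point the copy at the VHD
    bcdedit /copy {current} /d "Windows 7 - Dev VHD"
    bcdedit /set {guid} device vhd=[C:]\VHDs\Dev.vhd
    bcdedit /set {guid} osdevice vhd=[C:]\VHDs\Dev.vhd
    rem Let Windows redetect the hardware abstraction layer at first boot
    bcdedit /set {guid} detecthal on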

After attending Stephen Rose’s presentation on Virtualization 101 for Developers, my eyes were suddenly opened to the possibilities of what could be done.  I could move my entire development environment to a virtual PC image.  This would allow me to copy that image at any time, letting me test beta products, new upgrades, etc. without worrying about damaging my current environment.  Many times I have upgraded to new releases (or betas) that caused previous programs to stop running, DLLs required by legacy projects to be removed, and so on.  In some rare cases (SQL Server 2008 RC2) I had to reload my entire environment because I could never get my machine back to its original state.

Now that Windows 7 allows you to boot from a VHD, my wheels are suddenly turning.  I could move my entire computer use (not just development) to VHDs.  This would allow me to have a base image for general tasks like email, MS Office, etc. that my wife and I could use day to day.  I could have another environment for all my video editing work, which is especially useful since video compression codecs can sometimes cause havoc.  I could have a production development environment and as many beta environments as I want.  If I want to download a 30-day trial of any product, I just create a copy of the appropriate VHD, boot off of that and try out the product.  After 30 days, if I don’t like it, I just delete the testing VHD and it will never have touched my production VHD.  If I do like the product, I delete the testing VHD, purchase the full copy and install it on my production VHD.  Very nice and clean.

I just purchased a new HP dv7 laptop with the Intel Core i7, so this is the perfect time to get started.  While it is possible to wipe the drive clean and use VHDs without any core operating system at all, I’ll leave the HP environment intact.  That way I can always fall back to it just in case.  If for some reason my sound stops working, I doubt HP will care if I can’t reproduce the problem in the standard OS.

So, I’ll document how I set this up and how it works out in future blog posts.  Enjoy the ride!