What’s the deal with widescreen monitors?

I understand that everything is going widescreen these days for media.  That’s great when I want to watch HD movies or the Super Bowl in HD.  However, I still have no idea why this is the current trend for computer monitors.

Maybe this would make sense if I were a college kid living in a dorm room whose laptop or desktop doubled as the main display for watching TV and movies, but that’s not my situation.  My computer is my main display for actually doing office work and development.  I have two 20” Dell monitors (non-widescreen) with a maximum resolution of 1200×1600.  I love working on them.  I can display full-page Word documents with ease, and developing is wonderful.  Having ample pixels of height to work with is just as important to me as a high-performance machine.

Our department just ordered a new set of laptops for our users.  I’m not in IT, so we have them give us their recommendations and order from that.  It turns out they don’t have the time to nitpick every order, so Dell has set up a new service where their current business-model machines are preconfigured and ready to go, and our IT department just clicks the order button.

What kind of monitor did we get with our new business-class laptop and docking station?  A widescreen Dell 19” E1910H.  It has a maximum resolution of 1280×720.  720 pixels in height?  Ouch.  That’s less than a standard 15” monitor running 1024×768.  Today’s standard (in my opinion) for office machines would be 1280×1024 or more.

Here is a picture of the monitor with a Word document at 100%.  You can barely see half the page:

[Screenshot: the E1910H showing a Word document at 100% zoom]

Here is a picture of the monitor with a Word document at full screen.  It’s hard to tell in the picture, but you can’t read the text.  This might be nice for an Excel user, but for documents it’s terrible.

[Screenshot: the E1910H showing a Word document at full screen]

Oh, notice the stand too.  Do you like the cardboard box underneath?  The stand only allows a slight tilt up or down; it has no height adjustment or rotation.  In my opinion the Dell P190S is a much better office-class monitor.  It has a maximum resolution of 1280×1024 and a stand that tilts, pivots, and adjusts in height, along with the usual extras like multiple inputs and USB ports.

In the end I can say it was my fault for not verifying the details on every single line item (there are around 40 on a Dell order), but isn’t that the point of having Dell offer their recommended office equipment, so that my IT department and I don’t have to spend the time doing this?

epiphan VGA2USB is a no-go on Windows 7

I’ve been attempting to record our presentations at the Inland Empire .NET Users Group for a while now.  epiphan graciously donated one of their VGA2USB devices to help us capture the presenter’s live screen without having to install any software.  It’s been really helpful.

I just upgraded to an HP dv7t 3000 Core i7 laptop running Windows 7 Ultimate 64-bit.  epiphan’s 64-bit driver doesn’t seem to work on Windows 7 Ultimate x64.  While their bundled video recording software captures the input just fine, I have not been able to capture anything other than a one-second black video using Windows Media Encoder 9 64-bit, Microsoft Expression Encoder 3, Camtasia Studio 6.0.3, or VirtualDub (32-bit or 64-bit).  Ugh.

epiphan has a stable version of their 32-bit driver along with a beta version that reportedly works with Windows 7.  Neither would install correctly on my laptop.  Their 64-bit driver installed and recognized the device just fine, but I can’t capture video with anything other than their built-in program.  Unfortunately, their built-in program does not capture audio.  When you capture just the video, most codecs use a variable frame rate, which causes the audio and video to drift out of sync over the course of the presentation.  Recording them together on Windows XP using Windows Media Encoder worked like a champ.

Oh well.  While their website offers little in the way of support, they have usually responded within 24-48 hours when I email them.  Hopefully we can get this resolved before the next user group meeting.

This is just scary: T-SQL with “shapes”

I was reading a quick article on SQL and code formatting.  Now that SQL Server (2005 and later, I believe) allows international characters in identifiers, the article gave a pretty scary example of what can be done:

CREATE TABLE "╚╦╩╗" ( "└┬┴┐" nvarchar(10) )

DECLARE @ nvarchar(10)
SET @ = '═'

INSERT INTO "╚╦╩╗" ( "└┬┴┐" )
SELECT REPLICATE(@, 5)

SELECT * FROM "╚╦╩╗"

DROP TABLE "╚╦╩╗"

This is no joke.  I just ran it and it worked great.  :-O

MVC Bug with Default DropDownList or ListBox HTML Helpers

There are a lot of examples out there of how to pass a SelectList or MultiSelectList object to the Html.DropDownList or Html.ListBox helper.  However, I couldn’t find anything explaining the problem I was having.

In my app we have a Show object that represents a podcast.  Each show may have multiple Guest objects, which are accessed by the Show.Guests member.

In the Admin controller I have the following Action method:

public ActionResult EditShow(int id)
{
    // Load the show and its associated guests.
    var show = _repository.Get<Show>(id);
    show.Guests.Load();

    // All guests, ordered by name, with the show's current guests pre-selected by Id.
    var guestList = new MultiSelectList(_repository.List<Guest>().OrderBy(g => g.Name), "Id", "Name", show.Guests.Select(g => g.Id));
    ViewData["GuestsSelectList"] = guestList;

    return View(show);
}

In the Edit View I display the fields for the Show and a ListBox for the Guests.  You can select multiple Guests for a show.  Ideally, when you edit a Show the Guests that are already associated with the Show are pre-selected.  Makes sense, right?

Here is the code called in the View:

<%= Html.ListBox("Guests", (MultiSelectList)ViewData["GuestsSelectList"]) %>

If you follow this you’ll notice that I am adding to ViewData a MultiSelectList object containing an IEnumerable collection of all the Guests, with “Id” as the value field, “Name” as the text field, and an IEnumerable collection of the guest Id values already associated with the show.  However, this never caused the ListBox to select those values when the page rendered.

I used the debugger and, sure enough, my MultiSelectList object was populated with the properly selected items.  What was the deal?  The breakdown had to be happening in the Html.ListBox helper.  I spent a couple of hours digging through this myself and scouring the Internet.  I went home determined to rewrite the Html.ListBox helper myself and just be done with it.

However, the next morning I decided to download the MVC source.  Here’s a great, simple article from Steve Sanderson on how to attach it to your project so that you can step through it with the Visual Studio debugger.

One nice thing about the Html.ListBox and Html.DropDownList helpers is that you don’t have to provide a collection of SelectListItems explicitly.  If you have an item in the Model or ViewData with the same name as the ListBox or DropDownList, the helper will find it automatically.  If you do explicitly provide the collection, you can opt not to provide the default values explicitly.  Again, as long as you have an object (or member of an object) with the same name as the ListBox or DropDownList, the helper will find it and use it.
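For example, here is a minimal sketch of that implicit lookup (the “Genres” names here are hypothetical):

// In the controller: store the list under the same key the view will use as the ListBox name.
ViewData["Genres"] = new MultiSelectList(allGenres, "Id", "Name", selectedGenreIds);

// In the view: no list argument needed; the helper finds ViewData["Genres"] by name.
<%= Html.ListBox("Genres") %>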

Here’s the rub, though.  If you explicitly provide the SelectList collection and the default values (as I did in my code), the Html helper looks to the Model and ViewData first, before your default items.  This normally never causes a problem, until you name your ListBox or DropDownList the same as one of your objects or members.

Notice that I called my ListBox “Guests”?  Notice how the Model is a Show object, which has a Guests member?  It turns out the Html helper was grabbing that collection rather than my values.  Technically these are the guests I want selected by default; however, the helper was getting a collection of Guest objects, not a collection of Guest Id values.  When it compared each guest Id in the full list against the actual Guest objects in the default list, it never found a match.  Thus, nothing was ever selected.

What’s the fix?  Simply change the name of your ListBox or DropDownList.  When I changed the name to “GuestListBox” everything worked, because, as you would expect, there is no object or member in the Model or ViewData called “GuestListBox”.  Once the helper couldn’t find one, it fell through that code and used the default values I provided.
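So the working view code is simply the original call with a non-colliding name:

<%= Html.ListBox("GuestListBox", (MultiSelectList)ViewData["GuestsSelectList"]) %>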

To see exactly where MVC does this, check out the SelectInternal method in the SelectExtensions.cs file.  This is the internal method that both ListBox and DropDownList eventually call.  You can see commented code that states:

// If we haven’t already used ViewData to get the entire list of items then we need to
// use the ViewData-supplied value before using the parameter-supplied value.

In my opinion this is a bug.  It shouldn’t even check the ViewData or Model if you have already explicitly provided the default values.

So, I added my vote to the bug that was already submitted for this on CodePlex.  I haven’t checked whether this has been resolved in MVC v2.

My Planned VHD Organization

In my last post I talked about wanting to move my entire computing environment over to VHDs.  Not just a development environment or test environments, but everything.  This would include a general work VHD for my wife and me, a video production VHD for all my video work, a production development VHD, various server VHDs (mostly for use during development with Virtual PC), and whatever test VHDs I want.  I would no longer boot into a standard operating system installation, as we have since the beginning of personal computers.

Some of you might be thinking, “Why in the world do this?”  Well, the last time I had to rebuild a computer it took approximately two days.  That included installing the OS, MS Office, developer tools, all my utilities, plug-ins, and all the updates.  This is wasted time.  Plus, there are often times that I’ve loaded a tool I wanted to check out, only to find it has hosed something on my system.  Maybe it’s not something critical, but it’s enough to force me to spend a few hours weeding out what files or settings got changed.  If I could test out tools in an exact, isolated environment just by copying my current VHD, how incredible would that be?  Plus, assuming I back up my VHDs regularly, let’s say I got a really nasty virus that wiped out my system.  No big deal; I just delete that VHD and restore it from the backup.  Since we’re talking about an entire VHD, it’s just a file copy, not a restore that takes hours.  Talk about system restore. :)

So, as usual, before I even started researching the details of how things work, my wheels started spinning faster than a hamster running from the cat.

I heard about differencing disks and started looking into them.  Differencing disks let you create a “parent” VHD and one or more “child” VHDs.  A common example is testing how your website looks in different versions of IE, which cannot be installed simultaneously.  You create a parent VHD with the operating system and whatever other base software you want.  You then create several child differencing VHDs that reference the parent, each with a different version of IE: one with IE 6, one with IE 7, and one with IE 8.  The children are called differencing disks because they contain only the information that is “different” from the parent.  You can also use a child differencing disk as the parent of another differencing disk, thus chaining them.

This really got my heart going.  I instantly thought of a grand hierarchical design like the following:

[Diagram: proposed VHD hierarchy, with a base Win7 parent VHD and child differencing disks for each environment]

The base Win7 VHD is the only true parent; all the others are child differencing disks.  I could update the base Windows 7 VHD with whatever updates Microsoft threw out, and all the other VHDs would get them.  All bottom-level child VHDs (video, production, test, etc.) would have Office, Windows Live, and so on.  It seemed like a perfect environment.

Here’s the catch: a parent VHD cannot be changed.  If you change it, the child differencing disks become corrupted (because the recorded differences are no longer accurate) and you lose the child VHDs and all the data in them.  Microsoft even recommends setting parent VHDs to read-only to protect against inadvertent changes.  Wow, that’s a real bummer.  I believe some of the enterprise-level virtualization vendors support scenarios like this across servers, making it easy to deploy updates, but that doesn’t help me.  I want to run native VHDs on my Win7 machine.  Oh well.  Scratch that.

I suppose this could technically be set up just for the sheer exercise, though I won’t bother taking the time to try it.

So, now that reality has stepped in, how do I plan to set this up?  I’ll have a separate standard VHD per environment, with no differencing disks.  A great tip from Stephen Rose during his Virtualization 101 for Developers presentation was to make copies of your base VHDs, mark them read-only, and put them in a backup folder.  That way, if you ever want to create a new one, you don’t have to start from scratch.

So, I plan to create my base Win7 image and put a read-only copy of it into a VHD backup folder.  I’ll then install Office, Windows Live, etc. onto that image and put a read-only copy of that into the backup folder as well.  At this point it becomes my general use VHD.  I’ll make a copy of that, rename it as my video production VHD, and install my video production software on it.  I’ll make a read-only backup of that as well, in case it gets hosed or I want to try out some fancy new video software in the future without harming my current VHD.  I’ll make another copy of my general use VHD, rename it as my production development VHD, and install all my development software.  You get the idea.

When you boot off of a VHD you still have access to the physical C drive (it just gets a different drive letter).  So, as far as document storage goes, there are details I’ll have to work out.  I don’t know how security is handled, but I doubt I’ll be able to access documents in my original My Documents folder.  I’ll have to see what the best practice is out there.  I’ll probably keep all of my documents in a generic data storage folder on the physical drive.  I only have one drive in my laptop, so I can’t throw everything onto a D drive like on my desktop.

Just a few tips to leave you with:

  • Backups are important!  VHDs can get quite large, and so are my video files.  While most other documents are small, and I can use some sort of cloud storage for documents or a versioning system for code, that won’t save me the time and hassle of losing my VHDs.  I have yet to see an online backup provider like Mozy or Carbonite with the upload bandwidth I would need for something like this.  I plan on having a couple of eSATA drives that I can back up to on a regular basis.
  • You can’t boot from a VHD on an external drive.  That’s just how Microsoft’s I/O miniport driver works.  They probably did this to make sure you can’t set your laptop to boot from something that might not be there.  Plus, if your external drive got disconnected for any reason, such as a loss of power or a USB cable falling out, it could instantly corrupt your VHD.
  • If there is not enough drive space to expand your entire VHD, you can’t boot from it.  For instance, say you have 250GB available on your hard drive and you create VHDs that can expand to 150GB.  If over time you reach a point where you no longer have 150GB available for the VHD (including its current physical size), you can’t boot into it.  Until you free up some space, that VHD is effectively unbootable.  This is probably a good reason to keep your existing OS on your system rather than going pure VHD.

I’ll keep posting updates on how it goes.

Converting my entire production environment to VHDs

Ever since virtual PC systems came out, the dream has been to run any operating system you want, multiple operating systems at the same time, all working independently of each other.

However, only in the last year or two has this really become a reality, with the latest hardware supporting virtual environments natively.  Now Windows 7 lets you boot off of a VHD as if it were a standard hard drive, making full use of your actual hardware devices.

After attending Stephen Rose’s presentation on Virtualization 101 for Developers, my eyes were suddenly opened to the possibilities of what could be done.  I could move my entire development environment to a virtual PC image.  This would allow me to copy that image at any time to test beta products, new upgrades, etc., without worrying about damaging my current environment.  Many times I have upgraded to new releases (or betas) that caused previous programs to stop running, DLLs required by legacy projects to be removed, and so on.  In some rare cases (SQL Server 2008 RC2) I had to rebuild my entire environment because I could never get my machine back to its original state.

Now that Windows 7 allows you to boot from a VHD, my wheels are suddenly turning.  I could move my entire computer use (not just development) to VHDs.  I could have a base image for general tasks like email, MS Office, etc. that my wife and I could use day to day.  I could have another environment for all my video editing work, which is especially useful since video compression codecs can sometimes cause havoc.  I could have a production development environment and as many beta environments as I want.  If I want to download a 30-day trial of any product, I just copy the appropriate VHD, boot off of the copy, and try out the product.  After 30 days, if I don’t like it, I just delete the testing VHD, and it will never have touched my production VHD.  If I do like the product, I delete the testing VHD, purchase the full copy, and install it on my production VHD.  Very nice and clean.

I just purchased a new HP dv7 laptop with the Intel Core i7, so this is the perfect time to get started.  While it is possible to wipe the drive clean and use VHDs without any core operating system at all, I’ll leave the HP environment intact.  That way I can always fall back to it just in case.  If for some reason my sound stops working, I doubt HP will care if I can’t reproduce the problem in the standard OS.

So, I’ll document how I set this up and how it works out in future blog posts.  Enjoy the ride!

Netflix and the Extinction of DVDs, et al

It was recently announced that Netflix will add streaming to the Nintendo Wii.  They already stream to personal computers, the Microsoft Xbox, the Sony PlayStation, and various small devices.

As we just received a Wii for Christmas I am excited to try this out.

As a technology geek, I have long thought about how to move my entire media collection (photos, personal videos, music, and DVDs) to a computer in our house that we could watch from any TV or listen to on any stereo.  This is already easily done, but the funds are a little out of reach for our growing family, and it certainly isn’t within reach for the masses.

That’s where Netflix comes in.  They revolutionized the DVD rental industry when they let you rent through the mail with no late fees.  While I was dubious at first about whether their business model could really turn a profit with that much overhead, they have done an incredible job.  I’m really glad they made it through the roughest part.

When Blockbuster saw the competition Netflix was creating, they started their own rent-by-mail DVD service; however, they one-upped Netflix by letting you return discs to your local store.  With Netflix you had to send your DVDs in before they mailed you your next batch.  With Blockbuster, you could return your DVDs to any local store and rent another right there.  No waiting.

Around the same time, Apple (through iTunes), Amazon, and Netflix let you download movies.  However, this was really confined to users who watched on their computers or had laptops or iPods connected to their TVs.  This was a very small market, mostly consisting of technically savvy consumers who could afford the equipment, or college students.

But then came Netflix streaming to the Xbox, the PlayStation, and now the Wii.  Suddenly they have an audience that is already familiar with playing rich content on their own devices.  Console gaming devices are almost as much a part of the home entertainment system as a DVD player or stereo.  These days, connecting a gaming system to the Internet is a simple task.

In my opinion, Netflix is the leader in bringing true streaming movie content to the general masses.  While cable and satellite companies have offered this for years, it just hasn’t really caught on.  Among my friends, we all have cable or satellite with the feature, but we just don’t use it.  I don’t know if it is the limited available content or the pricing, but we just don’t use it.

Another incredible thing Netflix has done is completely change the overhead they are required to carry.  As they move more and more of their customers to streaming content, their assets become completely digital, available anytime and anywhere in an instant.  There will be no need to ship physical, fragile media all over the country at an incredible cost in shipping time, warehouse inventory, staff, etc.

By giving the general media consumer instant access to movies on devices they already own, Netflix is pushing the DVD out to pasture and really making content over the Internet a reality.

There will always be consumers who prefer to own physical media, but the writing on the wall is more pronounced than ever before.

Roy Osherove’s TDD Code Review Videos

I’m going to be giving a talk on Practical TDD at the Inland Empire .NET Users Group in the near future.  We’ve had TDD talks in the past, but they have been more like overviews with only a little code.  I plan to take the group through creating a simple blog engine from scratch, entirely using TDD.  I hope this will give a really great example of how to develop a project using TDD.

The presentation was supposed to be in August, but as that lands directly on my 8-year wedding anniversary and our son’s 3rd birthday, I had to postpone it.  I don’t like sleeping on the couch. 🙂  James Johnson, the president, is still working out where to put my talk, so I’ll post an update when that happens.

Anyway, in the spirit of getting ready for this talk I’ll be writing a few blog posts on some great TDD resources I’ve found.

Recently Roy Osherove, the author of The Art of Unit Testing, started doing actual TDD code reviews on his blog.  These are great!  It’s one thing to read a blogger’s posts with their thoughts and opinions on a topic, but seeing Roy go through a live code review on video is just spectacular.  It’s like getting a chance to sit in on a session with Roy.

While he admits he’s a little harsh in the Nerd Dinner review, having done it at 2 a.m., I think he is spot on in his findings.  Of course, Roy is a very experienced veteran in this area, so we’d expect nothing less.  The points Roy makes are great, and he really brings to light the good and bad examples of TDD he sees in the app.

I can’t wait to check out the rest of the videos.

He offers to review code that is sent in.  I’m not sure I’ll be brave enough to do that, but if I’m not, wouldn’t that indicate a smell in my code? 🙂 Maybe when I get my talk together I’ll pass my code on to Roy to review prior to my presentation.  That would really help me make sure my presentation is sound and give me a little more authority on the topic.

Map IP Address to a Geographical Location

Here’s a great article on how to get a geographic location from an IP address within SQL Server:
http://www.sqlservercentral.com/articles/SQL+Server/67215/

The article is very easy to follow and gives great direction on setting up a user-defined function in SQL Server that returns a location based on an IP address.  Since SQL servers are very good at processing data quickly, this seems like a natural way to get this information.  Once you have the function set up, you can easily use it in asynchronous processes like analyzing logs, post-processing customer data, etc.  You can also call it from a trigger within SQL Server or expose it through a service callable by outside apps, such as a web app.
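The core trick in this kind of lookup is converting a dotted-quad IP address into a single number that can be compared against the range columns of a geolocation table.  Here’s a minimal C# sketch of that conversion; the article’s UDF does the equivalent in T-SQL:

static long IpToNumber(string ip)
{
    // "a.b.c.d" maps to a*16777216 + b*65536 + c*256 + d.
    var parts = ip.Split('.');
    return long.Parse(parts[0]) * 16777216 +
           long.Parse(parts[1]) * 65536 +
           long.Parse(parts[2]) * 256 +
           long.Parse(parts[3]);
}

// A geolocation row matches when IpToNumber(ip) falls between its
// start-of-range and end-of-range values.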

I’ve seen this used in a lot of ways I don’t care for (thanks for letting me know about all the fictional hot girls that live in my area, but I don’t think my wife would approve :)), but there are some legitimate ideas coming around.  For instance, let me power up my iPhone and see what Nerd Dinners are available in my area (a work in progress).

Another scenario is blocking spam.  For instance, at my work we serve Riverside County in Southern California, USA.  We have methods to stop unauthorized users from creating accounts and to block spam on our wikis and such.  But why not use location-based blocks as well?  I know my users are all from Riverside County, so why not block everyone from, say, outside of Southern California?  While a user or two may be blocked while trying to access their work from a vacation in Maui, I don’t think I’d get much flack for blocking those edge cases.

Serving Dynamic SSRS Excel Formatted Documents Via a MemoryStream

On an old Web Forms app we had a request to let users download the data from a formatted SSRS report as an Excel spreadsheet.  They currently view the formatted report using the ReportViewer control; however, the export feature exports it to Excel with all the formatting of the report, including groups.  This is fairly unusable.  What you really need is a simple spreadsheet with a cell for every value and no grouping.

So, I created a report that had no grouping or formatting and a column for every field of data. 

I could have forced the user to view this ugly report in the ReportViewer and then export it as an Excel spreadsheet, but I wanted them to be able to click a link and get the report directly.  That’s easy enough, because you can get an SSRS report formatted as Excel by appending “rs:Format=Excel” to the report URL.  That doesn’t work for us, however, for two reasons:

1) The website is accessible to users outside our network, and the SSRS server is not.

2) The report retrieves sensitive data filtered by a parameter.  It would be fairly easy for a user to change the parameters in the URL to obtain data they shouldn’t be viewing.

In the ReportViewer control we set the parameters on the server side, so the user simply views the generated report with no option to change them.  Now I needed a way to let them download the Excel version with the same restrictions.

Below is the solution I used, with the help of several other sites on the web.

I created a handler that builds the URL with the proper parameters and formatting, retrieves the report, and then sends it out to the user as an Excel document.  This way, when the user clicks the link for the Excel version of the report, their browser opens a dialog to open or save the Excel file.  Works like a charm.


Here is the code:

1: Public Sub ProcessRequest(ByVal context As HttpContext) Implements IHttpHandler.ProcessRequest
2:     context.Response.Clear()
3:     context.Response.BufferOutput = True
4:     context.Response.ClearContent()
5:     context.Response.ClearHeaders()
6:
7:     Dim uri As String = "http://san-destiny/ReportServer?%2fSELPA+Reports%2fDistrict+Errors+Detail+CSV+-+CASEMIS&rs:Format=Excel&DistrictName=" & Utility.GetDistrictNameForUser()
8:
9:     Dim request As HttpWebRequest = CType(WebRequest.Create(uri), HttpWebRequest)
10:     request.Credentials = CredentialCache.DefaultNetworkCredentials
11:
12:     Dim response As WebResponse = request.GetResponse()
13:     Dim responseStream As System.IO.Stream = response.GetResponseStream()
14:
15:     Dim buffer(4095) As Byte, blockSize As Integer
16:     Dim tempStream As New MemoryStream
17:     Do
18:         blockSize = responseStream.Read(buffer, 0, 4096)
19:         If blockSize > 0 Then tempStream.Write(buffer, 0, blockSize)
20:     Loop While blockSize > 0
21:     responseStream.Close()
22:     context.Response.AddHeader("Content-Type", "application/xls")
23:     context.Response.ContentType = "application/xls"
24:     context.Response.AppendHeader("Content-disposition", "attachment;filename=CASEMISErrorReport.xls")
25:     context.Response.BinaryWrite(tempStream.ToArray())
26:
27:     context.Response.Flush()
28:     context.Response.End()
29: End Sub


The first few lines simply clear any headers or content that may already be set.  This is mandatory when sending files, because we will be setting our own headers later.

Line 7 is the URL of our report.  Notice that “rs:Format=Excel” is in the URL, letting SSRS know we want this report as an Excel doc.  Also notice that I set the DistrictName parameter by getting the logged-in user’s district name via a utility method.

Line 9 sets up an HttpWebRequest object so that we can make a call to the SSRS server just as a browser would.

Since the SSRS server is on our domain and user restrictions are in place, line 10 sets the credentials we need.  Our web app impersonates a user with the correct access so that we can keep the proper security restrictions in place yet allow non-domain users access to the data.  This impersonation is set up in web.config, and DefaultNetworkCredentials looks there first to get the credential information.
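For reference, the impersonation piece of web.config looks something like this (the account shown is a placeholder for a domain user that has rights on the report server):

<configuration>
  <system.web>
    <!-- Placeholder account; use a domain user with access to SSRS. -->
    <identity impersonate="true" userName="DOMAIN\ReportUser" password="..." />
  </system.web>
</configuration>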

Lines 12 and 13 simply grab the web response and set up a Stream object to read it with.

Lines 15 – 20 set up a temporary MemoryStream object that holds the bytes read from the response (remember, SSRS is sending us an Excel file, which is binary, not plain text).

Lines 21 – 28 close the response stream, set up the headers and content type, send the bytes, and end the response.


I probably didn’t need a MemoryStream object; in the loop I could have written the bytes directly to the context.Response object.  But this seems a little easier to read, and to debug if necessary.


In the end I’d rather use MVC.  In fact, I created the exact same result (except as a CSV file) using MVC and LINQ.  It was a snap, and I had it done in five minutes.  Unfortunately, the current site is on a Windows 2000 server and makes heavy use of Web Forms and the SSRS viewer controls.  I know I could have had the link request the result from an MVC app on another server, but I wanted to keep this all fairly consistent and coherent.
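For comparison, here’s a rough sketch of what the MVC version might look like (the repository method and field names are hypothetical, and real CSV output would also need field escaping):

public ActionResult ErrorReportCsv()
{
    // Hypothetical repository call returning the same district-filtered rows
    // (requires System.Collections.Generic, System.Linq, System.Text, System.Web.Mvc).
    var errors = _repository.GetDistrictErrors(Utility.GetDistrictNameForUser());

    var lines = new List<string> { "StudentId,ErrorCode,Description" };
    lines.AddRange(errors.Select(e =>
        string.Join(",", new[] { e.StudentId, e.ErrorCode, e.Description })));

    // FileContentResult sets the download headers for us.
    var csv = string.Join("\r\n", lines.ToArray());
    return File(Encoding.UTF8.GetBytes(csv), "text/csv", "CASEMISErrorReport.csv");
}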

Look for me to change this all up next year when I’ve moved our department webpage entirely over to MVC.
