Google Cloud Print – Neat idea

I just read this article on Mashable:

Google Cloud Print Reveals the Future of Printing


This could eventually be combined with Google Maps and location-aware apps.  Then, if I'm on my mobile device and need to print a document quickly, I can pull up a map and find the nearest public printers I could use, such as those at Kinko's, Staples or other office-supply stores near my current location or my destination.


That would be sweet.  I’m on my iPhone, get a document from a client that needs to be signed, I review it on my iPhone and then print the signature page near our meeting.  I pick it up, sign it and deliver it on the spot.  Cool!

Metroid Prime Trilogy on the Wii – What a fun game!

I don't know if you ever played any of the Metroid games on the classic Nintendo.  I did and loved them.

I just picked up the new Metroid Prime Trilogy for the Wii.  It's actually the first two games of the trilogy, which came out on the Nintendo GameCube, plus the final chapter, which came out on the Wii.  The GameCube versions were revamped to use the Wii controllers.

I have to say, this is one of the most fun games I've played in a long time.  I've played a few first-person shooters on the Xbox or PS2, but the controls just haven't been nearly as comfortable as the keyboard and mouse I'm used to.  I still haven't tried The Force Unleashed on the Wii, but with Metroid I feel like this is the first time they have gotten it right.  You use the Nunchuk for movement and the Wii Remote for targeting and viewing.  Buttons on both controllers facilitate the myriad actions you can perform.  Honestly, this setup is more natural and fun than any I have ever used on a console or computer.

The storyline and game play are really a lot of fun.  If you're a fan of the older series, then seeing familiar characters and hearing the music will bring back a lot of fond memories.  Sometimes when 2D games get brought into 3D it's done in a very kiddie or cheesy way, or it just doesn't translate well at all.  However, seeing my old 2D side-scrolling nemeses in full-blown 3D, moving around in live space, is just awesome.  They aren't bloated, cartoony characters (like many Nintendo games turn out to be) but realistic representations of what you would expect to see.  The scenery is great and the worlds are really well designed.  Between the game play, world design, storyline, attention to detail and music, this is really a top-notch game you can get totally engrossed in.  I'll start playing at 9pm and Eva finally calls me to bed at 11pm without me even realizing what time it is.

I'm still only at the beginning of the first game.  At $50 it's a pretty standard price for a Wii game, but since there are three games in the pack I really feel like I got my money's worth.  Like I said, I'm still on the first chapter, which is really a GameCube game, but it's still pretty awesome.  I can't wait to make it to the third and final chapter, the one developed for the Wii from the ground up.

This is just scary. T-SQL with “shapes”

I was reading a quick article on SQL and code formatting.  Now that SQL Server (2005 and later, I believe) allows international characters in identifiers, it gave a pretty scary example of what can be done:

CREATE TABLE "╚╦╩╗" ("└┬┴┐" nvarchar(10))

DECLARE @ nvarchar(10)
SET @ = '═'

INSERT INTO "╚╦╩╗" ("└┬┴┐")
SELECT REPLICATE(@, 5)

SELECT *
FROM "╚╦╩╗"

DROP TABLE "╚╦╩╗"

This is no joke.  I just did it and it worked great.  :-O

Converting my entire production environment to VHDs

Ever since virtual PC systems came out, the dream has been to run any operating system you want, even multiple operating systems at the same time, all working independently of each other.

However, only in the last year or two has this really become a reality, with the latest hardware supporting virtual environments natively.  Now Windows 7 lets you boot off of a VHD as if it were a standard hard drive, making full use of your actual hardware devices.

After attending Stephen Rose's presentation on Virtualization 101 for Developers, my eyes were suddenly opened to the possibilities.  I could move my entire development environment to a virtual PC image.  This would allow me to copy that image at any time to test beta products, new upgrades, etc., without worrying about damaging my current environment.  Many times I have upgraded to new releases (or betas) that caused previous programs to stop running, DLLs required by legacy projects to be removed, and so on.  In some rare cases (SQL Server 2008 RC2) I had to reload my entire environment because I could never get my machine back to its original state.

Now that Windows 7 allows you to boot from a VHD, my wheels are suddenly turning.  I could move my entire computer use (not just development) to VHDs.  This would allow me to have a base image for general day-to-day tasks like email, MS Office, etc. that my wife and I could use.  I could have another environment for all my video editing work, which is especially useful since video compression codecs can sometimes cause havoc.  I could have a production development environment and as many beta environments as I want.  If I want to download a 30-day trial of any product, I just create a copy of the appropriate VHD, boot off of that and try out the product.  After 30 days, if I don't like it, I just delete the testing VHD and it will have never touched my production VHD.  If I do like the product, I delete the testing VHD, purchase the full copy and install it to my production VHD.  Very nice and clean.
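For the curious, here's a rough sketch of the commands involved (the paths, sizes and names are placeholders I made up, not my final setup), run from an elevated command prompt:

rem Create, attach and format a new expandable VHD (inside diskpart)
diskpart
create vdisk file="C:\VHDs\DevBase.vhd" maximum=40960 type=expandable
attach vdisk
create partition primary
format fs=ntfs quick
assign letter=V
exit

rem After applying Windows 7 to V:, add a boot menu entry pointing at the VHD
bcdedit /copy {current} /d "Dev Base (VHD)"
rem Substitute the GUID printed by the /copy command for {guid}:
bcdedit /set {guid} device vhd=[C:]\VHDs\DevBase.vhd
bcdedit /set {guid} osdevice vhd=[C:]\VHDs\DevBase.vhd

The beta environments then come from simply copying the .vhd file and giving the copy its own boot entry.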

I just purchased a new HP dv7 laptop with the Intel Core i7, so this is the perfect time to get started.  While it's possible to wipe the drive clean and use VHDs without any core operating system at all, I'll leave the HP environment intact.  That way I can always fall back to it just in case.  If for some reason my sound stops working, I doubt HP will care if I can't reproduce it in the standard OS.

So, I’ll document how I set this up and how it works out in future blog posts.  Enjoy the ride!

Netflix and the Extinction of DVDs, et al

It was recently announced that Netflix will add streaming to the Nintendo Wii.  They already stream to personal computers, the Microsoft Xbox, the Sony PlayStation and various small devices.

As we just received a Wii for Christmas, I am excited to try this out.

As a technology geek I have long thought about how to move my entire media collection (photos, personal videos, music and DVDs) to a computer in our house that we could watch on any TV or listen to on any stereo.  This is already easily done, but the funds are a little out of reach for our growing family, and it certainly isn't available to the masses.

That's where Netflix comes in.  They revolutionized the DVD rental industry when they let you rent through the mail with no late fees.  While I was dubious at first about whether their business model could really turn a profit with that much overhead, they have done an incredible job.  I'm really glad they made it through the roughest part.

When Blockbuster saw the competition Netflix was creating, they started their own rent-by-mail DVD service; however, they one-upped Netflix by letting you return discs to your local store.  With Netflix you had to send your DVDs in before they mailed you your next batch.  With Blockbuster, you could return your DVDs to any local store and rent another right there.  No waiting.

Around the same time, Apple (through iTunes), Amazon and Netflix let you download movies.  However, this was really confined to users who watched on their computers or had laptops or iPods connected to their TVs.  It was a very small market, consisting mostly of technically savvy consumers who could afford the equipment, and college students.

Then along comes Netflix, streaming content to the Xbox, the PlayStation and now the Wii.  Suddenly they have an audience that is already familiar with playing rich content on their own devices.  Console gaming devices are almost as much a part of any home entertainment system as a DVD player or stereo system.  And these days, connecting a gaming system to the Internet is a simple task.

In my opinion, Netflix is the leader in bringing true streaming movie content to the general masses.  While cable and satellite companies have been offering this for years it just hasn’t really caught on.  In talking with my friends, we all have cable or satellite offering the feature, but we just don’t use it.  I don’t know if it is the limited available content or the pricing.  But we just don’t use it.

Another incredible thing Netflix has done is completely change the overhead they are required to keep.  As they move more and more of their customers to streaming content, their assets become completely digital, available anytime and anywhere in an instant.  There will be no need to ship fragile physical media all over the country at an incredible cost in shipping time, warehouse inventory, staff, etc.

By giving the general media consumer instant access to movies on devices they already own, Netflix is pushing the DVD out to pasture and really making content over the Internet a reality.

There will always be consumers who prefer to own physical media, but the writing on the wall is more pronounced than ever before.

Roy Osherove’s TDD Code Review Videos

I'm going to be doing a talk on Practical TDD at the Inland Empire .Net Users' Group in the near future.  We've had TDD talks in the past, but they have been more overviews with only a little code.  I plan to take the group through creating a simple blog engine from scratch, completely using TDD.  I hope this will give a really great example of how to develop a project using TDD.
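To give a flavor of where the demo will start (this is a hypothetical first test I'm sketching for the talk, with made-up names, not finished material), the very first red/green cycle might look something like this:

Imports NUnit.Framework

<TestFixture()> Public Class BlogEngineTests

    ' A brand-new blog should contain no posts.  The Blog class doesn't
    ' exist yet; this failing test is what forces me to create it, which
    ' is the whole point of the exercise.
    <Test()> Public Sub NewBlog_HasNoPosts()
        Dim blog As New Blog()
        Assert.AreEqual(0, blog.Posts.Count)
    End Sub

End Class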

The presentation was supposed to be in August, but since that lands directly on my 8-year wedding anniversary and our son's 3rd birthday, I had to postpone it.  I don't like sleeping on the couch. 🙂  James Johnson, the president, is still working on where to put my talk, so I'll post an update when that happens.

Anyway, in the spirit of getting ready for this talk, I'll be writing a few blog posts on some great TDD resources I've found.

Recently Roy Osherove, the author of The Art of Unit Testing, started doing actual TDD code reviews on his blog.  These are great!  It's one thing to read a blogger's posts with their thoughts and opinions on topics, but seeing Roy go through a live code review in a video is just spectacular.  It's like getting a chance to sit in on a session with Roy.

While he admits he's a little harsh in the Nerd Dinner review, having done it at 2am, I think he is spot on in his findings.  Of course, Roy is a very experienced veteran in this area, so we'd expect nothing less.  The points Roy makes are great, and he really brings to light the good and bad examples of TDD he sees in the app.

I can’t wait to check out the rest of the videos.

He offers to review code that is sent in.  I'm not sure I'll be brave enough to do that, but if I'm not, wouldn't that indicate a smell in my code? 🙂 Maybe when I get my talk together I'll pass my code on to Roy to review prior to my presentation.  That would really give me a great resource for making sure my presentation is sound, and a little more authority on this topic.

Map IP Address to a Geographical Location

Here’s a great article on how to get a geographic location from an IP address within SQL Server:
http://www.sqlservercentral.com/articles/SQL+Server/67215/

The article is very easy to follow and gives great direction on setting up a user-defined function in SQL that gives you back a location based on an IP address.  Since SQL servers are very good at processing data quickly, this seems like a natural way to get this information.  Once you have the function set up, you can easily use it in asynchronous processes like analyzing logs, post-processing of customer data, etc.  You can also use it in a trigger within SQL, or in a service callable by outside apps, such as a webapp.
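The core trick (the sketch below is my own simplified take, with made-up table and function names, not the article's actual code) is converting the dotted IP address into a single number and doing a range lookup against a table of IP blocks:

-- Hypothetical lookup table of IP ranges mapped to locations.
-- The article covers loading real range data into something similar.
CREATE TABLE dbo.IpLocation (
    StartIpNum bigint NOT NULL,
    EndIpNum   bigint NOT NULL,
    Country    nvarchar(50) NOT NULL,
    City       nvarchar(50) NOT NULL
)
GO

CREATE FUNCTION dbo.GetLocationFromIp (@Ip varchar(15))
RETURNS nvarchar(110)
AS
BEGIN
    -- PARSENAME splits 'a.b.c.d' just like a four-part object name,
    -- so we can convert the address to a*2^24 + b*2^16 + c*2^8 + d.
    DECLARE @IpNum bigint
    SELECT @IpNum =
          CAST(PARSENAME(@Ip, 4) AS bigint) * 16777216
        + CAST(PARSENAME(@Ip, 3) AS bigint) * 65536
        + CAST(PARSENAME(@Ip, 2) AS bigint) * 256
        + CAST(PARSENAME(@Ip, 1) AS bigint)

    DECLARE @Location nvarchar(110)
    SELECT @Location = City + N', ' + Country
    FROM dbo.IpLocation
    WHERE @IpNum BETWEEN StartIpNum AND EndIpNum

    RETURN @Location
END
GO

-- Usage:
SELECT dbo.GetLocationFromIp('64.233.160.5')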

I’ve seen this used in a lot of ways that I don’t care for (thanks for letting me know about all the fictional hot girls that live in my area, but I don’t think my wife would approve :)) but there are some legitimate ideas coming around.  For instance, let me power up my iPhone and see what Nerd Dinners are available in my area (work in progress).

Another scenario is blocking spam.  For instance, at my work we serve Riverside County in southern California, USA.  We have methods to stop unauthorized users from creating accounts and to block spam on our wikis and such.  But why not use location-based blocks as well?  I know my users are all from Riverside County, so why not block everyone from, say, outside of southern California?  While a user or two may be blocked while trying to access their work from a vacation in Maui, I don't think I'd get much flak for blocking these edge cases.

Serving Dynamic SSRS Excel Formatted Documents Via a MemoryStream

On an old Web Forms app, we had a request to let users download the data from a formatted SSRS report as an Excel spreadsheet.  They currently view the formatted report using the ReportViewer control; however, the export feature exports to Excel with all the formatting of the report, including groups, which is fairly unusable.  What you really need is a simple spreadsheet with a cell for every value and no grouping.

So, I created a report that had no grouping or formatting and a column for every field of data. 

I could have forced the user to view this ugly report in the ReportViewer and then export it as an Excel spreadsheet, but I wanted them to be able to click a link and get the report directly.  That's easy enough, because you can get an SSRS report formatted as Excel by appending "rs:Format=Excel" to the end of the report URL.  This doesn't work for us, however, for two reasons:

1) The website is accessible to users outside our network and the SSRS server is not.

2) The report retrieves sensitive data filtered by a parameter.  It would be fairly easy for a user to change the parameters in the URL to obtain data they shouldn't be viewing.

In the ReportViewer control we set the parameters on the server side, so the user simply views the generated report with no option to change them.  Now I needed a way to let them download the Excel version with the same restrictions.

Below is the solution I used, with the help of several other sites on the web.

I created a handler that builds the URL with the proper parameters and formatting, retrieves the report and then sends it out to the user as an Excel document.  This way, when the user clicks the link for the Excel version of the report, their browser opens a dialog to Open or Save the Excel file.  Works like a charm.


Here is the code:

1: Public Sub ProcessRequest(ByVal context As HttpContext) Implements IHttpHandler.ProcessRequest
2:     context.Response.Clear()
3:     context.Response.BufferOutput = True
4:     context.Response.ClearContent()
5:     context.Response.ClearHeaders()
6:
7:     Dim uri As String = "http://san-destiny/ReportServer?%2fSELPA+Reports%2fDistrict+Errors+Detail+CSV+-+CASEMIS&rs:Format=Excel&DistrictName=" & Utility.GetDistrictNameForUser()
8:
9:     Dim request As HttpWebRequest = HttpWebRequest.Create(uri)
10:     request.Credentials = CredentialCache.DefaultNetworkCredentials
11:
12:     Dim response As WebResponse = request.GetResponse()
13:     Dim responseStream As System.IO.Stream = response.GetResponseStream()
14:
15:     Dim buffer(4096) As Byte, blockSize As Integer
16:     Dim tempStream As New MemoryStream
17:     Do
18:         blockSize = responseStream.Read(buffer, 0, 4096)
19:         If blockSize > 0 Then tempStream.Write(buffer, 0, blockSize)
20:     Loop While blockSize > 0
21:
22:     context.Response.AddHeader("Content-Type", "application/xls")
23:     context.Response.ContentType = "application/xls"
24:     context.Response.AppendHeader("Content-disposition", "attachment;filename=CASEMISErrorReport.xls")
25:     context.Response.BinaryWrite(tempStream.ToArray())
26:
27:     context.Response.Flush()
28:     context.Response.End()
29: End Sub


The first few lines simply clear any headers or content that may already be set.  This is mandatory when sending files because we are going to set our own headers later.

Line 7 is the URL of our report.  Notice that "rs:Format=Excel" is in the URL, letting SSRS know we want this report as an Excel doc.  Also notice that I set the DistrictName parameter by getting the logged-in user's district name via a utility method.

Line 9 sets up an HttpWebRequest object so that we can make a call to the SSRS server just as a browser would.

Since the SSRS server is on our domain and user restrictions are in place, line 10 sets the credentials we need.  Our webapp impersonates a user with the correct access so that the proper security restrictions stay in place while still allowing non-domain users access to the data.  This impersonation is set up in the web.config, and DefaultNetworkCredentials looks there first for the credential information.

Lines 12 and 13 simply grab the web response and set up a Stream object to read it with.

Lines 15 – 20 set up a temporary MemoryStream object that holds the bytes read from the response (remember, SSRS is sending us an Excel file, which is binary, not plain text).

Lines 22 – 28 finally set up the headers and content type, send the bytes and end the response.


I probably didn't need a MemoryStream object.  In the loop I could have written the bytes directly to the context.Response object.  But this seems a little easier to read and potentially debug if necessary.
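If I ever do switch to the direct approach, the loop would look something like this (an untested sketch, reusing the same buffer and stream variables as above):

' Untested sketch: write each block straight to the response's
' output stream instead of buffering it in a MemoryStream first.
Do
    blockSize = responseStream.Read(buffer, 0, 4096)
    If blockSize > 0 Then context.Response.OutputStream.Write(buffer, 0, blockSize)
Loop While blockSize > 0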


In the end I'd rather use MVC.  In fact, I created the exact same result (except as a CSV file) using MVC and LINQ.  It was a snap and I had it done in five minutes.  Unfortunately, the current site is on a Windows 2000 server and makes heavy use of Web Forms and the SSRS viewer controls.  I know I could have simply had the link request the result from an MVC app on another server, but I wanted to keep this all fairly consistent and coherent.
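For the curious, the MVC version boils down to a controller action along these lines (a from-memory sketch with made-up entity and column names, not the production code):

' Hypothetical MVC controller action returning the same data as CSV.
' db is a hypothetical LINQ to SQL data context on the controller.
Public Function DistrictErrors() As FileContentResult
    Dim rows = From e In db.DistrictErrors _
               Where e.DistrictName = Utility.GetDistrictNameForUser() _
               Select e.StudentId & "," & e.ErrorCode & "," & e.Description

    Dim csv As String = "StudentId,ErrorCode,Description" & vbCrLf & _
                        String.Join(vbCrLf, rows.ToArray())

    ' MVC's File() helper sets the content type and attachment headers for us.
    Return File(Encoding.UTF8.GetBytes(csv), "text/csv", "CASEMISErrorReport.csv")
End Function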

Look for me to change this all up next year when I’ve moved our department webpage entirely over to MVC.


User Group Videos Now Available!

Well, after a very long labor of love, videos of the Inland Empire .Net User Group sessions are now available.

Check out the first two at http://www.iedotnetug.org/UG/videos.aspx.

This has been a long time coming.  There is so much great knowledge and content at our user group meetings that it's a real shame it wasn't getting recorded.  If you missed a meeting, you really missed out.  And if you wanted to brush up on a topic you attended several months ago, all you had were your notes, the slides or an email contact.  These are great assets, but nothing like having access to the actual live presentation.

So, I approached James Johnson (the president of the group) about recording the sessions.  He loved the idea.  We got a VGA2USB frame grabber sponsored by Epiphan.  (I'm still working on proper sponsor recognition for our videos.)  The frame grabber lets us record the presenter's actual live desktop without installing software on their machine.

My dream is a combination of live video, live screen capture and live audio.  It worked great for the TDD session with Tom Opgenorth.  Unfortunately, it took me hours and hours to cut the footage together with nice dissolves and such.  Also, my camera is pretty poor in its low-light ability.  So, for now we've abandoned live video.  I think it really adds a lot, so getting it back is in the works.

We've also stepped up to a wireless lapel mic for our speakers.  It's a major improvement in sound quality, which you can hear in the two videos.  The TDD presentation used just a nearby USB mic, whereas the virtualization presentation used the wireless lapel.

So, what are the next steps?  These are not in any particular order.

  • The user group site is moving to Sitefinity from Telerik.  Getting this same page available in Sitefinity is next and is already 99% done.
  • Create a proper intro branding video (like 10 seconds) representing the user group
  • Create a proper sponsorship branding video (like 3-5 seconds) for epiphan
  • Get a decent low-light HD camera for live video.
  • Create a new layout template accommodating live video and screen capture so I don't have to hand-edit the result, much like the PDC videos.  I really like that layout, and it's minimal setup/maintenance.
  • Take over the world! 🙂

That last one is a joke, but I actually am drafting plans to set up a SoCal org that facilitates recording presentations at local user groups and conferences and making them available for public viewing.  Check out my post here for the beginning of this concept.


Copernic Desktop Search – Love it!

About a year ago I was looking for a good desktop search program.  I couldn't stand the built-in search in Windows XP.  I don't have Vista, so I can't speak to that one at all.

Anyway, what I was looking for was the following:

  • Index my email, including sent items
  • Index folders of my choice, including network folders
  • Track emails or files even after they have been moved
  • Fast to search, like in milliseconds
  • Pretty much invisible when indexing
  • Able to index both the item and the contents for multiple file types

The built-in XP search failed many of the points above.  Even with indexing turned on, the search was very slow; I couldn't search my email and files at the same time; Outlook's search was terrible (even when backed by Exchange); I couldn't easily add additional file types to search; and more.

I love Google, so naturally I tried theirs first.  I was pretty disappointed.  I added plug-ins to search non-standard content such as C# files, but this was difficult to set up, and it was hit or miss as to what was found.  Also, I didn't like the web interface.  I felt that much of the screen space was wasted, and it didn't offer much value in the information it returned.  It was also difficult at the time to determine exactly where and what it was indexing.

So, after searching for other, well, search systems (on Google :)), I found Copernic, which had a lot of good reviews, and decided to give it a go.

Boy do I love it! 

  • It's quick and extremely quiet, meaning I can't even tell it's there until I need it.
  • I can search all my content at once (emails, photos, text documents, etc.) or selectively search one category, like my email only.
  • Results are near instantaneous.  It's so fast it searches as I type.
  • It has an auto-complete function.  I can type a person's name or email address into the From box (if I'm searching email) and it shows me a list of valid entries, so I really only have to type a few letters.
  • It has a preview pane.  95% of the time I don't even have to open the actual item.  It shows it to me in the preview and scrolls to the first hit in the document.  For emails it even shows the attachments, so I can open an attachment straight from Copernic rather than having to open the email first.
  • Setting up indexing of non-standard file types is easy.  I just set the extension and whether or not I want it to index the contents (for instance, with .cs files I want the contents, but with photos I may not).
  • I can search network drives.  I hardly ever do this, but with all our documentation on a shared server drive it's invaluable.

I’m sure there are other wonderful desktop search engines out there but I’m sold on Copernic so I doubt I will be looking for a replacement anytime in the near future.

If you're sold on it too and looking for an enterprise solution, Copernic even offers enterprise discounts and integrates with Active Directory.  You can set the configuration via Group Policy.  That's pretty neat when you have a non-technical department: you can set them up to automatically search the files they need on their shared directories.

Love it!
