Map IP Address to a Geographical Location

Here’s a great article on how to get a geographic location from an IP address within SQL Server:
http://www.sqlservercentral.com/articles/SQL+Server/67215/

The article is very easy to follow and gives great direction on setting up a user-defined function in SQL to give you back a location based on the IP address.  Since SQL servers are very good at processing data quickly, this seems like a natural way to get this information.  Once you have the function set up you can easily use it in asynchronous processes like analyzing logs, post-processing of customer data, etc.  You can also set this up as a trigger within SQL or a service callable by outside apps, such as a webapp.
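The heart of that approach is simple arithmetic you can sketch in any language: convert the dotted-quad address into a single integer, then find the range that contains it.  Here's a rough Python sketch of the idea (the ranges below are made up for illustration; the real data comes from the GeoIP table the article has you import):

```python
# Sketch of the core trick behind the article's UDF: convert a dotted-quad
# IP address to one integer, then do a range lookup against a table of
# (start, end, location) rows. The sample ranges are illustrative only.

def ip_to_int(ip):
    """Convert 'a.b.c.d' to a*16777216 + b*65536 + c*256 + d."""
    a, b, c, d = (int(part) for part in ip.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

# Stand-in for the GeoIP table the article imports into SQL Server.
RANGES = [
    (ip_to_int("8.8.8.0"),   ip_to_int("8.8.8.255"),   "United States"),
    (ip_to_int("81.2.69.0"), ip_to_int("81.2.69.255"), "United Kingdom"),
]

def lookup(ip):
    n = ip_to_int(ip)
    for start, end, location in RANGES:
        if start <= n <= end:
            return location
    return "Unknown"

print(lookup("8.8.8.8"))  # United States
```

In the SQL version this loop becomes a `WHERE n BETWEEN start AND end` query, which is why an index on the range columns matters so much for speed.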

I’ve seen this used in a lot of ways that I don’t care for (thanks for letting me know about all the fictional hot girls that live in my area, but I don’t think my wife would approve :)) but there are some legitimate ideas coming around.  For instance, let me power up my iPhone and see what Nerd Dinners are available in my area (work in progress).

Another scenario is blocking spam.  For instance, at my work we service Riverside County in southern California, USA.  We have methods to stop unauthorized users from creating accounts and to block spam on our wikis and such.  But why not use location-based blocks as well?  I know my users are all from Riverside County, so why not block everyone from, say, outside of southern California?  While a user or two may be blocked while attempting to access their work from their vacation in Maui, I don’t think I’d get that much flak from blocking these edge cases.

Serving Dynamic SSRS Excel Formatted Documents Via a MemoryStream

On an old Web Forms app we had a request to allow users to download the data from a formatted SSRS report as an Excel spreadsheet.  They currently view the formatted report using the ReportViewerControl; however, when you use the export feature it exports to Excel with all the formatting of the report, including groups.  This is fairly unusable.  What you really need is a simple spreadsheet with a cell for every value without any grouping.

So, I created a report that had no grouping or formatting and a column for every field of data. 

I could have forced the user to view this ugly report in the ReportViewer and then export it as an Excel spreadsheet but I wanted them to be able to click on a link and get the report directly.  That’s easy enough because you can get an SSRS report formatted as Excel by appending “rs:Format=Excel” to the end of the report url.  This doesn’t work for us, however, for two reasons:

1) The website is accessible to users outside our network and the SSRS server is not.

2) The report retrieves sensitive data filtered by a parameter.  It would be fairly easy for a user to change the parameters in the url to obtain data they shouldn’t be viewing.

In the ReportViewerControl we change the parameters on the server side so the user simply views the generated report with no option to change the parameters.  Now, I needed a way to let them download the Excel version with the same restrictions.

Below is the solution I used, with the help of several other sites on the web.

I created a handler that would build the url with the proper parameters and formatting, retrieve the report and then send it out to the user as an Excel document.  This way, when the user clicks the link for the Excel version of the report, their browser opens a dialog to Open or Save the Excel file.  Works like a charm.

 

Here is the code:

1: Public Sub ProcessRequest(ByVal context As HttpContext) Implements IHttpHandler.ProcessRequest
2: context.Response.Clear()
3: context.Response.BufferOutput = True
4: context.Response.ClearContent()
5: context.Response.ClearHeaders()
6:
7: Dim uri As String = "http://san-destiny/ReportServer?%2fSELPA+Reports%2fDistrict+Errors+Detail+CSV+-+CASEMIS&rs:Format=Excel&DistrictName=" & Utility.GetDistrictNameForUser()
8:
9: Dim request As HttpWebRequest = CType(WebRequest.Create(uri), HttpWebRequest)
10: request.Credentials = CredentialCache.DefaultNetworkCredentials
11:
12: Dim response As WebResponse = request.GetResponse()
13: Dim responseStream As System.IO.Stream = response.GetResponseStream()
14:
15: Dim buffer(4095) As Byte, blockSize As Integer
16: Dim tempStream As New MemoryStream
17: Do
18: blockSize = responseStream.Read(buffer, 0, 4096)
19: If blockSize > 0 Then tempStream.Write(buffer, 0, blockSize)
20: Loop While blockSize > 0
21:
22: context.Response.AddHeader("Content-Type", "application/vnd.ms-excel")
23: context.Response.ContentType = "application/vnd.ms-excel"
24: context.Response.AppendHeader("Content-Disposition", "attachment;filename=CASEMISErrorReport.xls")
25: context.Response.BinaryWrite(tempStream.ToArray())
26:
27: context.Response.Flush()
28: context.Response.End()
29: End Sub

 

The first few lines simply clear any headers or content that may already be set.  This is mandatory when sending files because we are going to set the headers ourselves later.

Line 7 is the url of our report.  Notice that “rs:Format=Excel” is in the url letting SSRS know we want this report as an Excel doc.  Also notice that I set the DistrictName parameter by getting the logged in user’s district name via a utility method.

Line 9 sets up an HttpWebRequest object so that we can make a call to the SSRS server just as you would with a browser. 

Since the SSRS server is on our domain and user restrictions are in place, Line 10 sets the credentials we need.  Our webapp impersonates a user with the correct access so that we can keep the proper security restrictions in place yet allow non-domain users access to the data.  This impersonation is set up in the web.config, and DefaultNetworkCredentials looks there first to get the credential information.
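For reference, the impersonation piece is just a web.config setting.  A minimal sketch (the account name here is a placeholder, not our actual service account):

```xml
<!-- web.config: run the app as a domain account that has rights on the
     SSRS server. DOMAIN\ReportReader is a placeholder account name. -->
<system.web>
  <identity impersonate="true" userName="DOMAIN\ReportReader" password="..." />
</system.web>
```

With this in place, CredentialCache.DefaultNetworkCredentials hands the impersonated account's credentials to the request.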

Lines 12 & 13 simply grab the web response and set up a Stream object to read it with.

Lines 15 – 20 set up a temporary MemoryStream object which holds the bytes read from the response (remember SSRS is sending us an Excel file, which is a binary file, not plain text).

Lines 22 – 28 finally set up the headers and content type, send the bytes and close the stream.

 

I probably didn’t need a MemoryStream object; in the loop I could have written the bytes directly to the context.Response object.  But this seems a little easier to read and potentially debug if necessary.

 

In the end I’d rather use MVC.  In fact I created the exact same result (except as a CSV file) using MVC and LINQ.  It was a snap and I had it done in 5 minutes.  Unfortunately the current site is on a Windows 2000 server and makes heavy use of Web Forms and the SSRS viewer controls.  I know I could have simply had the link request the result from an MVC app on another server but I wanted to keep this all fairly consistent and coherent.

Look for me to change this all up next year when I’ve moved our department webpage entirely over to MVC.


My MIX09 Schedule

Well, soon I’ll be off to MIX09!

I can’t wait to go, but already we’re off to a rocky start.  The first keynote starts at 9am, with registration and breakfast anytime before that.  That’s not bad, but I probably won’t even show up in Vegas until 2am that morning, to say nothing of checking in and finally falling asleep.  This is because we have our Inland Empire .Net User Group meeting the night before.  Tom Opgenorth is presenting on ASP.Net MVC.  He is a great speaker and this is a great topic.  Tom is the guy who got me back into TDD and, after playing with MVC for about 6 months, I’m excited to ask him some detailed questions.  If you missed his presentation on TDD you’re in luck: we recorded it and you can view the entire session on the user group website in the Videos section.  So anyway, the meeting will end about 9:30, so by the time I pack up, hit Starbucks and head out I’ll probably be pulling into Vegas at ~2am.  Rough day when you want to be conscious during the keynote.

Anyway, before all that we had a great plan.  About 8 of us were going to leave the meeting together and caravan to Vegas.  Now, through various schedule changes and some bosses reneging on their approval, we’re down to 5 in the caravan.  No big deal, but it kind of lets the steam out when your “party group” gets cut in half.  But those of us going are still jazzed and ready to hit the town.

To add to the misery, my kids have been pretty sick.  It’s the saddest thing to see your two-year-old just absolutely miserable with a cold. :(  Anyway, he has graciously let me participate: I’m starting to come down with it too.  I hope I am not full-blown sick in Vegas. 🙁

OK, on to the real topic (about time!):

So, as I am architecting a site for my new department in MVC I am hitting most of the MVC talks.  Silverlight 3 is probably one of the main focuses of the event along with Windows Azure.  However, I have no actual business use for Silverlight 3 yet (mainly due to the technical restrictions hampering Silverlight’s adoption for my clients) so that’s just eye candy that I don’t have time for.  Also, most of the Azure talks are fairly elementary, rehashing the same “What is Azure” topics.  I’ve seen most of this stuff and, again, have no real practical use at my paying job for it so I’ll skip most of these as well.  I am checking out the Azure Storage because I’m interested in queues, however, since I’m leaning towards SDS I could easily find myself in another session. 

That leaves me with MVC.  Prior to my move to a new department I had been working with MVC for the last 6 months on a particular project, which coincidentally was for my new department.  Now that I’m in my new position this task has grown.  The entire department site will be hosted in MVC with several sub-apps based on various technologies, including MVC.  So, naturally, I’m hitting most of these sessions.

I’m the dev guy who puts a high emphasis on “usability” but wish I could design better.  So, for Friday afternoon I’m not sure whether to go for the “Advance Your Design with UX Design Patterns” session (Quince looks like a great tool!) or the mini sessions on Infographics and Microformats.  I don’t care about the “UI discussion” much, so I’ll probably hit the UX Design Patterns session at 12:30 and then skip out for the 12:55 mini session, or I may just stay for the whole thing.  Whatever I miss I can catch later from the online recordings.

Here’s my schedule for those of you who happen to be interested:

Day 1 – Wednesday

Time Activity
9:00 AM Keynote – Bill Buxton & Scott Guthrie
11:30 AM RESTful Services for the Programmable Web with WCF – Ron Jacobs
12:45 PM Lunch
2:15 PM ??? – Not sure what I want to go to yet.  Have an idea? 🙂
4:00 PM How’d they do it? Real App. Real Code. Two Weeks. Nothing but .NET – Scott Hanselman
6:00 PM Attendee Party at TAO Las Vegas

Day 2 – Thursday

Time Activity
9:00 AM Keynote – Deborah Adler & Dean Hachamovitch
10:30 AM Windows Azure Storage – Brad Calder
11:45 AM Lunch
1:00 PM Securing Web Applications – Eric Lawrence
2:30 PM File|New -> Company: Creating NerdDinner.com with Microsoft ASP.NET Model View Controller (MVC) – Scott Hanselman
4:15 PM ASP.NET MVC: America’s Next Top Model View Controller Framework – Phil Haack
6:00 PM Movie Screening: Objectified

Day 3 – Friday

Time Activity
9:00 AM Microsoft ASP.NET Model View Controller (MVC): Ninja on Fire Black Belt Tips – Phil Haack
10:45 AM There’s a Little Scripter In All of Us – Building a Web App For the Masses – Rob Conery
12:00 PM Lunch
12:30 PM Advance Your Design with UX Design Patterns – Ambrose Little
12:30 PM User Experience Design for Non-Designers – Shawn Konopinsky
12:55 PM Effective Infographics with Interactivity – Joshua Allen
1:20 PM Oomph: A Microformat Toolkit – Tim Aidlin
2:00 PM Building High Performance Web Applications and Sites – John Hrvatin

 


User Group Videos Now Available!

Well, after a very long labor of love, videos of the Inland Empire .Net User Group sessions are now available.

Check out the first two at http://www.iedotnetug.org/UG/videos.aspx.

This has been a long time coming.  There is so much great knowledge and content at our user groups that it’s a real shame that it wasn’t getting recorded.  If you missed a meeting you really missed out.  And if you wanted to brush up on a topic you attended several months ago all you had were your notes, slides or an email contact.  These are great assets but nothing like having access to the actual live presentation.

So, I approached James Johnson (the president of the group) about recording the sessions.  He loved the idea.  We got a VGAUSB2 VGA frame grabber sponsored by Epiphan.  (I’m still working on proper sponsor recognition for our videos.)  The frame grabber allows us to record the actual live desktop content without installing software on the presenter’s machine.

My dream is having a combination of live video, live screen capture and live audio.  It worked great for the TDD session with Tom Opgenorth.  Unfortunately it took me hours and hours cutting the footage together with nice dissolves and such.  Also, my camera has pretty poor low-light ability.  So, for now we’ve abandoned live video.  I think it really adds a lot, so getting it back is in the works.

Now we’ve also stepped up to a wireless lapel mic for our speakers.  It’s a major improvement in sound quality, which you can hear in the two videos.  The TDD presentation used just a nearby USB mic, whereas the Virtual presentation is with the wireless lapel.

So, what are the next steps?  These are not in any particular order.

  • The user group site is moving to Sitefinity from Telerik.  So, getting this same page available in SF is next and already 99% done.
  • Create a proper intro branding video (like 10 seconds) representing the user group
  • Create a proper sponsorship branding video (like 3-5 seconds) for epiphan
  • Get a decent low-light HD camera for live video.
  • Create a new layout template accommodating live video and screen capture so I don’t have to actually edit the result, much like the PDC videos.  I really like that layout and its minimal setup/maintenance.
  • Take over the world! 🙂

That last one is a joke, but I’m actually drafting plans to set up a SoCal org that facilitates recording of presentations at local user groups and conferences and making them available for public viewing.  Check out my post here for the beginning of this concept.


Instant Marketability of our Skills

I have a friend who used to be a developer for a company.  Unfortunately due to these times we live in he is now looking for work.  He’s doing the resume/networking game but it’s still a tough world.

This got me thinking: how instantly marketable are our skills?  What I mean by this is how readily can we market ourselves and possibly jump projects at a moment’s notice, delivering quick results? 

The last time I went to look for a job it took a significant amount of time to assess my skills, update my resume and start looking for work.  Usually I’d have to draft three or more resumes depending on the job I was applying for, such as .Net developer, DBA, report designer, network engineer, etc.  Next, getting a “portfolio” of my work together took a lot of time as well.  I hadn’t properly kept track of projects and my accomplishments, so I didn’t have much to show once potential employers asked me for details beyond my resume.

In most cases I’d say we need to market our abilities as quickly as possible.  This not only serves as a way for companies to quickly assess who we are and what our skills are, it also helps you get contract jobs to pay the bills.  Contract jobs could be single-day jobs or more formal affairs lasting several weeks or months; however, these can take a long time to procure.  This is where contracting agencies come into play.  Quick-paying gigs can be found on freelancer sites like guru.com, RentACoder.com, etc. or classified ad sites like Craigslist.org.

The more “instantly marketable” we are and can deliver results the better chance we have of finding permanent work as well as side jobs keeping the paychecks coming in.  Plus, once you have several completed projects under your belt you have greatly increased your networking pool as well as your project portfolio.

As a comparison, my neighbor has been in construction for years and manages crews on large projects.  If he were to lose his job he has instantly marketable skills.  He can be available at a moment’s notice to build or fix almost anything within his field.  If need be, he could even stand on a street corner hoping to jump on a “day” crew.  Now, I’m a pretty decent handyman, but I don’t know anything about pouring concrete, running compliant building electrical or driving a Bobcat.  His skills are instantly marketable and he can deliver results instantly.

What I mean by quick or instant results is being able to complete some task within a day.  While my neighbor may be used to year-long contracts putting up a complex of buildings, he can instantly market his skills and provide a day-long project for an impromptu customer, such as a backyard shed or a new sprinkler irrigation system.  I’ll call these “day jobs”. 

In comparison we, as software developers, have some great advantages and some great disadvantages, many of which are self-imposed. 

On one hand we have almost no commodity overhead.  For an irrigation job my neighbor may have to go to the nearest hardware store to purchase pvc pipe, joints, sprinklers, a timer, etc.  If I’m doing a web project a quick few clicks will get me a site with a database and almost anything else I need.  I don’t even have to leave my chair!

However, my neighbor knows how to design, purchase parts for and put in a sprinkler system in less than a day.  Do I know how to spec, design and deliver a software dev project in less than a day?  Most of the time I’d say no.  Even “simple” projects, as I’m used to doing them, have defined stages: gathering requirements, establishing timelines and priorities, architecting the backend, architecting the db, designing the UI, drafting the implementation, customer review, back to architecture for bugs/features/the next milestone, rinse and repeat.  Even just the requirements/architecture phase can take several days.

Granted, most dev projects are not one day sprinkler systems, but how do we find/get the one day jobs?

What is a software “day job”?  I would liken these to ads for work like “Make changes to website”, “Add textbox to customer info screen”, “fix CSS issue”, etc.  These are often quick jobs requested by companies or individuals that do not have the expertise in house.  Many times they have purchased a website from a company and they need changes.  For whatever reason they are farming the work out rather than going back to their original designer.  These may also be dev companies that have purchased tools but are now running into roadblocks, like a website company just getting into AJAX.  Depending on your expertise they could even be complex projects that you, with your skills, could solve in a day, like “Add Paypal payment option to shopping cart”, “Add contact database to website”, “Update SharePoint webpart” or “Add PDF email capabilities to SSRS reports”.

Note, common sense will need to be used when looking into these types of jobs.  If you see something like “Need a site similar to ebay to auction off xyz.  Shouldn’t take very long” you can tell that a) this is not a “day job” and b) the customer has no idea what they are asking for.  Avoid these like the plague.  The key to day jobs is getting requirements that can be easily defined and met, and working with customers that know exactly what they want, even if they can’t articulate it well the first time.  A job like “fix this xyz css error on IE7” is fairly easy to define, test and establish as a guideline for completion.  The customer obviously knows what they want.

First of all, why would we want day jobs?  Here are my quick reasons:

  • Day jobs are “easy”.  After all they can be done in a day.  Do it and move on.
  • Requirements are fairly simple. They usually contain one or two definable goals (notice I did not say well defined. :))
    As I said above, common sense will need to be exercised.  You want to find clients that are pretty sure of what they want and requirements that are easy to define.
  • Pay is quick.  Working through a freelance site or through Craigslist you can often get paid upon deliverable.  No waiting for the next payroll cycle.
    A quick note.  Make sure you exercise all caution.  Using a freelance site like guru.com allows you to use a broker for all funds and offers resources for dispute management.  If you do a job on Craigslist it is fairly easy for a customer to stiff you (and you them) so don’t enter into these without caution.
  • Little ongoing support.  You aren’t developing the next Facebook.  If you’re just adding a JavaScript alert box, once it works it works (assuming this has been tested to work against the defined requirements).  If they change their site and your fix breaks or is no longer relevant, that is a separate and unrelated job.  If their web server goes down you’re not the one they call.
  • Chances for rehire/reputation.  Once you do a few quick jobs for a company they may be interested in giving you more long term projects.  These may be “ideas” that they’ve bounced around but have never had the resources to accomplish.  Now with you they have a reliable programmer that they can contact for these things.  Freelancer sites often also have reputations and ratings.  When you do gigs you are allowed to rate your clients and they are allowed to rate your work.  Having a reputation for good work and quick results may get you on the radar for other clients and larger contracts.  It also helps your chances to get a “sale”.

For your consideration/criticism here are my current tactics for my instant marketability.

Note that I currently have a job, so I don’t often go looking for these day jobs.  But on the other hand, when I do look for side work day jobs are the majority of what I look for.

  1. Establish a single point of reference for marketing yourself
    This will most likely be a website with your development “portfolio”.  Having too many sites makes it difficult for companies to find all your information.  My site, http://MattPenner.info, is for this exact purpose.  I have my tech blog as well as pages detailing my speaking engagements, resume PDF, experience, projects, etc.  Granted, most of it isn’t set up yet.  I’m about 6 months into this.  :)  As I said, I have a job.  That lack of urgency, along with two little boys and a girl on the way, means that my site isn’t moving as fast as I’d like.
    However, the grand scheme is to create a single location where anyone can see what I am interested in, what my skills are, what projects I’ve worked on and, hopefully, the quality of those projects.  I envision having my resume completely built out as a wiki.  I hate the current restriction of paper-based resumes, but that is a necessary evil.  If I can get prospective clients/employers to look at my wiki resume I plan to have everything about me in there.  If they don’t care about my work with .Net then they can skip that part.  If they like my work on Crystal Reports they can click deeper to find out the nitty-gritty details along with screenshots.
    That’s the plan anyway. 🙂
  2. Put marketable information on your site
    What I mean by this is make sure your site shows exactly what you can do.  Don’t just have a blog about your interests.  Have your current resume along with any details of jobs you’ve done.  If you are a part of any community groups (user groups, open source communities) list these.  If possible list examples of code and/or screenshots of your work.  This offers very tangible details on your skills that simply cannot be duplicated in a paper resume.  If you fixed a bug show the code.  If you developed a great jquery plugin show the code and a screenshot.  Better yet, show a working demo!
  3. Find really quick jobs that you know you can do
    If you are an expert in CSS then find a very quick CSS job.  I don’t care if it’s boring.  If it’s a small job and you’re good at it you will quickly deliver a high quality product to a customer.  There’s nothing like a customer saying, “Wow, this works great!  I never expected it so fast!”
  4. Don’t look for something you’d “like” to do
    I highly stress this point.  This is really a corollary of the point above.  If you start getting into jobs you aren’t 100% comfortable in then you start losing time on things like research, weird bugs, etc. which can possibly result in delayed deliverables, or worse yet, faulty products.  Even if you do deliver a high quality product, chances are you spent more time than it was worth.  You could save the world, but if it took you 1 week, the client had 2 days, and you only got $200, was it worth it? 
    Once you have established a decent set of jobs, a reputation, and have time to play you can go after these jobs.  Until then, consider these off limits.
  5. Deliver a live working solution with your bid
    Often the client has an ad out for the job because they can’t do it themselves and, quite likely, has gone through grief with their current developer trying to make it work.  I did this on my last two bids.  I took the requirements (again, find those easy to define requirements) and developed a complete solution in a few hours.  I then posted my bid for the project showing the client that it was already done.
    Talk about a quick and high quality deliverable!  The client could see it was done to their specification and instantly deliverable!
    As an example one of the projects was the layout and sorting of a data table for an aspx page.  I recreated their UI, added my changes to the backend with fake database data, and hosted the site.  The client was able to freely play with a live sample.
    One caution, if this takes you too long you may miss out on the bid if the client selects someone else.  If you feel this might be the case it would be best to bid first, work on it, and then amend your bid with the deliverable.  That way you at least get your name in the pot of potential vendors.
  6. Don’t give away the farm.  My last bid for a day job required a change to a visual effect in the SpryEffects JavaScript library that is available in Dreamweaver CS3.  If I delivered a fully usable product with my bid, their browser would download my JavaScript file to their computer upon loading the webpage.  If they were of any technical ability they could just plug this file into their code and be on their merry way.  I would have given them a free working solution and I’d have no money to show for it.
    So instead I used a screen capturing utility to record me demoing the fix in FireFox 3 and IE7.  I then converted the video to Flash, uploaded it to my website and sent in my bid with a reference to the video demo of a fully working product.

I’d love to hear comments from others on this and ideas they may have.

Goodbye to Code Camp 2009

Well, unfortunately I can’t stay for the remaining sessions.  It’s my mother’s birthday party tonight and I doubt I’d make many points by missing it.  😉

It’s been a great weekend with lots of topics and networking.  I’m eagerly waiting for San Diego and LA Code Camps.  Hopefully by then we’ll have a solution ready for recording the session.

Take care all!


SQL Server Integration Services Control Flows

Bret Stateham is giving this talk.  He is a great presenter.  Very animated and speaks clearly.  He also explains complicated details very easily in a friendly manner.  If you have a chance to hear this guy speak definitely take advantage of that!

Boy I wish I had a recording of this.  I’d love my two DBA colleagues to see this.  All but two of our SQL servers are 2000, simply because that was the latest version available when the systems were installed.  It works great but you don’t have access to the great tools, such as SSIS, in 2005 or 2008.  I’ve pinged Lynn Langit and Daniel Egan about possibly being a resource for recording these sessions in the future.  You can read about the initial details in my blog post here.  <update>OK, here’s an update.  I’m 45 minutes into the session and Bret is hilarious!  He definitely makes this topic a lot of fun by his passion and animated style.  Lots o’ fun!</update>

This is really an intro to SSIS, which is exactly what I need.  His Data Flow Task talk is the next session. 

One data source we’re going to need is an SFTP site.  Currently this isn’t in SSIS, but we do have a vendor’s connector from n Software.  This will be fun to try out.  If for some reason it doesn’t work out, Bret said we can create one through the Script Task or even a formal task developed in C#.  I may end up doing something like this to hit our GIS server using REST.  I did see a Web Task and it’s quite possible that it can consume REST already.  This means that we’d have to consume XML and possibly convert it to some other usable format if necessary.

Bret just gave a great tip that could have frustrated me for a while.  The variables you create are scoped.  If you have a task selected when you create a variable, that variable will be scoped to that task, meaning it will only be available to that task and will disappear after that task is finished.  Chances are you want your variable to be accessible by other tasks in your package.  In that case make sure you click an empty area of your design surface prior to creating the variable.  That will scope it to the package itself.

In SSIS 2005 you could only write Scripts in VB .Net, however, in 2008 you have the ability to use C#.

Man!  Debugging in SSIS is awesome!  I can’t wait to dive into this stuff at work.

Loops, Sequence Containers and Groups:

Loops are exactly what they sound like.  They allow you to take a set of tasks and loop through them, much like a subroutine.

Sequence Containers do not loop.  They do, however, offer scope and environment properties.  You can define variables that are only available to that sequence.  You can also set transaction properties on an individual container.  Maybe you have several different containers, each with their own transaction level.  Pretty neat.

Groups are purely a design feature.  They are simply there for organizing the items on your design surface but have no properties and no impact on the package at all.

The For Each Loop allows you to specify the collection you’re picking items from.  The possibilities include XML objects (for each node) and even SMO (SQL Management Objects) for things like “for each server do this”.  That’s pretty cool.

Bret showed a great example where he needs to process files every day that may not have the same filename.  For instance, this may be an AS400 dumping CSV files that are time-stamped.  He generates what the filename should be in a Script Task, saves that into a variable, and then his File Task uses that variable as its connection string.  Sweet.  We need to do this as well and this would really help.

This was a great talk.  SSIS is definitely something I’ve been looking to get into ever since SQL 2005.  It looks like this is something I can dive into right away.  For the more complex things Bret recommended the book Professional SQL Server 2005 Integration Services from Wrox.

Unfortunately I can’t attend Bret’s next session since I’m attending Mike Roth’s Asterisk session.  Mike and I are working on a side project together using Asterisk, so I should give audience support and also learn what the heck we’ll be using! 🙂

 


A Journey Through SQL Server Analysis Services

This is given by Ben Aminnia. 

He’s really putting an emphasis on planning, which is great.  Lynn Langit said her new book has the first 5 chapters devoted to what needs to be done before you even open BIDS.

Ben has an interesting presentation style, at least for this talk.  He’s giving it as though his company has an existing Analysis Services project that we are interviewing to possibly take over.  I’m not quite sure yet whether we are on the driving end (we are really interviewing Ben to see if we want to take the job) or on the other end (he is interviewing us to see if we’re qualified to take the job).  I hope it’s more of the former, since I’m at the talk because I don’t know anything about SSAS.

While he is using an interview style here at Code Camp, he is assuming we don’t know about SSAS, so that’s a good strategy to take.  He just gave a really good 4 minute background on Analysis Services in general and what a cube is (since I raised my hand because I didn’t know! :))

During this talk Ben is using the example of a system that monitors disk space availability (among other data).  He actually has a great real world system for doing just this. It is based on the cover story of the February 2008 issue of SQL Server Magazine. You can find the online version here.  It’s a very complex SSIS package but it allowed Ben to do this without having to spend months developing it.  For him it works great and is easy to maintain. Ben has even made his own additions and discussed these changes with the original author. If you contact Ben directly (ben at sql.la) he can give you his additions.

Great talk!


Recorded sessions from Code Camp?

I record the monthly presentations at the Inland Empire .Net Users Group.  I’m halfway done developing the distribution area for our group’s website.  This is taking a while because I have no free time!  J/K, I’m also guilty of putting in the kitchen sink.  I’m creating it so that users will have a single place for all session content, including a Silverlight player for video, access to downloads such as slides and code, as well as ratings.  It will be nice, if I ever get it done!

Anyway, back to the topic.  Is anyone interested in recorded sessions of Code Camp?  I know I sure am.  There are so many sessions by so many great speakers that it is impossible to see them all.  Why not record them and make them available after the show?

I’ve volunteered to Lynn Langit and Daniel Egan, who I know have a vested interest in Code Camp, to do the work.  I just need funding for the resources.  It’s not expensive, but it’s not trivial either.  The setup I currently use for our IE .Net sessions is a VGA frame grabber from Epiphan (great equipment!), a wireless mic and my laptop.  That’s it.  Turns out when I went to the ESRI User Conference in San Diego they do the exact same thing, although they use Macs and a few mixers since they also have PA systems for the presenters.  They sell their week’s worth of presentation recordings on DVDs for ~$400.  I’d like to make this content freely available for viewing on the web, much like PDC.

This would be relatively easy.  I could get a few bodies to help set up and keep things running smoothly.

Code Camp 2009 has at most 9 simultaneous sessions.  Assuming this doesn’t grow, I’d need to purchase and put together 9 recording “kits”.  If we went the “inexpensive” route, without any vendors kicking in free or discounted gear, we could probably build a kit for a little less than $1,000.  That would cover the frame grabber (VGA2USB model), a decent but inexpensive laptop, a wireless mic and miscellaneous cabling gear.  We’d get 5-10 frames per second at 1024×768, which would be adequate for 95% of the presentations.  So we’re looking at financing ~$10k of equipment, allowing for one spare.  Any takers? 🙂


Code Camp 2009 – Day 1 Over

Well, this is definitely a blast.  Getting a lot of great info and great contacts.  Unfortunately I wasn’t able to attend the Geek Dinner or any other after party.  My niece is moving to North Carolina and tonight was her going-away party. <sniff!> I guess that’s worth missing the Geek Dinner for. 🙂

Tomorrow’s another fun filled day!
