Goodbye to Code Camp 2009

Well, unfortunately I can't stay for the remaining sessions.  It's my mother's birthday party tonight and I doubt I'd score many points by missing it.  😉

It's been a great weekend with lots of topics and networking.  I'm eagerly waiting for the San Diego and LA Code Camps.  Hopefully by then we'll have a solution ready for recording the sessions.

Take care all!

SQL Server Integration Services Control Flows

Bret Stateham is giving this talk.  He is a great presenter: very animated, speaks clearly, and explains complicated details in an easy, friendly manner.  If you have a chance to hear this guy speak, definitely take advantage of it!

Boy I wish I had a recording of this.  I'd love my two DBA colleagues to see it.  All but two of our SQL servers are 2000, simply because that was the latest version available when the systems were installed.  They work great, but you don't get access to the newer tools, such as SSIS, that ship with 2005 and 2008.  I've pinged Lynn Langit and Daniel Egan about possibly being a resource for recording these sessions in the future.  You can read about the initial details in my blog post here.  <update>OK, here's an update.  I'm 45 minutes into the session and Bret is hilarious!  He definitely makes this topic a lot of fun with his passion and animated style.  Lots o' fun!</update>

This is really an intro to SSIS, which is exactly what I need.  His Data Flow Task talk is the next session. 

One data source we're going to need is an SFTP site.  Currently this isn't built into SSIS, but we do have a vendor's connector from n Software.  This will be fun to try out.  If for some reason it doesn't work out, Bret said we can create one through the Script Task, or even as a formal task developed in C#.  I may end up doing something like this to hit our GIS server using REST.  I did see a Web Service Task, and it's quite possible that it can consume REST already.  That means we'd have to consume XML and possibly convert it to some other usable format if necessary.
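
Just to get my head around it, here's a rough sketch of the kind of C# I'd expect to drop into a Script Task (shown as a standalone snippet) to pull XML back from a REST endpoint.  The URL and service are made up, so treat this as the shape of the idea rather than our actual implementation:

    using System;
    using System.IO;
    using System.Net;
    using System.Xml.Linq;

    class RestProbe
    {
        static void Main()
        {
            // Hypothetical GIS REST endpoint that returns XML.
            var url = "http://gisserver.example.com/rest/services?f=xml";

            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "GET";

            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                // Load the XML payload so a later task (or Data Flow) can reshape it.
                var doc = XDocument.Load(reader);
                Console.WriteLine(doc.Root.Name);
            }
        }
    }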

Bret just gave a great tip that could have frustrated me for a while.  The variables you create are scoped.  If you have a task selected when you create a variable, that variable will be scoped to that task, meaning it will only be available to that task and will disappear after that task is finished.  Chances are you want your variable to be accessible by other tasks in your package.  In that case, make sure you click an empty area of your design surface prior to creating the variable.  That will scope it to the package itself.

In SSIS 2005 you could only write scripts in VB.NET; in 2008 you also have the ability to use C#.

Man!  Debugging in SSIS is awesome!  I can't wait to dive into this stuff at work.

Loops, Sequence Containers and Groups:

Loops are exactly what they sound like.  They allow you to take a set of tasks and loop through them, much like a subroutine.

Sequence Containers do not loop.  They do, however, offer scope and environment properties.  You can define variables that are only available to that sequence.  You can also set transaction properties on that individual container.  Maybe you have several different containers, each with their own transaction level.  Pretty neat.

Groups are purely a design feature.  They are there for organizing the items on your design surface but have no properties and no impact on the package at all.

The For Each Loop allows you to specify the collection you're picking items from.  The possibilities include XML objects (for each node) and even SMO (SQL Management Objects) for things like "for each server, do this".  That's pretty cool.

Bret showed a great example where he needs to process files every day that may not have the same filename.  For instance, it might be an AS/400 dumping CSV files that are timestamped.  He generates what the filename should be in a Script Task, saves that into a variable, and then his File Task uses that variable as its connection string.  Sweet.  We need to do this as well, and this would really help.
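
For my own notes, here's roughly what the C# inside that kind of Script Task might look like in 2008.  The variable name and drop folder are mine, not Bret's, so this is just the pattern:

    // Inside an SSIS 2008 Script Task.  Assumes a package-scoped string variable
    // User::CurrentFileName has been added to the task's ReadWriteVariables list.
    public void Main()
    {
        // Build today's expected file name, e.g. "EXPORT_20090125.csv".
        string fileName = string.Format("EXPORT_{0:yyyyMMdd}.csv", DateTime.Now);

        // Hypothetical drop folder; downstream tasks read this variable.
        Dts.Variables["User::CurrentFileName"].Value = @"\\as400drop\exports\" + fileName;

        Dts.TaskResult = (int)ScriptResults.Success;
    }

The file connection manager can then use a property expression to set its ConnectionString to @[User::CurrentFileName], which is what makes the whole thing dynamic.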

This was a great talk.  SSIS is definitely something I’ve been looking to get into ever since SQL 2005.  It looks like this is something I can dive into right away.  For the more complex things Bret recommended the book Professional SQL Server 2005 Integration Services from Wrox.

Unfortunately I can’t attend Bret’s next session since I’m attending Mike Roth’s Asterisk session.  Mike and I are working on a side project together using Asterisk, so I should give audience support and also learn what the heck we’ll be using! 🙂

 

A Journey Through SQL Server Analysis Services

This is given by Ben Aminnia. 

He's really putting an emphasis on planning, which is great.  Lynn Langit said the first five chapters of her new book are devoted to what needs to be done before you even open BIDS.

Ben has an interesting presentation style, at least for this talk.  He's giving it as if his company has an existing Analysis Services project that we are interviewing to possibly take over.  I'm not quite sure yet whether we're on the driving end (we're interviewing Ben to see if we want the job) or on the other end (he's interviewing us to see if we're qualified to take it).  I hope it's more the former, since I'm at this talk because I don't know anything about SSAS.

While he's doing an interview style here at Code Camp, he's assuming we don't know about SSAS, so that's a good strategy to take.  He just gave a really good 4-minute background on Analysis Services in general and what a Cube is (since I raised my hand because I didn't know! :))

During this talk Ben is using the example of a system that monitors disk space availability (among other data).  He actually has a great real world system for doing just this. It is based on the cover story of the February 2008 issue of SQL Server Magazine. You can find the online version here.  It’s a very complex SSIS package but it allowed Ben to do this without having to spend months developing it.  For him it works great and is easy to maintain. Ben has even made his own additions and discussed these changes with the original author. If you contact Ben directly (ben at sql.la) he can give you his additions.

Great talk!

Recorded sessions from Code Camp?

I record the monthly presentations at the Inland Empire .Net Users Group.  I'm halfway done developing the distribution area for our group's website.  This is taking a while because I have no free time!  J/K, I'm also guilty of putting in the kitchen sink.  I'm creating it so that users will have a single place for all session content, including a Silverlight player for video, downloads such as slides and code, and ratings.  It will be nice, if I ever get it done!

Anyway, back to the topic.  Is anyone interested in recorded sessions of Code Camp?  I know I sure am.  There are so many sessions by so many great speakers that it is impossible to see them all.  Why not record them and make them available after the show?

I've volunteered to Lynn Langit and Daniel Egan, who I know have a vested interest in Code Camp, to do the work.  I just need funding for the resources.  It's not expensive, but it's not trivial either.  The setup I currently use for our IE .Net sessions is a VGA frame grabber from Epiphan (great equipment!), a wireless mic and my laptop.  That's it.  It turns out that when I went to the ESRI User Conference in San Diego they did the exact same thing, although they used Macs and a few mixers, since they also had PA systems for the presenters.  They sell their week's worth of presentation recordings on DVDs for ~$400.  I'd like to make this content freely available for viewing on the web, much like PDC.

This would be relatively easy.  I could get a few bodies to help set up and keep things running smoothly.

Code Camp 2009 has at most 9 simultaneous sessions.  Assuming this doesn't grow, I'd need to purchase and put together 9 recording "kits".  If we went the "inexpensive" route, without any vendors kicking in free or discounted gear, we could probably build a kit for a little less than $1,000.  That would cover the frame grabber (VGA2USB model), a decent but inexpensive laptop, a wireless mic and miscellaneous cabling.  We'd get 5-10 frames per second at 1024×768, which would be adequate for 95% of the presentations.  So, at roughly $1,000 per kit for 9 kits plus one spare, we're looking at the need to finance ~$10k of equipment.  Any takers? 🙂

Code Camp 2009 – Day 1 Over

Well, this is definitely a blast.  Getting a lot of great info and great contacts.  Unfortunately I wasn’t able to attend the Geek Dinner or any other after party.  My niece is moving to North Carolina and tonight was her going away party. <sniff!> I guess that’s worth missing the Geek Dinner for. 🙂

Tomorrow’s another fun filled day!

What’s new in SQL Server 2008 Analysis Services

Lynn Langit is presenting this one.  Her main career focus is Business Intelligence, and she has a book coming out.  I'll have to pick it up.

This is an area that has always fascinated me, yet I don't know anything about it other than simple data reporting.  Working at the Val Verde Unified School District we have vast amounts of data and could really take advantage of BI.  Hopefully I'll be able to use some of this with our new SQL 2008 install and start introducing it to various departments.  Maybe we could even use it in IT to analyze our support trends.  That would be sweet!

Excel has a great add-in that acts as a client for BI data.  Excel is one of the primary tools used at our organization, especially by our business office, so this would be an easy sell.

Wow!  Lynn just showed how she took her Excel spreadsheet of local data and analyzed it using Microsoft's own services.  She asked Excel to analyze it; it reported she wasn't connected to SSAS; she told it to use Microsoft's data services in the cloud; and it sent the results back.  This took only a few seconds.  Yes, you are sending your data to Microsoft, and yes, this service is only free during beta testing, but this is amazing.  It could really open doors for a cheap introduction to SSAS and BI in general.

Check out SQL Server Data Mining in the Cloud for details on how to do this.

At VVUSD we have the mentality that we'd rather pay up front and host our own services, but using this might be a great way to sell research in this area in the beginning.

Lynn gave a great explanation of SQL Data Services.  It is SQL Server optimized for performance and high availability, so many features have been removed and interaction has been restricted.  There are schemaless containers via Authorities, Containers and Entities (think property bags).  Also, the only query language currently available is LINQ.  This all may change, but that's the current state.

Wow!  Once you have a validated and fairly good model, Lynn showed, it can be used in real time.  You can hit your model in real time, such as from a form your sales guy is using, and get an instant prediction about your current state.  For instance, if you're selling bikes, and in the first 3 questions your sales guy finds that this potential customer is in the bottom 10% of likelihood to buy a bike, they can thank the customer and hang up right there.  Instantly, in real time, they have found out the likely result of their work.

We could use this in our school district and, based on grades, get instant feedback as to how successful a student might be.  Education organizations spend millions in this area, so I think our work would be cut out for us! :)  In the IT dept, for example, we could possibly predict instantly the support costs we'll incur for a department (or specific user) and hardware (laptops, phones, etc.) based off of previous support calls.  So if a user who travels a lot and is particularly rough with their equipment asks for a new laptop, we may find that it's more cost effective to buy a better warranty or a tougher laptop (or deny the request altogether).
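
I haven't written any of this yet, but from what Lynn described the real-time piece sounds like a singleton DMX prediction query fired from application code.  Here's a minimal sketch with completely made-up server, model and column names (the bike example, since that's the classic one):

    using System;
    using Microsoft.AnalysisServices.AdomdClient;

    class RealTimePrediction
    {
        static void Main()
        {
            // Server, catalog, model and column names are all hypothetical.
            using (var conn = new AdomdConnection("Data Source=localhost;Catalog=SalesMining"))
            {
                conn.Open();

                // Singleton DMX prediction: feed in the answers to the first few
                // questions and ask the model how likely this caller is to buy.
                var cmd = new AdomdCommand(
                    @"SELECT Predict([Bike Buyer]), PredictProbability([Bike Buyer])
                      FROM [BikeBuyerModel]
                      NATURAL PREDICTION JOIN
                      (SELECT 35 AS [Age], 2 AS [Number Cars Owned]) AS t",
                    conn);

                using (var reader = cmd.ExecuteReader())
                {
                    if (reader.Read())
                        Console.WriteLine("Will buy: {0}, probability: {1}",
                            reader.GetValue(0), reader.GetValue(1));
                }
            }
        }
    }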

This is definitely a large area (full careers in themselves) but something that would be worth spending some time on if possible.

OK, I’m definitely going to have to pick up a book.  This is all so new to me that most of this is over my head, but the demos are absolutely amazing.

Real World SQL Data Services

James Johnson is hosting this one.

The interesting thing about SDS is that it sounds more like property bag storage.  That brings a whole host of questions, like how you manage millions of rows, indexing, querying, etc.

The Entity itself is the property bag, which lives within a Container.  Each Container is contained within an Authority.  It really sounds like an Authority relates to a database, a Container relates to a table and the Entity is the item.  That's exactly how we shouldn't think about it, though! :)
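
To make the property bag idea concrete for myself, here's a rough mental model in C# (entirely my own sketch, not the SDS API).  The point is that the "schema" lives on each Entity rather than on the Container, which is why the table analogy falls apart:

    using System.Collections.Generic;

    // My rough mental model of an SDS Entity: an Id plus a loose bag of
    // name/value properties, not a fixed set of columns.
    class Entity
    {
        public string Id { get; set; }
        public Dictionary<string, object> Properties = new Dictionary<string, object>();
    }

    class Demo
    {
        static void Main()
        {
            // Two entities in the same Container can carry completely different
            // properties, which a table would never allow.
            var server = new Entity { Id = "srv-01" };
            server.Properties["DiskFreeGb"] = 120;
            server.Properties["Location"] = "HQ";

            var alert = new Entity { Id = "alert-9" };
            alert.Properties["Severity"] = "High";
        }
    }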

I’m assuming that once in production the containers will be distributable and “replicatable” across datacenters.  If you also have your app and services in the cloud then you could really take advantage of multiple instances and distribution across the globe.  This would really be amazing.

James suggested checking out Mike Amundsen's SDS utility.  It allows you to create and view your Authorities and see what Containers and Entities they contain.  You can also see the raw XML response.  One nice thing this shows is that SDS is fairly responsive, as it was a pretty smooth-running app.

SDS uses XML (through REST) to transfer all its data.  LINQ to XML and XPath should be very useful for processing it.  I wonder what the best way is to hydrate this data into an actual object that you can pass around.
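
My first guess is something along these lines with LINQ to XML.  The XML shape below is illustrative only, not the actual SDS wire format, and the types are mine:

    using System;
    using System.Linq;
    using System.Xml.Linq;

    class HydrateEntities
    {
        static void Main()
        {
            // Stand-in payload; the real SDS response format will differ.
            var xml = XElement.Parse(@"
                <Entities>
                  <Server><Id>srv-01</Id><DiskFreeGb>120</DiskFreeGb></Server>
                  <Server><Id>srv-02</Id><DiskFreeGb>45</DiskFreeGb></Server>
                </Entities>");

            // LINQ to XML projects each element into a strongly typed object.
            var servers = from e in xml.Elements("Server")
                          select new
                          {
                              Id = (string)e.Element("Id"),
                              DiskFreeGb = (int)e.Element("DiskFreeGb")
                          };

            foreach (var s in servers)
                Console.WriteLine("{0}: {1} GB free", s.Id, s.DiskFreeGb);
        }
    }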

There were some questions as to whether this data is also available as JSON, since it's ultimately handled by WCF.  Since your authentication credentials (token and password) are required to access the data, I don't see this being done over an open channel, such as through AJAX on the user front end.  Since this will most likely be used entirely within the .Net back end, I don't see a reason to use anything other than XML.

James wasn't able to upload a BLOB and its metadata in the same Entity.  He had to create/upload one Entity with the EntityID and the file data, and then another with the metadata and a "foreign key".  I wonder if this is really a limitation of SDS.  If so it will probably be addressed before RTM, but in the meantime maybe a multipart XML upload would work.  This may also be a limitation of REST and WCF, where again a multipart message body would probably work.

So now James has these two containers which need to be "joined" together.  He downloaded both containers and then used LINQ to join and manipulate them.  That works, but it means you have to download all the data before you ever get to LINQ.  What would be better is if you could use LINQ-to-SDS. 🙂
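
Something like this is what I picture for the client-side join.  The types and property names are mine, not from James's demo, and both lists are assumed to have already been downloaded:

    using System;
    using System.Linq;

    class JoinDemo
    {
        static void Main()
        {
            // Stand-ins for the two downloaded containers.
            var photos = new[]
            {
                new { EntityId = "p1", FileName = "sunset.jpg" },
                new { EntityId = "p2", FileName = "harbor.jpg" }
            };

            var metadata = new[]
            {
                new { PhotoId = "p1", Caption = "Sunset over Fullerton" },
                new { PhotoId = "p2", Caption = "San Diego harbor" }
            };

            // Plain LINQ to Objects join on the "foreign key" once everything is local.
            var joined = from p in photos
                         join m in metadata on p.EntityId equals m.PhotoId
                         select new { p.FileName, m.Caption };

            foreach (var item in joined)
                Console.WriteLine("{0}: {1}", item.FileName, item.Caption);
        }
    }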

It looks like there’s no official LINQ-to-SDS, however, if you check out the PhluffyFotos example there is a LINQ-to-SDS provider.  I’ll have to look at this.  In a previous session Bret Stateham showed how to embed lambda expressions in the REST query, which I’m told is called Dynamic Language Invocation.  Definitely worth checking into.

 

What is the Microsoft Cloud Platform?

Woody Pewitt is giving this talk.  It's another intro talk for the rest of the Cloud topics throughout the weekend.

Honestly, Cloud computing as a concept, rather than any particular vendor's platform, is a game-changing shift.  Much like hardware virtualization is a game-changing concept for data centers and development/testing (a whole other topic), the Cloud concept is changing the way the web works from a hosting point of view.

In large corporate environments you have network managers, server administrators and DBAs who handle those resources at a fundamental level, but for the most part developers still have to dabble in these areas.  That's a great thing, because I'm still in the camp that developers should understand the platforms they depend on, be it IIS, Windows Server 2008, .Net 3.5, SQL 2008, etc.  However, once I am intimately familiar with my platforms, I don't want to have to spend 50% of my time provisioning servers, installing patches and service packs, doing routine optimization, hardening against intrusion, replacing drives or power supplies, migrating to new servers, etc.  The Cloud allows me as a developer to offload all this work to a third party.

OK, so that's no big deal.  I do that now with inexpensive hosting such as GoDaddy, colocation of my own servers, or my own data center managed by server admins.  What's so new about the Cloud?  The great thing is that the services I use (hosting, data access, etc.) are now abstracted.  If I host my site on GoDaddy or any of the other options above, I still have to interact with my web hosting software and databases as if I were directly connected to them.  I still have to work with the mindset that my web app is hosted on a single server, connecting to a single database at a single datacenter.

What the Cloud does is abstract all of this.  I can code my app to have a lightweight UI that passes requests to small chunks of service code which hit an abstract data store.  This allows my Cloud vendor to not only store all these parts in different locations but also manage multiple instances of them.  If my app starts getting a large number of hits and the data store is slowing down, they can replicate the data across multiple hosts.  If my UI starts to get sluggish, they can host the front end across multiple web hosts.  This allows my app to scale almost indefinitely without my direct involvement, and not only in the number of running instances, but geographically as well.

Let's say I have an airline ticketing company that services flights from the USA to Europe.  In the off season it may receive fairly light traffic, but come Christmas time it may get thousands of hits per hour.  Not only that, half my customers may be in the USA and the other half in Europe.  Allowing my application to scale out to multiple instances and multiple geographic locations, automatically and without my direct involvement, is absolutely amazing.

This is where web hosting is going.  In 10 years almost no one will be hosting on individual web servers except in specialized situations.

OK, off the starry-eyed platform.  Back to the talk.

Woody, and Lynn Langit who is in the audience, just mentioned the sign-up process.  It's a nightmare!  Even some of the internal MS people don't have keys.  I think with all the talent at MS they could have designed and built the whole sign-up and beta key process much better.  It's terrible, to put it lightly.  The forums are filled with disgruntled users who can't log in and have no idea what state their application is in.  This is one of those examples where a little more time and care up front would have saved hours and hours of support calls/emails/postings and many upset and confused users.  OK, that being said, check out Lynn's post on this very subject.

Microsoft is not publishing any dates whatsoever; however, Woody said there is another PDC planned that will announce the RTM availability of Azure.  His guess, and that's all it is, is that Azure will not be ready for commercial release for another year.

Woody was putting forth the idea that this really abstracts and commoditizes the data center.  These are things that web app designers (large and small) will not have to worry about.  In fact, Microsoft really is commoditizing its own data centers.  Check out Woody's post here showing a great little video on how they are doing this.  The grand plan is to literally order a "truck", which is a fully self-contained datacenter with just three "plugs": power, internet and A/C.  These trucks will be sent to a warehouse Microsoft has built for this capacity.  These centers can be quickly deployed to any area that Microsoft's research has deemed a high-volume area.  If a trailer starts to sense that it's at a specified failure load, like 20%, it will be decommissioned, degaussed (as it leaves the facility) and replaced by a brand new truck.  Pretty amazing!

LINQ 101

Daniel Egan is giving this talk.  He was one of the ones in charge of setting up the Code Camp schedule.  He noticed that while there were several LINQ topics there were no intro topics so he decided to give this one.

Daniel is doing a great job; however, it's a 3-hour talk condensed into one, so he's having to skip a few things.  Couple that with projector problems and we're really short on time.  We're going over basic ORM concepts and new properties in VS 2008.  I don't know that he's actually going to get to LINQ! 🙂

I saw a talk about a year ago at the Microsoft Irvine offices where the presenter went through about an hour and a half of seemingly disconnected VS 2008 features.  Then, when he started digging into how LINQ really works under the covers, it suddenly all made sense why those features made it into VS 2008.  I think Daniel is following the same arc, but as I said, I don't think there is a lot of time.  Man, I wish I remembered who that presenter was.  It was one of the best presentations I've seen.

Daniel did mention automatic properties.  He doesn't like them either! 🙂 They are fun, quick and do enhance readability; however, the instant you need to add checks or some business logic you have to rip the auto property apart.  So it doesn't really do you much good.  ReSharper sort of tries to help, but not really (maybe I just don't know the right command).  If ReSharper could take an auto property and change it to a fully fleshed-out get and set, that would be great.
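
A quick illustration of what he means (my own example, not Daniel's):

    using System;

    // Before: the auto property.  Quick and readable, but there's nothing to hook into.
    public class CustomerBefore
    {
        public string Name { get; set; }
    }

    // After: the same property once a simple check is needed.  The auto property
    // has to be ripped apart into a backing field plus a full get/set.
    public class CustomerAfter
    {
        private string _name;

        public string Name
        {
            get { return _name; }
            set
            {
                if (string.IsNullOrEmpty(value))
                    throw new ArgumentException("Name is required.");
                _name = value;
            }
        }
    }

    class Demo
    {
        static void Main()
        {
            var c = new CustomerAfter();
            c.Name = "Inland Empire .Net UG";
        }
    }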

He's going over extension methods now.  This is one of the most useful enhancements in .Net 3.5, in my opinion.  It can create a huge amount of dependent code and a big mess if not used properly.  However, if a corporation has an internal shared library of extension methods, that can be great.  If you are working on open source projects then you have to make sure the extension methods are included in the build.  I think there will eventually be a lot of "common" extension libraries, much like Apache Commons for Java, on CodePlex and other sources for people to download and use.  Then everyone can make use of common extensions that are freely available, which helps everyone code consistently.
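
Here's the kind of thing I mean by a shared extension library.  This particular helper is my own made-up example, but once it's in a referenced assembly and the namespace is imported, it shows up on every string as if it were built in:

    using System;

    public static class StringExtensions
    {
        // "this string value" is what makes it an extension method.
        public static bool IsNullOrJustWhiteSpace(this string value)
        {
            return value == null || value.Trim().Length == 0;
        }
    }

    class Demo
    {
        static void Main()
        {
            string comment = "   ";
            Console.WriteLine(comment.IsNullOrJustWhiteSpace()); // True
        }
    }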

Daniel doesn’t like var!  :)  There are definitely mixed feelings about this.  I haven’t used it enough to weigh in on one side or the other.  It’s just interesting to hear different reactions.
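
For anyone who hasn't seen it yet, the two declarations below compile to exactly the same thing; var is compile-time type inference, not a VB-style Variant, so the argument really is just about readability:

    using System.Collections.Generic;

    class VarDemo
    {
        static void Main()
        {
            // The compiler infers the same type either way.
            var inferred = new Dictionary<int, List<string>>();
            Dictionary<int, List<string>> explicitlyTyped = new Dictionary<int, List<string>>();

            inferred.Add(1, new List<string> { "first" });
            explicitlyTyped.Add(1, new List<string> { "first" });
        }
    }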

Unfortunately that's all the time I have; I need to run to the next session.  Daniel is going to try to squeeze in every minute he can, but I have to be in another building by 10am.  He's not finished yet and hasn't gotten to actual LINQ, but he did cover some of the foundation behind the scenes.  In-depth LINQ will be covered by many of the other sessions.

I'm off to "What is the Microsoft Cloud Platform?" by Woody Pewitt.

The Beginnings of Code Camp 2009

I'm at Code Camp 2009 at Cal State Fullerton this weekend.  If you don't know what this is and have the weekend free, drop by.  There are a lot of great sessions from very talented speakers.  Chances are there are at least a few (if not dozens) of topics that you'll want to know more about.

There are too many sessions that I want to attend, so I'm trying to keep this a little more real world.  Since I'm starting a project that is going to use Microsoft Windows Azure and SDS, I'm attending a few of those.  Also, we've just installed SQL 2008 at my work, so I really want to check out the BI and SSIS portions.

Here’s the schedule I’m following:

Saturday – Jan 24

  • 8:45 AM – LINQ 101 by Daniel Egan (Room: H 123)
  • 10:00 AM – What is the Microsoft Cloud Platform? by Woody Pewitt (Room: UH 339)
  • 11:15 AM – Azure Cloud Application Model by David Pallmann (Room: UH 339)
  • 1:15 PM – Real World SQL Data Services by James Johnson (Room: UH 335)
  • 2:30 PM – What’s new in SQL Server 2008 Analysis Services by Lynn Langit (Room: UH 335)

Sunday – Jan 25

  • 9:00 AM – A Journey Through SQL Server Analysis Services – From Planning to Implementation by Ben Aminnia (Room: UH 335)
  • 10:15 AM – SQL Server Integration Services Control Flows by Bret Stateham (Room: H 110)
  • 11:30 AM – Asterisk The Open Source PBX by Mike Roth (Room: UH 250)
  • 1:30 PM – Active Directory Programming for Developers: Level 1.5 for .Net 3.5 by Steve Evans (Room: UH 339)
  • 2:45 PM – Active Directory Programming for Developers: Level 2.0 by Steve Evans (Room: UH 339)

 
