Alternating Ordered List Item Styles

We have a Policies and Procedures document that we are making available on our website.  In its entirety it easily spans several hundred pages and has a pretty lengthy table of contents.

To make this easier to navigate and consume, we are posting several of the larger topics as individual documents and making them available from a linked table of contents built as a web page.

The requirement is that the table of contents use a different numbering style at each level of indentation.  So, for instance, the outline below:

Topic A

   Sub-topic A.1

   Sub-topic A.2

      Sub-topic A.2.1

      Sub-topic A.2.2

Topic B

Topic C

would become:

1. Topic A

   a. Sub-topic A.1

   b. Sub-topic A.2

      i. Sub-topic A.2.1

      ii. Sub-topic A.2.2

2. Topic B

3. Topic C

This document isn’t maintained in a database yet, so for now the table of contents is created in good old-fashioned HTML.  Using standard nested <ol> (ordered list) elements, it initially renders like this:

  1. Topic A
    1. Sub-topic A.1
    2. Sub-topic A.2
      1. Sub-topic A.2.1
      2. Sub-topic A.2.2
  2. Topic B
  3. Topic C

The way to change the numbering style is with the CSS list-style-type property on the <ol> element, which accepts values such as decimal, lower-alpha and lower-roman.  You can find a great example of this at w3schools.  So, I could manually set the list style of each <ol> tag based on how deeply it was nested, as shown below, but this would be a pain to maintain.  Any time we changed the layout I would have to update the list-style-type values and be sure to keep everything consistent.  I thought this would be a perfect place to use jQuery.
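
For illustration, the manual approach looks something like this simplified sketch of the table of contents markup, with the style hard-coded on each <ol> (the real document has far more entries):

  <ol style="list-style-type: decimal;">
    <li>Topic A
      <ol style="list-style-type: lower-alpha;">
        <li>Sub-topic A.1</li>
        <li>Sub-topic A.2
          <ol style="list-style-type: lower-roman;">
            <li>Sub-topic A.2.1</li>
            <li>Sub-topic A.2.2</li>
          </ol>
        </li>
      </ol>
    </li>
    <li>Topic B</li>
    <li>Topic C</li>
  </ol>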

Using jQuery I was able to define an array of the list styles I wanted to use.  jQuery then finds the top-level <ol> and every nested <ol> and applies the appropriate style for each level of indentation.  Once that is in place I can simply maintain the table of contents and jQuery handles the styling.  I couldn’t find any jQuery plug-ins that did this, so here is the code I came up with.  If I end up using it more than a couple of times I’ll probably turn it into a plug-in.

function styleOutline() {
    // List-style-types we would like to cycle through, one per nesting level
    var olStyles = ['decimal', 'lower-alpha', 'lower-roman'];

    // Process the top-level list.
    styleOl($('ol:first'), 0);

    function styleOl(element, styleIndex) {
        // Apply the style for this nesting level, wrapping around if we run out
        element.css('list-style-type', olStyles[styleIndex % olStyles.length]);

        // Recurse into any <ol> nested inside this list's items
        element.children().each(function () {
            var ol = $(this).children('ol');
            if (ol.length) {
                styleOl(ol, styleIndex + 1);
            }
        });
    }
}
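
With that in place, the function just needs to be called once the page has loaded, something like this (assuming, as the selector above does, that the table of contents is the first <ol> on the page):

  $(document).ready(function () {
      // Apply the alternating list styles to the table of contents
      styleOutline();
  });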

This works great and only took a couple of minutes to put together.  It is definitely an improvement over maintaining the styles by hand.  I hope this helps someone else out there!

Hacking your vendor’s product: sometimes good, sometimes not

We have a particular web product that we use for all of our clients.  It is a pretty incredible system, and the vendor has several hundred clients of their own.  We are one of their largest, so between our implementation and their normal customer needs it’s understandable why they cannot accommodate every one of our requests right away.

That being said, there was one “feature” that went slightly against one of our policies.  So, what to do?  Well, I put on my curiously tipped white hat and went to work.  One of the cool features of the system is that it allows us, the admins for our group for all intents and purposes, to update portions of the page for our users.  They don’t escape HTML, so it turns out I’m able to inject my own JavaScript code.

Ooohhh, that’s bad.  😉 I let the company know, but I proceeded on.

Turns out they, being the creative bunch they are, make use of jQuery.  This is definitely turning into a possibility with my toolbox already stocked and ready to go.

The particular portion of the page I chose to launch my JavaScript from is ideal because it is on the navigation bar, ensuring that it will be displayed on almost every screen.  Because we can only change content for our own users, my changes will affect only them and not the vendor’s other clients.  However, with 5,000 users I had better make sure my code is well tested and clean; one bad mistake could still disrupt every one of them.  Point noted.

The vendor puts a character limit on the particular area of the page I’m changing.  That means I have to inject a small stub of JavaScript that tells the browser to load a larger script from elsewhere.  No big deal; I put the script on our department website.  Hmmm, not so good.  The system is now throwing a warning that I’m loading JavaScript from a non-secure source.  Sure would be nice if I could load the script directly from my vendor’s server.

Well, I can. 🙂  The vendor also allows us to upload documents to a library that our users can download from.  So, I upload the script.  Uh oh, that didn’t work.  Turns out they block all but a few file extensions.  So, instead of calling it myscript.js I rename it to a text file called myscript.txt.  That uploads fine and, guess what, the browser is happy to execute it anyway.  Great!  On my way.

After a few tests it turns out I’m able to make our little “issue” a thing of the past quite nicely.  The script first checks that the page in question is the one being viewed, so the bulk of the code (all six lines of it) runs only when necessary.  Then it cleans up after itself by removing my injected code from the page DOM.  Nice!
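
I won’t reproduce the actual six lines here, but the general shape was something like the sketch below; the path check, element IDs and the “fix” itself are hypothetical placeholders rather than the vendor’s real markup:

  $(document).ready(function () {
      // Only run on the one page we need to change (hypothetical path)
      if (window.location.pathname.indexOf('/reports/summary') === -1) {
          return;
      }

      // Hide the "feature" that conflicts with our policy (hypothetical ID)
      $('#offending-report-link').hide();

      // Clean up: remove the injected loader element from the DOM
      $('#injected-fix').remove();
  });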

It’s not foolproof, of course.  If the user isn’t running JavaScript then my trick won’t work; however, half the system won’t work for them anyway since JavaScript is required.  Also, someone sneaky enough could view the page source and see where I injected my code.  But if they are going to do that, we have bigger problems than them simply disabling my script.

OK, that covers the techie issues; what about the people ones?  Well, as you may have guessed, no matter how many times we alerted our users to the changes we made, some users started calling the vendor’s Help Desk asking what happened to the report they were used to.  The vendor then called me after spending two days trying to track down an issue they couldn’t see.  They didn’t really care for the changes.  They understood the necessity and gave me props for how I handled it, but now their Help Desk had no idea whether a problem call was caused by their system or by my code.  Granted, that totally makes sense.

So, they gave us a choice: we could continue to develop our own “customizations”, but we would have to take over all Help Desk calls for our users for the entire system.  I understand that, but directly supporting 5,000 users is not something we are capable of doing.  I should also note that their Help Desk is absolutely outstanding, one of the best I have ever worked with.

In the end, the vendor agreed to look into implementing our change, and we are allowed to keep the code in place as long as we support the Help Desk on any questions regarding this one issue.  Once the vendor has solved the problem we will remove our code and go on our way.

So, here is the moral of the story: hacking your vendor’s site opens doors to lots and lots of capabilities; however, it may aggravate your vendor a little or, quite possibly, lead to an early termination of your contract.  I suggest you keep this little tool in your developer’s pocket for extreme cases.

It was fun, and our vendor is great.  Once I put the change in place we started coming up with lists of additional “nice to have” features I could implement, but in the end we’ll just hand those over to our vendor and see if they make it in someday.

Windows Phone 7, on its way!

We had a great time at the Inland Empire launch of the new Windows Phone 7.  I got to present on developing for the phone along with Dustin Davis and Oscar Azmitia.  James Johnson and the Inland Empire .Net Users Group hosted the event at DeVry University.  Fun was had by all.

I recorded the sessions so I will have them up soon for all to see how we did.

Take care and happy developing!

Why we switched from WebEx to Adobe Connect

We had been using WebEx at my work to hold online meetings.  We are very small scale and only use it once or twice a month, but when we do it really comes in handy.  Once a month or so we have a user group meeting with our 22 districts across Riverside County.  If all our attendees came in person, some of them would be driving over 4 hours roundtrip for a 2-hour meeting.  Not fun, and neither economically nor environmentally friendly.  Online meeting software was definitely a requirement here.

WebEx was what I was used to, and I hadn’t seen anything better from GoToMeeting or the others, so we signed up with them.  It works, and we didn’t have any problems with it; it’s just a little sterile and complicated for non-techie users.  Plus, our vendor never returned our emails regarding renewing our contract.  So much for that.  We can’t even give them our money. 😐

One of my co-workers teaches an online class for USC.  They use an online teaching tool that makes use of Adobe Connect.  He really liked it so he told me to check it out since our WebEx contract was coming up for renewal.

After playing with it I really began to like it.  It offers many of the same features as WebEx, but it is just more fluid and has a much nicer UI.  It is Flash-based, so it avoids many of the ActiveX and browser plug-in issues that WebEx has, and since Flash is installed on pretty much any computer it works out of the box.  Also, Adobe Connect allows us to let guests enter a meeting just by opening the URL.  This is really nice, as users had to enter a password with WebEx.  Password-protected meetings are just one extra step to trip up non-techie users and completely unnecessary in our case.

Also, we can set a permanent URL for a meeting we hold regularly.  This allows users to bookmark the meeting or bring it up from an old email.  With WebEx we always pointed users to our general WebEx page.  We usually only had one meeting on the calendar, so it wasn’t that confusing; however, for larger organizations it can be a bother to sort through all the scheduled meetings.  Having a permanent URL that connects you straight to the meeting is a nice touch.

And guess what?  It’s cheaper for our account as well.  WebEx basic pricing runs about $60 a month for an annual subscription, which is limited to 25 concurrent attendees.  They will let more than 25 attendees connect but you will be charged for the overage.  Adobe Connect basic pricing runs about $45 a month for an annual subscription and allows up to 100 concurrent attendees.

All in all, Adobe Connect is friendlier, easier to set up and less expensive.  Definitely a win for our needs.

If you are considering online meeting software definitely give Adobe Connect a look.  Also, if you do go with Adobe Connect, consider MeetingOne as a vendor rather than contracting directly with Adobe.  Check out my previous blog post about how their personal customer service really saved our day while also getting us a cheaper rate.

Praise for MeetingOne – Adobe Connect vendor

In today’s Internet world, where everything is automatic and at the tip of my finger, the need (and room) for personal service quickly disappears.  As a fast-paced individual myself, I actually lean more toward automated solutions.  Often I can find an answer, directions, a phone number, or a solution to a development or IT problem in a matter of seconds on the Internet rather than asking someone.  It’s because of this, and having had to deal with pushy vendors in the past, that I have become largely anti-salesman.  If I can purchase something online with a credit card number and a few clicks of a mouse, I’m much more inclined to go the non-human route than to get someone on the phone.  I’m somewhat introverted, so that may also be part of the issue. 🙂

However, having been a retail software salesman in one of my many past careers, having worked as a personal technical consultant, and having dealt with many great small businesses, I know that the personal touch is still very useful and worth the money.

In this particular case it saved my day.

We recently switched from WebEx to Adobe Connect for our online meeting software at my work.  You can read my post about why we switched here.  Mark Stevenson of MeetingOne, an authorized Adobe Connect vendor for government contracts, contacted me.  He was very nice on the phone and answered all of my questions.  However, I was still inclined to thank him, hang up and sign us up online through the Adobe web page.  Mark was able to get us a small discount, so I gave him the benefit of the doubt.

We are a very small contract.  I am the only staff member who uses online meeting software, and I only use it 1.5 times a month on average; yes, only 1.5 times.  He had almost no incentive to give us a deal or even continue talking with me.  Honestly, if this were a Fortune 500 company (MeetingOne very well could be; they just don’t give off that impersonal vibe, a compliment I assure you :)), his sales lead analysis tool would probably have already told him to hang up, as it wasn’t even worth his time to keep talking with me on the phone.

I had previously signed up online for an Adobe Connect trial account while we were researching various products.  We had a meeting today but, as we are still in the middle of our purchasing process, we were forced to use our trial account.  I didn’t think too much of it, as there is no indication that trial accounts are limited in the number of connected attendees.  However, sure enough, after 5 attendees connected to the meeting the phone started ringing off the hook.  Users were getting turned away with an error message that the account had exceeded its usage.  Ouch!  What to do?

I called up Mark right away and he saved the day.  After talking over the issue and our meeting needs he set up a meeting for us under the MeetingOne account.  I sent out the new link to all our users and within minutes we were up and running full steam.  He even stayed in the meeting while he was working and assisted with a couple of technical issues we had while switching between presenters.

I have a very large suspicion that we wouldn’t have received this type of support had we contracted directly with Adobe.

So here is a BIG THANKS to Mark and MeetingOne.  He has helped change my attitude towards vendors.  The good ones are still out there, it’s just hard to find them sometimes.  Fortunately Mark found us. 🙂

If you are in the market for online meeting software, consider Adobe Connect.  If you do, please consider MeetingOne.  They are one of the good guys and will definitely help you out.

Apple releases its MacPaint source code to the Computer History Museum

This was a fun article I just ran across at Business Week:

http://www.businessweek.com/technology/ByteOfTheApple/blog/archives/2010/07/apple_donates_macpaint_source_code_to_computer_history_museum.html

I used MacPaint on the Mac Plus, and I would definitely call it revolutionary.  Yes, the Amiga and several others came out with competing or better products; however, for me, new to the Mac world and coming from the IBM PC clone world, this was amazing.

And, in contrast to one commenter who said it was a neat demo and nothing more, we used it in our desktop publishing all the time.  We could finally create and manipulate graphics and logos for newsletters, business cards, etc.  For a small business it really made us stand out.  Hardly anyone in the mid-’80s had this capability so cheaply; it usually required a large print house with expensive machines.  Now we could put a real professional touch on customer documents.  Black-and-white art made up probably 90% of print output at the time, so a color-capable utility didn’t offer much when it came to hard copy; MacPaint really allowed us to push the boundaries.  With a Mac SE/30 and an Apple LaserWriter II we were producing high-quality professional documents for clients for less than $7,000 in the late ’80s, which at the time was simply amazing.

I’m sure others out there could have done something similar with Amigas, PCs, etc, but for us this was a game changer. 🙂

I love seeing recaps on older history like this, for any company, not just Apple.

A Great Resource for Different Strategies on Concatenating SQL Results

In every DBA’s career, I think, having to concatenate query results into a single value comes up at least a few times.  Probably more often than we like to admit, because we tend to live in table-land.  :)  Those occasions do arise, though, usually driven by some downstream requirement to format output.  Now, I know that formatting should be handled by whatever data-viewing layer you are using, but sometimes that just isn’t possible or practical.  Other times we may simply need to transform data from one system to another, and the other system is not as normalized as the tables we are working with.

Like I said, I do it fairly infrequently, so I never remember the best way off the top of my head.  I usually end up looking at how I’ve done it in the past.  I started thinking that there may be better ways than some of the convoluted strategies I’ve found in previous solutions.

Trusty Google sent me here:

http://www.projectdmx.com/tsql/rowconcatenate.aspx

It’s an incredible (though certainly not exhaustive) list of ways to deal with this depending on your need.  I like XML and wanted to keep things simple, so for this particular task I went with Eugene Kogan’s “blackbox XML” method.  It’s only a few lines, and if you are familiar with XML and SQL it’s not that hard to understand.
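
For reference, the general shape of that method looks something like the sketch below; the Categories and Products tables and their columns are made up purely for illustration:

  -- Build a comma-separated list of product names for each category
  SELECT c.CategoryName,
         STUFF((SELECT ', ' + p.ProductName
                FROM Products AS p
                WHERE p.CategoryId = c.CategoryId
                ORDER BY p.ProductName
                FOR XML PATH('')), 1, 2, '') AS ProductList
  FROM Categories AS c;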

I’ve definitely bookmarked this for later reference!

We’re all still kids, just with bigger toys

I just saw this article:

http://www.tomshardware.com/news/SFF-RAID-HDD,10319.html

It’s an amazing feat by Will Urbina.  He has a great tool shop and the knowledge to use it.  He custom-built his own small-form-factor case that holds eight 2 TB drives for a total of 14 TB of RAID storage.  Pretty amazing.

I would count this as a great version 1.0 product.  The two changes I would make for v2.0 would be:

1) Accessibility: I would mount each drive in a removable tray. Changing the drives out is going to be a bear, and removable trays would not only make it a snap but also allow for hot swapping, a real help when dealing with RAID failures. If you can get trays with built-in heat sinks for the drives, that leads nicely into my next recommendation.

2) Heat: The heat issue should be easy to solve by putting heat sinks on the drives (or using heat-sink trays as mentioned above). Then simply seal the box, put vents on the front left side, and channel the air from the front left, across the drives to the right, and out through the back. The drives will need more spacing to accommodate the trays and airflow, but maybe you could switch to 2.5-inch drives. This will definitely take some engineering, especially to work around removable drive trays, but proper sealing and a good fan in the back would give good airflow. You may need to increase fan speed or add additional fans to the front intake.

 

Awesome idea.  I have a few tools in my tool shop, some of which were gifts, others bought new for projects, and others picked up on Craigslist or at garage sales.  With only a two-car garage and 3 young kids I have no workshop. :)  But I have the dream of slowly adding to my tools each year and someday building a sizable work shed in the back.  I would love to be one of those dads who has tools and builds things with his kids.  I always envied my friends who had that, with their dad’s shop at their disposal for inspired ideas or school projects.

Ideas like this keep me hoping that this will be a reality someday.

Great job Will!

Spell-check, suggest as you type, etc – Are we efficient or just lazy?

I’ve been using spell-check since the day it became available years ago on my first computer and word processor, a Victor 9000 running MultiMate.  That was back in 6th grade for me, and it continued through my academic career.  I firmly believe that my inability to spell complex words off the top of my head is directly related to the fact that I didn’t have to know how to spell all the time.  I used a computer for that.

Technology progressed from there, and soon Microsoft had Word correcting misspelled words as we typed.  Soon it was offering grammar suggestions too (although not very well in the beginning).  Then it started correcting my tenses automatically, along with capitalization, fixing my case and turning off Caps Lock if I suddenly started typing a sentence like “yESTERDAY I WENT TO THE STORE.”

Now, sites like Google actually “guess” at what I’m trying to say, making it so I don’t even have to type my entire thought.

It’s all about saving time and making us more efficient, but are we just being lazy?

Honestly, I love how far we have come.  Non-audio communication (in my experience anyway) has always been drastically slower than audio-based communication.  I can put forth a concept talking with someone a hundred times faster than if I were to write it in an email or in a memo by hand.

In school, before computers were mainstream, I handed in all my written homework done on a computer.  In fact, if I was given a homework paper to fill out, I would either duplicate it on a computer or type it out on a typewriter.  I wasn’t being neat; I hated handwriting.  It was just too slow!

So, rather than lazy I’d call myself impatient.  Do I take this for granted?  I think I actually take advantage of it.  It’s not a crutch but a feature.  I know Word will correct my capitalization.  I know my iPhone will add a period and turn my Shift key on when I enter a double space, signifying the end of one sentence and the start of another.  So, I simply don’t do those things anymore.  It’s actually funny watching me type in a plain vanilla word processor like Notepad or an online web form, because I see how much I have come to rely on the built-in optimizations.  I actually anticipate and take advantage of the fact that my typing is being corrected for me.  Why do I need to bother holding down the Shift key at the beginning of every single sentence?  I know the software will do it for me, so that’s one less key I have to hit every time.  Why do I have to type apostrophes in words like won’t, you’re, or I’ll?  There is no other word these could possibly be, so my iPhone puts them in for me.  This is a speed-boost feature, not a tool for inept typists.

In other areas of life we don’t even think about this.  For instance, in the development world we have development environments that allow us to write lines of code with only a few keystrokes.  The software makes suggestions as we type and even makes recommendations on how to clean up our code.  This isn’t considered lazy; it’s considered a feature, because it not only makes us faster programmers but helps us write more consistent and higher-quality software.  This is an investment by our employers.  They don’t want to pay us to type mundane lines of code when we don’t have to.  They’d rather spend an extra few hundred dollars on our tools and pay us to think, get their product to market, and start selling it faster with a higher rate of quality.  They don’t think of it as giving us tools to make us lazy but as getting a better return on their investment.

Enter the iPhone and its text input system.  Since the keyboard is entirely touch screen and can be a little small, it is quite common that you’ll hit a letter adjacent to the one you meant to type.  So, what does Apple do to help?  Every word you type is checked against a dictionary, and if it doesn’t recognize the word, it attempts to find a match using the letters adjacent to the ones you typed.  So, for instance, if I accidentally type “hekko” it might suggest “hello” as an alternative.

Other neat features are along the lines of what I mentioned above.  Since the screen has limited space, unlike a full-sized keyboard on a computer, they attempt to maximize the space they have and minimize the amount of context switching.  By context switching I mean changing from letters to numbers or symbols, typing punctuation, using international characters, and so on.  For instance, if I am filling out my email address they put the @ sign as one of the keys on the main screen.  I have to use an @ sign every single time I type an email address, so why not put it on the main keyboard with the rest of my letters when entering email addresses?  When I am typing in a web address they have a “.com” button.  What if I want to go to a .net or .org address instead?  If I hold down the .com button, after a brief pause it will open up and allow me to drag to any number of common suffixes, such as .org, .net, .gov, etc.  There are tons more, but you get the point.

I never realized how awesome these little changes could be until I got my own iPhone.  I type on it all the time and they are lifesavers.  What the iPhone doesn’t do (as users and iPhone opponents have loudly criticized) is offer a spell check.  I completely agree and expect Apple to remedy this shortly.  Why a device that can predict what I meant to say, take video, help me find where I last parked, count my food intake, suggest movies near me, etc. can’t even offer to spell-check my words in this day and age is beyond me.  But that’s another story.

Now, if you’ve seen the recent Samsung commercials, there is a new texting technology on their Omnia phones called Swype.  It allows you to type simply by dragging your finger rather than physically pushing down and then lifting your finger on each button.  Is this faster?  Well, that all depends on your typing style and comfort, but I could see this being a game changer for those who like it.

In the end, what’s best?  Well, that’s all relative, but for my money it seems like we have a lot of good ideas, all going in opposite directions.

Why can’t I have a Swype input that suggests as I type, corrects my spelling if I hit an adjacent letter, and fixes common misspellings, incorrect tenses, and bad pluralization?  That would be the best.  Combine Swype with the iPhone and Microsoft and I’d be set.  If I am on a standard computer with a full-sized keyboard, some of the options like Swype no longer make sense, but I still like the double space that converts to a period and turns my Shift key on, among many others.

That’s where my money is.  Hopefully it won’t take too long.  I’m sure there will be patent wars, but in the end hopefully we users will get the benefit of all these typing optimizations working together.

Building a .Net 3.5 Web App on Windows 2000 Server with only .Net 2.0

I am upgrading an older web app of ours, as I mentioned in my last blog post.  This was originally a straight HTML app with no dynamic content at all.  I created a .Net ASPX web app out of it and used LINQ to quickly and easily create a survey form that our users could fill out.  It worked great on my machine.

Unfortunately, the happy ending got derailed when I deployed it to our web server, an ancient Windows 2000 Server box with IIS 5.  This is because it’s where all our main apps are housed; everything works, and there is great fear of changing it.  <sigh>

So, I either had to figure out how to get my .Net 3.5 app running on IIS 5 with .Net 2.0, or I had to abandon LINQ and go back to data readers (yuck!).  I first tried to install .Net 3.5 on the web server but quickly found out that it requires Windows XP or Server 2003 as a minimum.  OK, so that’s ruled out.

I knew that the ASP.NET runtime has always been 2.0 (until the new 4.0 release, that is) and that .Net 3.0 and 3.5 just added extra assemblies on top of 2.0 but never changed the underlying base classes.  So you can run .Net 3.5 apps on a .Net 2.0 web server.  In fact, this has caused a lot of confusion, because there simply is no 3.0 or 3.5 selection in IIS for the .Net framework version.

I knew that if I could just reference the required .Net 3.5 DLLs then this should work.  A quick search on Google led me to this great article.  I was wondering if something like this was possible and, sure enough, it pointed me in the right direction.

 

Here is what I did and it worked like a charm.

I first set the build target for the web app in Visual Studio 2008 to .Net 2.0.  This caused VS 2008 to instantly remove any references that aren’t compatible with .Net 2.0, such as the LINQ assemblies.  I did a build and received numerous errors, most pertaining to my code that made use of LINQ.

I copied the System.Core and System.Data.Linq DLLs into my web app’s bin folder and referenced them.  After another attempt to build the solution, the LINQ errors went away, but the compiler still didn’t understand my lambda expressions or my auto-properties.  This makes perfect sense: these are compiler features, not referenced code, and since ASP.NET compiles on the server by default, the server’s compiler had better understand them.  I could change the auto-properties back to normal properties, but there is no lambda equivalent in .Net 2.0.

So, I created a new project targeting .Net 3.5 and moved all the LINQ code into it.  Having my data access classes in a separate project felt much cleaner anyway and probably would have been an eventual refactoring later.  I removed that code from the web app and added a reference to the new project.

Ran a build and received the welcome success message.

I then deployed the web app to the web server.  Upon opening one of the new pages, which runs a LINQ query to obtain some data to populate a drop down list, I received the following error:
Could not load type 'System.ComponentModel.INotifyPropertyChanging' from assembly 'System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'

After some googling it turns out that INotifyPropertyChanging wasn’t introduced until .Net 2.0 SP1.  Sure enough, our web server had 2.0 but no service packs.

I installed .Net 2.0 SP2 and everything worked great!

 

I am in the process of redesigning our entire department’s website, and that is being built on MVC and several other current technologies.  I have another web server running Windows Server 2003 for that.  I might miss out on some of the newer IIS 7 features, but .Net 4 runs on it just fine, so at least this is a major step forward.

Take care!