ESRI 2008 UC: ArcGIS Server – The Developer's Perspective

All I can say is WOW!  I should have gone to this one instead of the Web Application Interface Design: The Design Process session.  This is what I was looking for.

Unfortunately I missed the first 20 minutes, but they showed some incredible access to GIS data and tools using REST and SOAP.

If you are a web app developer and deal with a lot of data, definitely check out REST.  It's a really neat strategy for manipulating data.  With .Net 3.5, Microsoft is building it into their data access model.  From my buddy James' presentation on it, it seems it's still a little immature, but it's getting there.

Anyway, when you publish a map or other resource to ArcGIS Server it is instantly available via REST and SOAP.

So, just think: if you need to get access to layer files or perform some sort of processing, you can just use a standard URL to grab it.
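For example, pulling a rendered map image out of a map service is just an HTTP GET.  Here's a minimal sketch in C#; I'm going from my session notes, so treat the URL pattern as an approximation, and the server/service names are made up:

```csharp
using System;
using System.Net;

class RestExportSketch
{
    static void Main()
    {
        // Hypothetical server ("gisserver") and service ("Parcels") names.
        // The 9.3 REST pattern, as I noted it down, is roughly:
        //   http://<server>/arcgis/rest/services/<service>/MapServer/<operation>
        string url = "http://gisserver/arcgis/rest/services/Parcels/MapServer/export"
                   + "?bbox=-117.3,33.8,-117.1,34.0"  // map extent to draw
                   + "&size=600,400"                  // output image size in pixels
                   + "&format=png"                    // image format
                   + "&f=image";                      // return the raw image bytes

        using (var client = new WebClient())
        {
            client.DownloadFile(url, "parcels.png");  // save the rendered map
        }

        Console.WriteLine("Saved parcels.png");
    }
}
```

Point that same URL at a browser or an img tag and you get the map back directly.  That's the appeal of REST: it's all just URLs.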

That’s about all I can say on REST because I only saw a bit of the last demo.  Then they switched over to SOAP.

ESRI has a longer investment in SOAP so, while it's a slightly older technology, it's a little more mature in its offerings.  ArcGIS Server exposes a few more abilities in its SOAP architecture, and with the built-in web reference features in Visual Studio it's very easy to add to a web app and use.
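I haven't actually tried this yet, but based on the demo the flow looks something like the sketch below: point Visual Studio's Add Web Reference dialog at the service's WSDL and call the generated proxy.  The class and operation names here are my best guesses from the session, so check them against the code VS actually generates:

```csharp
using System;

class SoapSketch
{
    static void Main()
    {
        // Add Web Reference pointed at something like:
        //   http://gisserver/arcgis/services/Parcels/MapServer?wsdl
        // VS generates a proxy class under whatever namespace you pick
        // ("ParcelsService" below is my own made-up name).
        var mapService = new ParcelsService.MapServerProxy();

        // Ask the service which map it serves by default.
        string mapName = mapService.GetDefaultMapName();
        Console.WriteLine("Default map: " + mapName);

        // From here the SOAP API is supposed to expose the richer calls
        // the presenters mentioned (server info, map export, etc.).
    }
}
```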

I haven't really jumped into SOAP so I can't comment that much on it.  I'm only now jumping on the bandwagon, and I've already dived into the JSON and REST side of things.  I'll probably delve into SOAP if I can't get ArcGIS Server's JavaScript API to do what I want.

I asked the presenters about security.  They said that if it's an intranet-based app then we can take full advantage of Active Directory.  The presenter seemed to indicate it was built into the Server and not really a function of the REST or SOAP interface at all.  Hopefully I'll find out more about this in my ArcGIS Server sessions.

Anyway, as far as public access goes, REST and SOAP can take advantage of token-based security.  It sounds like I should be able to roll that in with Microsoft's ASP.Net Membership framework.
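From what I gathered, getting a token is itself just an HTTP call.  A rough sketch, where the endpoint path, parameter names and credentials are all assumptions from my notes:

```csharp
using System;
using System.Net;

class TokenSketch
{
    static void Main()
    {
        // Hypothetical token endpoint and credentials -- verify against
        // your server's actual security configuration.
        string tokenUrl = "https://gisserver/arcgis/tokens"
                        + "?request=getToken"
                        + "&username=webuser"
                        + "&password=secret"
                        + "&expiration=60";  // minutes the token stays valid

        string token;
        using (var client = new WebClient())
        {
            token = client.DownloadString(tokenUrl).Trim();
        }

        // Append the token to subsequent REST calls.
        string mapUrl = "https://gisserver/arcgis/rest/services/Parcels/MapServer"
                      + "?f=json&token=" + token;
        Console.WriteLine(mapUrl);
    }
}
```

The nice part is that you could mint the token in server-side code after validating the user against the ASP.Net Membership provider, so the GIS credentials never leave your web tier.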

I don’t really relish opening ArcGIS Server directly to the public.  One of the demos used an ImageHandler.ashx handler that retrieved the image from the Image Server and streamed the result to the user.  This way the web app acted as a middle tier between the user and the server, thus allowing you to keep the server internal and manage access to the outside world.
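I didn't get the demo code, but the idea is simple enough to sketch my own take on it.  Host names and parameters below are hypothetical, and the actual .ashx file needs the usual WebHandler directive at the top:

```csharp
// <%@ WebHandler Language="C#" Class="ImageHandler" %>
using System.Net;
using System.Web;

public class ImageHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Only relay parameters you trust; never forward the raw query string.
        string bbox = context.Request.QueryString["bbox"];

        // The internal server stays hidden behind the web app.
        string internalUrl =
            "http://internal-gis/arcgis/rest/services/Imagery/MapServer/export"
            + "?bbox=" + HttpUtility.UrlEncode(bbox)
            + "&size=600,400&format=jpg&f=image";

        using (var client = new WebClient())
        {
            byte[] image = client.DownloadData(internalUrl);
            context.Response.ContentType = "image/jpeg";
            context.Response.BinaryWrite(image);  // stream the result to the user
        }
    }

    public bool IsReusable
    {
        get { return true; }
    }
}
```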

Can’t wait to play with it!


ESRI 2008 UC: Web Application Interface Design: The Design Process

Sorry, but this unfortunately was a real waste of time.  I only stayed 20 minutes and that’s because I was answering work email during the presentation.

The presenter had just a few slides, each with a single word such as Research, Development, or Production, and he explained what he does in these phases.  It was the standard fare, such as user interviews during the research phase, testing on equipment that models the user's environment during development, etc.  There were no easy bullet points to take away and no new knowledge at this point.  I didn't see anything to do with the UI, but then again, to be fair, I left early.  There really wasn't any content of note unless you were an absolute novice at developing a web app project for a client.

It may have gotten better after I left but I ended up jumping into the Working with GIS Services – The Developer’s View workshop.  Boy this is the one I should have been in!

ESRI 2008 UC: Advanced Geocoding

There are a few new features in geocoding in 9.3 but there are also several other fundamental tools I didn’t know about.  So I’ll just go over a few of the options that are pretty great.

I came into our GIS system two and a half years ago and it was handed to me the same way it was handed to the last guy: I got the software and tools and was shown the ESRI site with its tutorials.  What this means is there are a lot of fundamental practices and features that I simply don't know about.  It's sometimes hard to find a better way to work in GIS because you've invested time in learning the current process, and there is so much data online it's hard to pinpoint exactly what works for you.

Here are a few features that I will definitely take advantage of:

  • Composite Locators
    Composite Locators are simply an ordered list of locators.  For instance, we purchased parcel data from Riverside County.  We also have a street layer.  I can create a locator for each and then add both of these to a Composite Locator.  Composite locators allow me to tell ArcGIS to match against the parcel locator first and if a good enough match is not found then match against the street layer.
    When viewing the geocoded data a new field is added showing which locator was ultimately used to match against.  This is a good place to look if you start getting a lot of false positives.  If your match score threshold is too low for the parcel locator ArcGIS might be grabbing a “close enough” parcel when it could have gotten an exact match against the street layer.
    When using Reverse Geocoding the data in the tool tip is based on the current locator being used.

    • If you are using a street locator then as you move your mouse you will see the address numbers change based on the distance along the street.
    • If you are using a parcel locator there is no range.  You will see the address numbers jump from one parcel to the next as you move your mouse.
    • If you are using a composite locator the tooltip will show which locator it is using to get the data.  This is pretty sweet!

Composite Locators also come into play with the point feature classes for geocoding described below.

  • Matching Addresses Interactively
    When you have some addresses that couldn’t be matched (most likely due to incorrect address data or incomplete reference data) you have the option to match these interactively.  During this process ArcMap will show a dialog that lets you manually process each address individually.
    Ideally you would correct your source data (the input addresses) or update your reference data (the parcel or street data).  This way you could automate your geocoding process.  Manually matching addresses can be a time consuming process, and most likely you will have to do it each time you re-geocode your addresses.  Read on to find out how to permanently save your manually matched addresses.
    However, the interactive matching tool is great for pinpointing why an address cannot be located.
    ArcGIS 9.3 offers two new ways to match addresses.  You can now reverse geocode a point, effectively telling ArcGIS, “I know where the address lies on that street.”  This associates the address with that physical location along the street.  This is called Picked by Address (PA is the MatchType in the underlying data).
    Conversely, say you know where on the map the address is located, but it's not necessarily along a specific street (such as a new development where you do not have streets for the area yet).  You can specify a point on the map, and this is known as Picked by Point (PP).
    So, the big question: now that I've told ArcGIS where these addresses are, how do I make it remember this the next time I geocode?
    You can save these points into a feature class.  Then create a Locator based on this feature class (a point locator).  Then add this locator to your composite locator as mentioned above.  That way ArcGIS will always find exactly where these “manually matched” addresses are located.

Unfortunately there were no demos on actually creating a locator based on a polygon.  I’ll have to play around with this but it sounded like it would be fairly easy to implement.

I'll try and find some links to the MIT group that demoed during the plenary session.  This was outrageous.  They used 3D Analyst and literally created 3D objects for every single room on the MIT campus.  They then built a locator around these layers, allowing the user to locate a building and room as simply as they would enter an address.  Very nice!  The demo went way beyond that, but that's the extent of its locator use.  I would love to develop a 3D interactive map for helping users (and possibly EMS services) find exact locations on our campuses such as an office, a specific room, the library, etc.  However, I don't have a group of students working on their senior project mapping our campus.  🙂  That's an idea though!


ESRI 2008 UC: Plenary Session

The main presentation by Jack Dangermond was pretty good, as was last year's.  It's a good mix of what is new in ArcGIS 9.3, what users are doing in the field, what's coming on the horizon, and the overall impact of GIS on the world.

From a technical point of view here are just a few of the new features in 9.3.  There are way too many to list so take a look at the ESRI site if you want a comprehensive list.

  • Reverse Geocoding
    This has been a long time coming and a hot item on the request list.  This wouldn't warrant too much discussion, except that the implementation was really well done.  Typically (from my point of view) ESRI makes functional software, but wouldn't win any UI or productivity awards for most of its features.  The buttons or dialogs for typical tasks aren't always where you expect them and sometimes you have to drill down through 10 screens just to get to your data.
    However, reverse geocoding was a very easy-to-use tool, with a crosshair for your cursor and a small dot that would snap to the street network nearest your cursor.  Clicking would very quickly give you a geocoded address at that point.  If you clicked on an intersection, the result would be the intersection of those streets.
    These reverse-geocoded addresses can be saved as pushpins, which I believe can later be saved as features.  Very nice!
    • Oh, I just stepped into the Geocoding Intro technical workshop.  Now when you run a geocoding process, if there are any addresses that could not be geocoded, in addition to the standard options to fix this you can now use Pick Location.  This allows you to use the Reverse Geocoding tool to pinpoint exactly where on the map the address should be.  This is great as it might be difficult, if not impossible, to change the address in the source data.
  • KML Export
    As the rest of the world jumps online, ESRI, arguably the largest provider in the GIS arena, has been a little behind the game.  In the past I have had to resort to 3rd-party tools to export our map data to KML.  Invariably this also requires lots of massaging of the data afterwards before it is ready to be published on an online mapping service such as Google.
    You can see an example of our school district layers pushed to Google through KML here.
    Now ArcGIS will have native KML export built in.  When used in conjunction with ArcGIS Server and other tools, this will make offering your GIS data to online mapping systems a very easy process that frees up maintenance and always hits live data.  (See the KML sketch after this list.)
  • PDF Support
    For a while now you've been able to export GIS maps as PDF.  This is a great feature as ArcGIS Desktop will also export the text, which is completely searchable and selectable using Acrobat Reader.  I use this all the time when exporting maps of our district.  It's amazing when I have several hundred streets on a map, go to the Acrobat Reader search box, type in a street name and find it in an instant on the map.  This is really useful when other users download our maps and want to find where they live.  We have an online School Locator tool; however, having a map on your local machine is great for offline scenarios.
    However, other than this ability the PDF version of the map has still been fairly static.  ESRI has been working with Adobe to really exploit the abilities of Reader.  Now you can export a wealth of data to PDF.  This includes data frames, layers and feature attributes.  In the PDF hierarchy you can now see the individual data frames and layers.  When clicking on a feature you can get all the underlying data for that feature.  This is just like using the Info tool in ArcMap.  Also, the data can be georeferenced.  This allows a user to get X,Y coordinates from any area of the map.  There is no geocoding yet, but this is all pretty neat.
    This is pretty amazing because now you can get an incredible amount of information from a PDF while offline.  This isn't only useful for Internet-connected machines: as more and more users carry mobile devices that may not have a direct connection to an online GIS service, having a PDF with this info will be a great step forward, short of building an offline app.
  • Virtual Earth Integration
    They went through this area pretty fast so I didn't get all the details.  It seems that you can now pull VE services and resources directly into ArcGIS Desktop and use them in your own maps.  This means that you have full access to the imagery and data.  It's all on demand, which means that you cannot store the resources for your own editing or offline use.  However, this also means that you will always have the latest data.  When you open a map it will retrieve the latest images, including any new ones Microsoft may have published, directly into your maps.  This can offer a wealth of data if you have out-of-date imagery, or none at all, for your map content.
    I assume that Google and other map services will be accessible as well, but ESRI kept touting its partnership with Microsoft, so I'm a little hesitant to say for sure.
  • JavaScript API
    This has been a sore point with ArcGIS in the past few years.  As I said above, ESRI has really been playing catch-up.  Most of ESRI's online mapping products have been pretty bad: the UI design wasn't great and they were terribly slow.
    I don't know what the current tools are like (and usually ESRI demos are always running in a perfect world), but ESRI is starting to allow more options for connecting with data.  One of these is the JavaScript API.
    This API, on the surface, seems pretty similar to Google's or Microsoft's, where you specify a JavaScript file, the resource data and a div to place the contents into.
    When you publish a map to ArcGIS Server there are now several default options to consume the data.  When you go to the URL, ArcGIS Server now allows you to open the map in ArcMap, view it in the internal viewer, and view it using the JavaScript API, among others (KML export possibly, but not sure).  If you choose the JavaScript API option a new page is opened with a standard web 2.0 map using the ESRI API.  If you view the source you can see that there are only about 10 lines of code that actually retrieve and display the content.  If you simply copy this text you can paste it into your own apps and very easily add your interactive map resource to your pages.  Pretty nice indeed!
    I have to laugh here because the ESRI rep demoing this function turned a static (and very bad looking) JPEG of a campus map into a fully GIS-capable interactive map in about 1 minute.  The crowd cheered.  :)  As any HTML/JavaScript developer might know, there are a lot of underlying things being assumed, the first gotcha being to make sure your div is properly named either in your DOM or in the JavaScript code referencing the ESRI map resource.  This is of little worry for developers who understand what's going on, but I know there will be a few business users going back to their organizations saying "Do this, it only takes 1 minute!" and their non-web-savvy GIS engineer will be spending a day on it.
    Eh, maybe I'm just pessimistic, but you can see the marketing "woohoo!" all over these demos.  ESRI always operates their demos in a perfect world.  But so does everyone else (e.g. Microsoft). 🙂
  • Mashups
    OK, if you are a web developer and haven't been in a coma for the past few years, you should know what a mashup is.  In a nutshell, a mashup is simply a web page that takes data from one source (e.g. Flickr photos), combines it with another source (e.g. Google Maps) and displays the results (e.g. showing where the photos were taken on a map).
    John Grayson from ESRI's Applications Prototype Laboratory created a great tutorial with 7 different examples of creating mashups using ESRI data and JavaScript APIs.  Each one increases in its level of capability and complexity.  Unfortunately all the examples were based on retrieving/analyzing data and not on editing actual data for updating on the server.
    I can’t seem to find these slides or any information on John’s presentation anywhere so hopefully he will publish these soon.  Otherwise in my spare time maybe I can throw a few together.  (Yeah, when do I have spare time!  I stayed up to almost 4am last night!)
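Following up on the KML export bullet above: 9.3 map services are supposed to expose a generate-KML operation right over REST.  I haven't run this yet, so the operation name and parameters are from my notes, and the server/service/doc names are made up:

```csharp
using System;
using System.Net;

class KmlExportSketch
{
    static void Main()
    {
        // Hypothetical service; parameters as I jotted them down in the session.
        string url = "http://gisserver/arcgis/rest/services/Schools/MapServer/generateKml"
                   + "?docName=SchoolDistricts"   // document name shown in Google Earth
                   + "&layers=0,1"                // layer ids to include
                   + "&layerOptions=composite";   // one ground overlay vs. separate layers

        using (var client = new WebClient())
        {
            client.DownloadFile(url, "SchoolDistricts.kmz");
        }

        Console.WriteLine("Saved SchoolDistricts.kmz");
    }
}
```

Schedule something like that nightly and your Google Earth users always see live data without any hand massaging.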

Overall it was a great session.

I’ll be adding more posts throughout the conference on anything I see that’s noteworthy.  Those will hopefully be a much shorter read!

ESRI 2008 UC: Start up

Well, I’m here at the ESRI User Conference in San Diego, CA.

I'm the GIS admin at Val Verde USD and this is the 2nd time I've been here.  I have to say that this is one of the best conferences I've been to, not only in content, but in the actual logistics of the entire event.  The San Diego Convention Center is a great place, very easy to get around, clean, with lots of nearby restaurants of any food type.  Just the sheer number of people in attendance is a logistical nightmare, but everything is always very orderly and well handled.  Definitely a pleasure to attend.

Anyway, I’ll be writing a few posts here and there about what I think is noteworthy in the conference.

I'm a developer at heart and GIS is just an incredible extension of visualizing data.  That being said, I'm not really the green, bio-loving attendee that GIS usually caters to. 🙂  I love the geeky stuff, so I'll be mainly focusing on the new 9.3 server, SQL 2008 integration and automated development using .Net.  If you want to find out how to map the migration of the Blue Morpho Butterfly or track the degradation of the rainforests over the past 50 years then this may not be the blog for you. 🙂  However, if you're interested in .Net and GIS then stop by every so often and see if I have anything new for you.

So far I'm in the Plenary session waiting for the main event to start.  The music started out as a nice energetic techno beat; however, now it's getting a little aboriginal.  That's not a bad thing, it's just a little overplayed.  OK, now we're into more Last of the Mohicans style, which is definitely a great soundtrack if you haven't heard it.  OK, I guess that's potentially aboriginal as well, but it's definitely more grounded than the wooden flute stuff that was going on earlier.

OK, I’ll stop.  🙂  I’ll post again when something of actual content comes up.

Take Care!

“We have secretly replaced this user’s operating system with Vista. Let’s see what happens…”

It looks like Microsoft took a tip out of Folgers' marketing book.  🙂

This was pretty smart actually.  They took a bunch of Vista skeptics and asked them on video what they didn’t like about Vista.  MS got an earful.  Then Microsoft immediately showed them the new O/S, "Mojave". Most of the users were impressed and liked the direction MS was going.  Then the punch line: Mojave is Vista.

Once the users realized they were actually using Vista it helped to break down some of the negative feelings they had.

It’s a great idea, now Microsoft just has to figure out how to market this exact result to the masses.


As for me, I'm still an XP user.  I don't hate Vista; it's just that I see no reason to use it.  Upgrading from Win98 to Win2k was an obvious boost in features and administration.  Win2k to XP (especially when SP2 came out) was also an obvious choice.  I just don't see it with Vista.

I love flashy O/S's, so the UI really appeals to me.  But I'm also a speed freak.  I love my machines running at top performance and getting my work done.  I have yet to see a really nicely running Vista machine that takes as few clicks as XP does.  Granted, XP was no speed demon when it came out on hardware over 5 years ago, but it just runs so well now that it's hard to move unless I'm really getting a bundle of new features that affect my day-to-day work.

Give me my quad-core with 4GB of ram and throw XP on it please. I like VS 2008 loading up in a few seconds.  🙂

Microsoft looks to ‘Mojave’ to revive Vista’s image | Beyond Binary – A blog by Ina Fried – CNET News.com


iPhone 3G – Almost there

OK, I’m a die-hard techy and the iPhone is just sexy. :)  I don’t have one because I’m also a family man.  The wife tends to get a little upset when I buy "toys" and starve the kiddies.  🙂

This article is a fairly good review of the current state of the iPhone. 

PC World – First Reviews: iPhone 3G Improved, but Still Flawed

Let's keep one thing in perspective.  The iPhone has been out less than a year.  Even with the criticism, I still think it's one of the best-featured phones on the market, considering it's less than a year old and at version 2.0.

Of the current "flaws" it has, really only one is a long-standing problem, and that is the 3G coverage from AT&T.  That's really up to AT&T to fix.  I really hope they are working on this quickly.  The cost is also something to work out, but I don't really see it as much different from other top-selling smart phones.

The missing features are simply gaps in the software, such as no video capture, no voice dialing, etc.  These are all things that can easily be solved by a future update from Apple.  With Apple's track record I have no doubt that all these features (and plenty more) will be added before too long.

There is a rumor that next year there might be a slide-out keyboard.  This may not be as thin and sexy as the current iPhone, but you have to admit, that's what much of the business sector is waiting for.  It's what I want.  Research is showing that most business users don't want to give up their built-in keypads in the hope that the iPhone will work out for them.  Even though the iPhone has a great onscreen keyboard, it still isn't an actual keyboard with individual buttons.  Maybe when the slide-out keyboard comes along, the business users will jump on it, and then end up just using the onscreen keyboard anyway.  But they need the actual keyboard to help encourage the jump.  Think of it as the tipping point.

As new versions come out (probably at least one a year for now like the iPod) it will just get better and better.  The battery life will improve, the features will increase, etc.

I can’t wait till I finally get my hands on one.  🙂


Why Free Training Is Good

Back in the good old days, when MS Office was simply Word, Excel, PowerPoint and Access, we all knew what Office was and why we needed it.  It was similar with other market-leading programs such as WordPerfect, Lotus 1-2-3, Photoshop, QuickBooks, PeachTree, etc.

From my list of software above you may be trying to figure out what I mean by "way back when".  Let's pinpoint this time period as circa 1995.

The world was great.  Well, not great, but it was less confusing. :)  For the general consumer there were only a few operating systems to choose from, one or two office packages, etc.

Back then we knew we needed Microsoft Word or WordPerfect.  We knew we needed Excel or Lotus 1-2-3.  If we wanted to know how to use it we bought a book, took a class, or (gasp!) read the manual.  Yeah, manuals were actual paper books bundled with the software back then too.  This meant that companies could charge for in-depth books and training programs for their software.  It was the "if you build it, people will pay to learn how to use it" era.

Now come back to the present.  It's a lot more complicated.  Do I use Office Home and Student, Standard, Small Business, Professional or Ultimate?  What about this Open Office thing I keep hearing about?  What in the world are Groove, InfoPath and OneNote, and why do I care?  People keep telling me that OneNote changed their entire work process.  How?

Notice I’m just sticking to the MS family (except for the Open Office reference I threw in there).  It seems like with every new version Microsoft throws a new product or two into the Office family.  If I were to list all the non-MS alternatives this post would be a mile long.

So, what's Microsoft's answer?  As you've probably noticed, much of the Office website (and really any major Microsoft product site these days) is mostly devoted to helping you understand what each product does and how to use it.  It's still pretty confusing, but there is a wealth of how-to articles, videos, webinars, trial demos and sometimes even online interactive demos.  All free.

It's not like the world woke up one day and Microsoft suddenly realized they needed to convince us to buy their product.  That has always been the case.  It's just that now they are freely telling us how to actually use it.  That's great.

This is definitely more of an industry trend.  Most software companies offer free product demos, videos and training on their website.  In fact, if a new product can't show me in a 5-minute video why I would want to use it, I tend to move on.

It’s nice to be able to spend a half an hour in the morning beefing up on a new product.  I’m trying to set this as a weekly (perhaps daily) habit.  Currently I’m a OneNote training junkie.  🙂

ESRI’s ArcGIS still in the 80’s

This is an argument I've had ever since I started using the ESRI products almost 3 years ago.

ESRI is by far the leader in GIS software. They literally helped make the industry what it is today.

Unfortunately a lot of the code base is still back in the stone age (technically speaking).

The ESRI products are your typical “kitchen sink” set of applications. They do anything and everything for everyone. The software we use at our school district is the exact same software used by small business, big business, emergency services, utility services and military. The only differentiation is our data and plug-ins.

That's nice in one sense. I really like having the power of the "big guys" at my fingertips when I want to use it. However, that's a problem for customer service and developer support. ESRI makes money by developing the hot new features requested by their customers. When Google Maps came on the scene ESRI had to play catch-up really quickly to offer an AJAX-ready online product. They still haven't met the mark but they're trying.

Anyway, the problem is that no one (AFAIK) is maintaining the old code base. This means that the old bugs and usability issues are going untouched. Unless there is a bug or UI problem that is really significant it is put on the “do later” pile.

This is most evident in performance. We are getting ready to install ArcGIS 9.3 on a new set of servers when it is released in the next month or so. So at the last User Conference in San Diego I asked some of the ESRI techs what our server specs should be. They replied “lots and lots of RAM and a high-end CPU”. Notice that CPU was singular. The ESRI code base has been untouched other than bug fixes for years. Even when they moved to higher precision data storage in 9.2 they didn’t go back and update any of the original foundation code. The tech confirmed that ArcGIS products take no advantage of multiple cores or multiple CPUs at all. Ugh!

There are some performance gains if you use a SQL backend for your data or go through your web server, since these do have multiprocessor support. But as for actual GIS processing, no such luck.

This last comment is helping to fuel our interest in SQL 2008. If you know anything about me, then you probably know I’ve been playing with the new spatial features of SQL 2008 for several months now. ESRI is going to support SQL 2008 when it launches. Hopefully this means that ESRI will be pushing a lot of the processing work back onto the native SQL platform rather than on my desktop.
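To show why I'm excited, here's the kind of query SQL 2008 handles natively. The Parcels table, column names and coordinates are hypothetical, but STDistance and geometry::Point are the real 2008 spatial methods:

```csharp
using System;
using System.Data.SqlClient;

class SpatialSketch
{
    static void Main()
    {
        // Find the 10 parcels closest to a point, within 500 units.
        const string sql =
            @"SELECT TOP 10 ParcelId,
                     Shape.STDistance(geometry::Point(@x, @y, 0)) AS Dist
              FROM Parcels
              WHERE Shape.STDistance(geometry::Point(@x, @y, 0)) < 500
              ORDER BY Dist";

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=GIS;Integrated Security=true"))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@x", 6231000.0);  // coordinates in the
            cmd.Parameters.AddWithValue("@y", 2250000.0);  // layer's own units/SRID
            conn.Open();

            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1:F1} units away",
                        reader["ParcelId"], reader["Dist"]);
                }
            }
        }
    }
}
```

If ESRI pushes this kind of work down to SQL Server, the multi-core problem largely takes care of itself, since the database engine already scales across processors.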

We’ll see and I’ll keep you updated as we progress.

If you’re going to the ESRI User Conference this year drop me an email and we can meet up.

ReSharper 4.0 Released!

It’s official!!!!  ReSharper 4.0 has just been released.

This is by far one of the best (and coolest) tools available for Visual Studio.  If you have never heard of, or had a chance to experience, ReSharper then you owe it to yourself to click the link below and check it out.  This tool has made me a better developer both in quality and performance.  I’m definitely a student of TDD, refactoring, patterns, etc.  ReSharper makes using these practices so incredibly easy it just becomes a part of your work.

ReSharper is one of those tools that when you work on someone else’s machine you feel like you’re in the stone age.  Or if you see a presentation and the presenter doesn’t have ReSharper you constantly think "Come on man!  I could have coded that in half the time!"

There’s no way to do it justice here.  Just check it out.

I've been using JetBrains software since IntelliJ, back when I was a Java programmer.  Every Java dev has their favorite IDE and IntelliJ was mine.  I was sold the instant I ran it.  When I moved over to .Net, honestly, one of the biggest sore points was losing IntelliJ.  Now with VS 2008 and ReSharper I feel at home again.

Boy, this really sounded like a commercial, huh?  Sorry about that.  I'll try and keep it to a minimum next time.  But if you're a ReSharper user already I'm sure you understand. 🙂

ReSharper:: The Most Intelligent Add-In To Visual Studio
