Maintain Your Product’s Relevance with a Data Management Strategy

By: Robert Fleming

It is that time of year again: the Esri UC is behind us, and we GIS folks have all these great ideas to chase down.  Maybe you saw a cool map in the gallery that highlights data in a unique way, or went to a session about an application your users could really benefit from.  Whatever it is, the fire has been lit and we are ready to run with it.  Great!  That is what these events are all about – sharing ideas, learning about new techniques, and finding solutions.  Now hit the brakes…

All too often, we come out running and forget that we have focused only on the final product.  That map you admired was not just a few hours of work to make it look great on glossy paper.  That application was not just a few developers and designers building a great interface for the public.  Each represents weeks, months, or even years of building great data that analysts and developers expose through their products.  Don’t get me wrong, the talent on display at these events is amazing, and we need great maps and apps to share our data.  However, if the data is no good, who will want to use our maps and apps?  What value can our products have then?

As you develop your next project, spend some time thinking about the data.  Plan to foster an environment where data management is central to the strategy.  How will the data be stored?  How will it be maintained?  How will you ensure its quality?  These questions are a starting point for thinking about a data management strategy and how it can keep your product relevant.

Data Standards

All projects start here, and while database design is rarely left out, it is important to get this step right at the beginning.  What data model should be used: is the existing model good enough, or should a new industry model be implemented?  The answer will vary depending on your organization and project needs.  It is important to understand the end goal, which will help define the requirements and drive the database design.  Designing your database correctly early on ensures all relevant data has a place in the model.  There is nothing worse than having to review every record again because you had to add one more attribute.

Data Maintenance

Many projects use the most current data available or collect new data once during the project.  This might suffice for the initial deliverable, but the data becomes stale the day it is published, and from that day forward the value of the product decreases.  How can you keep your product relevant?  Build data maintenance into existing business processes.  Those processes are often carried out by the staff of the group you built the product for, so include that group as a key stakeholder in your project.  Get their buy-in and discuss altering their business processes to include a GIS data maintenance step.  If you help them see the value of your product, this will be a welcome addition to their processes.

Data Quality

Designing a sound data model and keeping your database current are big steps toward making your data relevant.  If it meets their needs, people will want to use your product and rely on it for answers.  But what happens when they find errors in the data?  Errors are going to happen; there is no way around it because we all make mistakes.  How you handle those errors will define your data’s relevance and may even make it the authoritative source.  Plan regular quality control checks to ensure the data meets a defined standard.  These should include both automated checks for schema discrepancies and hands-on checks to verify content quality.  You may also include a method for users to submit errors they have found.  Implementing data quality checks will help establish your dataset as a trusted source.
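To make the automated side of this concrete, here is a minimal sketch of what a scheduled schema-and-content check might look like. The field names, types, and domain values below are purely illustrative, not a prescription for any particular data model.

```python
# A minimal sketch of an automated quality-control pass over a simple
# list-of-dicts dataset; fields and rules here are illustrative only.
EXPECTED_FIELDS = {"parcel_id": str, "zoning": str, "acreage": float}
VALID_ZONING = {"R1", "R2", "C1", "I1"}

def check_record(record):
    """Return a list of problems found in one record."""
    problems = []
    for field, ftype in EXPECTED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            problems.append(f"bad type for {field}")
    if record.get("zoning") not in VALID_ZONING:
        problems.append(f"unknown zoning code: {record.get('zoning')}")
    return problems

def run_qc(records):
    """Map record index -> problems, for records that fail any check."""
    report = {}
    for i, rec in enumerate(records):
        probs = check_record(rec)
        if probs:
            report[i] = probs
    return report
```

A report like this can be reviewed by the hands-on QC step described above, or fed into the error-submission workflow.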

So if your data is incomplete, outdated, or full of errors, your product will not be relevant no matter how great the idea was.  Remember, when people use your product, it is the data and the story you can tell with it that they are truly interested in.  Make data a priority and stay relevant.


Mowing your Own Lawn

By: Chris Fricke, Solutions Engineer

“They say that something like 90% of the world’s information is spatial.”

This is usually the quote I use when I feel I need to explain to people what I do for a living.  I then go into how I help municipalities and local governments across the country develop and publish their data out to their local constituents.  It makes everyone feel really good and usually distracts from any Buster Bluth references.

The theory that most data is spatial has opened new doors across business units in local government.  Focused apps like My Government Services, the Tax Parcel Viewer, and the Executive Dashboard have set an expectation that GIS applications will display not only where, but also what, why, and how.

However, I feel the quote is missing a key element: 90% of the world’s information is not spatial, it is spatially related.   Case in point: the assessor information joined to your tax lots in the Tax Parcel Viewer is not spatial data; it is business data, and it probably should not be managed by the GIS folks.

For this reason I would like to introduce the concept of Mowing Your Own Lawn.

Mowing your own lawn is a strategy for keeping data stewardship with the departments that own the data. Instead of maintaining tons of attribute information, GIS should instead focus on just the spatial component.  GIS should then provide a mechanism for non-GIS folks to maintain their attribute information or link to external resources.

Here are a few tips and tricks for mowing your own lawn:

Directly pull from other business systems

At ArcMap 10.0 Esri introduced query layers, and at 10.1 tables with native geometry types became pretty transparent to the end user.  This means that any data with an X and a Y can be pulled directly from its source.  Instead of duplicating 911 or other incident data into your GIS database, you can simply create a query layer in ArcMap that references the other database.  This keeps GIS from having to maintain security on data that should be maintained by E911.
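The underlying idea can be sketched outside ArcMap too: query the source system’s table on each request and build points from its X/Y columns, rather than copying the rows into your own database. The in-memory "incidents" table below is a hypothetical stand-in for an external E911 database.

```python
import sqlite3

# Sketch of the query-layer idea: read point data straight from the source
# system's database on demand instead of duplicating it into the GIS database.
def load_incident_points(conn):
    """Build (id, (x, y)) tuples directly from the source table."""
    rows = conn.execute(
        "SELECT incident_id, x, y FROM incidents WHERE x IS NOT NULL"
    )
    return [(incident_id, (x, y)) for incident_id, x, y in rows]

# In-memory stand-in for the external (E911-owned) database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE incidents (incident_id TEXT, x REAL, y REAL)")
conn.executemany(
    "INSERT INTO incidents VALUES (?, ?, ?)",
    [("INC-1", -86.80, 33.52), ("INC-2", -86.75, 33.50), ("INC-3", None, None)],
)
points = load_incident_points(conn)
```

Because nothing is copied, the owning department keeps control of security and maintenance, and the "layer" is only ever as stale as the last query.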


Join to other business systems

Assessing data is always a huge pain.  Usually by the time the data makes it through the extract, transform, and load (ETL) process into the GIS database, it is already out of date!  Then the data might sit there for a week until the next ETL run updates the Tax Parcel feature class.

Wouldn’t it be great to create a live link to the assessor’s database, with a spatial view representing the join between assessing data and spatial data?  At 10.1 you can do this easily with the ability to create spatial views through ArcCatalog.
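The view concept itself is plain SQL. Here is a rough sketch using sqlite as a stand-in: the GIS side keeps only geometry keyed by parcel number, the assessor keeps the business attributes, and a view joins them live, so there is no ETL lag. Table and column names are illustrative.

```python
import sqlite3

# Sketch of the spatial-view idea: geometry and business attributes live in
# separate tables owned by separate departments, joined live by a view.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE parcels (pin TEXT PRIMARY KEY, geom_wkt TEXT);
    CREATE TABLE assessor (pin TEXT PRIMARY KEY, owner TEXT, assessed_value REAL);
    -- The "spatial view": always reflects the assessor's current values.
    CREATE VIEW parcel_view AS
        SELECT p.pin, p.geom_wkt, a.owner, a.assessed_value
        FROM parcels p JOIN assessor a ON a.pin = p.pin;
""")
conn.execute("INSERT INTO parcels VALUES ('001-23', 'POLYGON((0 0,1 0,1 1,0 1,0 0))')")
conn.execute("INSERT INTO assessor VALUES ('001-23', 'J. Smith', 150000.0)")

# An update on the assessor side shows up immediately -- no ETL run needed.
conn.execute("UPDATE assessor SET assessed_value = 162000.0 WHERE pin = '001-23'")
row = conn.execute("SELECT owner, assessed_value FROM parcel_view").fetchone()
```

A real spatial view in an enterprise geodatabase works the same way, just with a true geometry column instead of the WKT text used here.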


Create focused attribute editor apps

Often the data we would like to display simply does not exist in a format GIS can use.  This calls for getting the folks with domain expertise to populate the data for us.  A focused application that only allows users to edit attribute information is a straightforward way to collect this information without a ton of training.

Need someone to fill out park information?  Roll out a simple attribute-editing web app that lets park personnel record what amenities are available and when the parks are open.  BAM!  It automagically updates in the locator app open to the public.


So yes, 90% of the world’s information is spatial, but it doesn’t mean we have to maintain it all!

Gettin’ Yammered!

By: Chris Judd

Here at GISi, we get Yammered.  Some of us even do it during work hours – and it’s not against company policy; in fact, it is encouraged.  As part of our commitment to Service Excellence, we need to communicate and let each other know when we are doing well.  This is the premise behind my latest and first project here at the company, the Peer Recognition System.

Yammer Look and Functionality

Yammer describes itself as “a private social network for your company”.  Think of it as having the layout of Facebook and the “following” feature of Twitter, but with the universe of people limited to a specific company.  As you can see from the screenshot below, the GISi Yammer page bears a strong resemblance to Facebook, sporting a similar blue and white color scheme:


Yammer as an Organizational Development Tool

A central goal of GISi is “To Be the Premier GIS Firm to Work For and With”.  As a company and as employees, we are committed to our organizational development.  Part of this strategy is to facilitate feedback, not just up and down the management chain but from peer to peer.  This recognition and acknowledgement of our good work goes a long way toward building and improving relationships inside the company.  A couple of years back, Yammer introduced its “Praise” feature for recognizing individuals or groups of people in the company.  The Praise interface is very similar to posting a normal status update on Yammer: you enter the person or persons you want to praise, add a little note about why, and assign topics/hashtags to list the specific skills or products you are praising them for.


The Peer Recognition System is Born

The employees in our company started using the Praise feature immediately after it was released!  It was decided that we needed to aggregate all of this good Karma so it could be tallied and reported analytically.  This is where I came in: I was given the assignment to harvest all of the kudos/Karma activity and display it on each employee’s profile page on our company intranet.

Putting the technology together really wasn’t very difficult, thanks to Yammer’s REST API.  Here are the steps that I took:

  • Register a new application with Yammer.
  • Get authenticated: Yammer has a very detailed tutorial on using OAuth 2.0 to get access to the data.
  • Use the Yammer REST API to cull out the praise data.
  • Create a database structure to persist the Yammer data.
  • Create a simple ASP.Net web page to display the collected data.
  • Use the Content Editor Web Part (CEWP) to insert the ASP.Net page into SharePoint as an iframe.

We broke the data down by the topic (or hashtag) a person was praised for, inserted the praiser’s profile picture, and added a count of the Karma given and received – the Karma Counter.
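The tallying step can be sketched in a few lines. The payload shape below is a simplified stand-in for the harvested praise records, not the exact Yammer REST API response format.

```python
# A sketch of the Karma Counter tally over simplified praise records.
from collections import Counter

def tally_karma(praises):
    """Return per-person given/received counts and per-topic totals."""
    given, received, topics = Counter(), Counter(), Counter()
    for p in praises:
        given[p["from"]] += 1
        for person in p["to"]:
            received[person] += 1
        for topic in p["topics"]:
            topics[topic] += 1
    return given, received, topics

praises = [
    {"from": "chris", "to": ["dan"], "topics": ["javascript"]},
    {"from": "chris", "to": ["dan", "tim"], "topics": ["teamwork"]},
    {"from": "dan", "to": ["chris"], "topics": ["javascript"]},
]
given, received, topics = tally_karma(praises)
```

The real system persists these counts to a database and renders them on the intranet profile page, but the aggregation logic is this simple at its core.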

Final Product

Now it’s easy to see who has a stash of Karma in the company.  We can just browse over our company intranet profile and admire all of the good Karma we have built up or given out.  Check out the final results, looks pretty good, don’t you think?


Esri Dev Summit Wrap-Up

By: Dan Levine

Now that we have had the better part of a week to decompress from the Developer Summit and Partner Conference, I wanted to share some final thoughts from the GISi team that attended.  Even though I am more of a Jimmy Fallon fan, I will use the Letterman Top Ten format (I am not quite creative enough to put these into Thank You note form).

1. Emphasis on Design UI/UX

This topic was introduced by the Keynote Speaker Jared M. Spool, who was hilarious while presenting a great topic. You can check out the video here.

As we transition from developing for the GIS user to the non-GIS user, user experience expectations have changed, and we need to be more conscious of how we meet them.  Spool’s talk gave us a framework for doing that.  The UI/UX theme ran through a number of “best practices” technical sessions and even seemed to spread into best practices for code structure.  All of the Esri code on display was well written, organized, and followed best practices – even the dumbed-down examples, at the cost of a few extra lines of code.

2. Esri Stack Alignment – the Platform

OK, we get it, it’s a platform.  Yay, it’s about time.  Seriously though, it is quite refreshing that the Esri stack now really looks like one consistent piece of software, built to industry-standard specs – that’s why we can call it a platform.  You could see Esri building toward this over the last few years, and it clearly didn’t happen without some pain and effort on their part.  Way to go, product teams and visionaries at Esri!

3. On a related note – New software Integration into the Stack

Esri has definitely gotten much better at integrating software it has acquired – note the GeoEvent Processor, Esri Maps for IBM Cognos, GeoTriggers, and CityEngine, to name a few.  It used to take years for Esri to truly integrate newly acquired software (remember Schematics?).  Now, outwardly at least, there seems to be a process and consistency to incorporating new components into the stack quickly.  The maturity of the stack surely makes this easier.  I am anxious to see how they do developing the new 3D capabilities with several teams scattered across the globe.

4. AGOL is getting pretty Deep

The central component of the Esri stack, AGOL (and Portal for organizations), is getting richer by the day.  There has been an incredible investment in the types of data becoming available – increasing resolutions of imagery and elevation data and the effort to consolidate natural resources data in the Landscape Services are just a few examples.  The application side is also getting deeper and wider.  The addition of the GeoEnrichment component, exposing Business Analyst data and reports in AGOL, is a nice touch.  More importantly, adding SAML2 to the security component makes solutions built with AGOL more palatable to organizations that require single sign-on – SAML2 lets you leverage the corporate AD/LDAP security infrastructure instead of requiring Esri Global Account credentials.  This capability is being pushed out to all of the SDKs that tap into AGOL.

Esri is still working out the business models and infrastructure that would let us develop apps that leverage AGOL with some sort of charge-back ability.  How are AGOL credits accounted for?  Adding an application ID tag and an AGOL organizational ID tag is going to help.  Two current models are leading the way.  In the Apps for Organizations model, you publish an application and use the app ID to track how many credits that application consumes; the next step would be a user ID, so you could track who is using a particular app and how many credits they are using.  The other model is more of a consumer solution: we create an application and make it available to the general public, the owner of the application pays for the credit usage, and it is up to them if and how they charge users for the credits consumed.  Esri is figuring this out, as it will be key to rolling out the Marketplace.

5. The “Geo-” prefix is still cool…but can lead to Geo-Confusion

For example, what is the difference between GeoEvents and GeoTriggers?  Stand by for a geo-whitepaper – Amber Case promised.  There had been some movement away from the Geo prefix starting last year with the Location Analytics team, and for good reason: these are the folks building solutions embedded in the larger enterprise IT stack, and the C-level people who manage those stacks don’t generally speak Geo.  But there seems to be a yin and yang going on within Esri about the terms.  You could see it in the session titles: “Unleash the Power of Mobile Location in Your Application” and “GeoTriggers” were basically the same presentation by the same people.  Conclusion: Geo is still cool, just be careful what you say around Art Haddad.

6. ArcObjects is NOT going away.

Last year the big thing was the runtimes.  Everyone was trying to get up to speed on them: the architectures, what functionality they had and would have, and so on.  It seemed like we were being encouraged to move away from ArcObjects and jump on the Runtime bus.  By the end of that week, many of us wondered what was going to happen to ArcObjects.  Was Runtime replacing ArcObjects?  How long did we have to migrate our apps?  Finally, during the closing lunch Q&A, someone asked.  Jim McKinney and Scott Morehouse didn’t make us feel any better; they clearly weren’t prepared for the question, and maybe the answer they gave was what they were thinking at the time: yes, Runtime will be replacing ArcObjects, but it will take several years (paraphrasing).  This year both Jim and Scott were ready for the very same question.  The answer, unequivocally: “ArcObjects is NOT going away.”  It is at the heart of the most robust desktop GIS software in the world and will continue to be.  Thanks for the clarity!  I heard Neil Clemmons give a sigh of relief all the way back in Birmingham.

7. Run-Time Disconnected Editing is Almost Here

Many of us are being asked to develop mobile data collection apps that can stand being disconnected for periods of time.  The runtime SDKs currently don’t support that workflow out of the box.  Some of us have built workaround solutions specific to our particular use cases, but the core products don’t have this capability.  It’s coming with the 10.2 release in the form of the “Synchronization Engine” – that was the message in the plenary and in the tech session.  But you got the feeling it might not quite be ready with the initial 10.2 release.  Fear or uncertainty in the eyes of the product teams?  I don’t know, but my spidey senses were going off.  It feels like one of those things that will land in the first update soon after the main 10.2 release.  That’s fine by me – just keep us informed of what is realistic, and get it right before you get it out.  This is a huge deal for many of us.

8. ArcScripts is Back … Can you say GitHub

Social coding is permeating GIS development.  Sharing and contributing to open-source software is becoming more and more mainstream, and Esri’s adoption of Git and GitHub as its tools of choice for sharing source code strengthens the importance, flexibility, and capability of social coding in general, and of Git and GitHub in particular.  This is pretty important for those of us extending the starting-point solutions that Esri publishes: it lets us create a branch, add our secret sauce, and then choose whether to share it back to the community.  Don’t expect everything to be shared, though – the Military Solutions team has already posted the source code for many of its solutions, but it simply can’t do that for all of them due to ITAR issues.  In any event, this is a very nice use of mainstream technology for sharing solutions in a really useful way.  We are currently evaluating our own internal source control, and this GitHub movement by Esri may influence what solution we go with.

9. ArcGIS Desktop integrated viewers and multiple Layouts almost here, that’s crazy talk.

Welcome back the ability to have multiple layouts in a project – it has been a long time coming, but we still have a while to wait.  Promised for the 11 release (maybe this calendar year), we will be able to have multiple 2D and 3D viewers in the same application.  This is something we have had to build custom for several clients.  Now we can plan to retire those custom solutions in favor of out-of-the-box capability and spend our time on what to do with that information.

10. Dodge ball is still the big draw.

I don’t know about the rest of you, but is there anything else going on at the Wednesday night social?  This year they had to go to three simultaneous courts.  We fielded two teams but still couldn’t get past the quarterfinals.  Dang.  I thought for sure we would at least win best uniforms.  You tell me.  The winning team: bright orange tees with the words “Flying Dodgmen.”  OK, they were a Dutch team wearing their country’s colors.  We had bright orange tees (road-worker orange) with zombies on the back.  Waaay better graphics!  What gives?  I am starting to think there is a dodgeball conspiracy against GISi after we took out one of the Esri teams in the first round last year.  Come on, man!

GISi Dodgeball
In conclusion

This has always been, by far, the best technical “show” that Esri does, and I evangelize attendance to our staff and anyone else’s staff for that matter.  That said, this year it felt like it dropped off a bit in technical depth.  I can’t tie anything specific to this sense, but our entire team felt it, and I heard similar comments in a number of conversations with attendees who aren’t part of the GISi family.  Some felt a lot of it was “salesy,” or that the Platform message was pushed so hard across the board that there wasn’t as much deep diving into the code.  In the past, a tech session presentation would start with a title slide and then we would be looking at code.  This year it was a title slide, how X fits into the Platform, a reminder of what the Platform is, where to find this on GitHub and the site, and then a bit of code.  I get that the messaging is important – there is a shift in Esri’s philosophy on what it is creating and promoting, and even who it is promoting it to, and that needs to be communicated.  All that aside, I am talking a B for the show versus the consistent A to A+ it scored in the past.  We will be there next year, maybe with enough people to field three dodgeball teams.

Leap(Motion) Into ArcGIS

by: Christopher Bupp

Esri is hosting a 100 Lines or Less ArcGIS JavaScript Code Challenge … and while that is a mouthful to say, the important word is “Challenge.”  My response, of course, is: “Accepted.”

For the last few months I’ve been tinkering with a developer model of the Leap Motion device. So for my submission, I decided to put a couple awesome things together.

I experimented with several gestures to interact with the map, but I settled on the following:

  • Circle gesture – zoom to extent
  • Tapping/poking – center the map
  • 1–2 finger swipe – pan the map
  • 3+ finger swipe – zoom out
  • Pointing at the screen – display the point location with latitude and longitude
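The gesture-to-action mapping above boils down to a small dispatch function. This is a hedged sketch in Python pseudocode terms; the gesture names and action strings are illustrative, not the Leap Motion API’s own types (the real submission is JavaScript).

```python
# Sketch of dispatching recognized gestures to map actions.
def map_action(gesture_type, finger_count=0):
    """Translate a recognized gesture into an illustrative map action."""
    if gesture_type == "circle":
        return "zoom-to-extent"
    if gesture_type == "tap":
        return "center-map"
    if gesture_type == "swipe":
        # One or two fingers pan; three or more zoom out.
        return "pan" if finger_count <= 2 else "zoom-out"
    if gesture_type == "point":
        return "show-lat-lon"
    return "ignore"
```

Keeping the dispatch table this flat is part of how the whole app fits in 100 lines.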

The latest versions of the Leap Motion JS API are pretty good at recognizing gestures. The tapping/poking seems to be the hardest one to recognize. The circle gesture and swiping are almost always recognized.

The current version of the Leap Motion JS API doesn’t recognize a skeleton. So fingers tend to get merged and disappear from view. I’ve heard that the next versions of the SDK will be adding this functionality.

Something the Leap Motion JS API doesn’t yet support is screen location, so I used 27 of my 100 lines to create a “very basic” calibration.  The calibration assumes the Leap Motion is below and parallel to the screen.  A really cool consequence of this basic calibration is that interactions off screen still get translated to map points: you can circle or poke off-screen, and the map will still go to the desired extent.
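The essence of such a calibration is just a linear map from device coordinates to screen pixels, fitted from reference samples. Because a linear function extrapolates, inputs outside the calibrated range still produce (off-screen) pixel positions, which is exactly the behavior described above. The numbers below are hypothetical, not taken from the actual submission.

```python
# Sketch of a one-axis linear calibration from device space to screen pixels.
def make_axis_map(dev_a, px_a, dev_b, px_b):
    """Return f(device_coord) -> pixel, fit through two calibration samples."""
    scale = (px_b - px_a) / (dev_b - dev_a)
    return lambda d: px_a + (d - dev_a) * scale

# Hypothetical calibration: device x of -128 maps to pixel 0, +128 to 1920.
to_px_x = make_axis_map(-128.0, 0.0, 128.0, 1920.0)
```

One map per axis (plus picking the calibration samples interactively) accounts for the 27 lines the post mentions.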

My code is hosted on GitHub.  Here is the code breakdown (out of 100 lines):

  • Setting up the map (4 lines)
  • Calibration (27 lines)
  • Detecting and drawing the finger tips with lat/lon (17 lines)
  • Detecting Gestures and interacting with the map (24 lines)
  • Creating output messages (10 lines)
  • Detecting leap motion support (2 lines)

Which means 16 lines were used for variable declaration and closing curly braces.

Since the competition ends March 28, the judges will likely not have a Leap Motion in time, so I’ve created a demo video of my code in action.

If you happen to have a Leap Motion, you can download my code from GitHub and run it by simply opening /LeapIntoArcGIS/index.html in your browser (doesn’t support IE).

For anyone interested in entering the competition, simply:

  1. Fork the competition repository
  2. Make code changes
  3. Submit with a pull request.

Fresh Thoughts on Data Management

By Jonah Adkins, GISP

Over the last few years, the wide availability of map and geo-data services, helped along by mobile technology, has sparked a “geo-boom”: any and all data can be tied to location using a variety of technologies, making data management more important than ever.

GIS professionals have long been familiar with data management practices.  The landfills are full of floppy disks, CDs, and DVDs of coverages, shapefiles, and geodatabases shared between localities, bought from vendors, or supplied by software companies.  Growing up in this industry, there was a two-sided badge you got to wear as a GIS professional.  The front was a badge of honor for your data-hoarding skills: “How big is your database?”  “Where did you get that dataset?”  “You have so much data you had to buy a server!”  The other side of that badge said Sheriff in big letters: “Why do you need MY data?”  “It’s going to cost you.”  “Can I get that formal request in triplicate?”  The badge was also a curse, because all GIS pros had one, so good luck getting data from a suspicious colleague – hours lost to phone calls with protective data mothers, answering a litany of questions and filling out paperwork, all to show the buildings of an adjoining county on a map.  Our niche community of GIS technologists has since ballooned into a billion-dollar-a-year industry with a bevy of new catchphrases like “big data” and “location analytics,” and the need for your two-gigabyte geodatabase is dwindling in favor of “point me to your map service.”

To navigate the “geo-boom,” GIS professionals must adapt to a dizzying array of technologies, which makes managing your data all the more important.  Note the emphasis on your: being an authoritative data source comes with responsibility.  Your data is worth the time and effort to ensure it is properly formatted, free of errors, and a true reflection of your business – can it be considered authoritative without those attributes?  To curb the old data-hoarding habits, leave data that isn’t yours to the professionals who own it.  By taking the time to research who the authoritative source for a needed dataset is, you can prevent costly duplication and replication of data.  Why spend resources on building tile caches and storage if the authoritative source already has?  And if they haven’t, start a dialogue with them – chances are you are not the only one who needs it.  There are countless cost-effective options for making authoritative data available.

While healthy protectiveness of your data is honorable, that Sheriff’s badge you wear is starting to rust.  You are sorely behind the curve if your data is not available to consumers, or if you have no plans to make it so.  Potential partnerships, new consumers, and innovation are just a few of the benefits of making your data “open.”  You may never fully understand the importance of your data until you make it available for all to use.  Granted, some datasets are sensitive, data security exists, and yes, some businesses would like to profit from their tireless work of creating data – but these points still apply, even more so for pay-to-play datasets.

Regardless of the data model, storage method, DBMS, or software, your data is ultimately a reflection of your business.  If you cannot successfully manage your business, you are doomed to fail; likewise, without proper data management values, the usefulness of your data will fade.  One final thought: our niche community of GIS professionals still exists.  We have endured countless changes in technology, and we will survive this “geo-boom” like the rest – as a community of professionals dedicated to putting it on a map (paper, digital, or otherwise).

Tell me a Better Story – The Impact of Location Analytics on Business Intelligence

By Tim Calkins, Market Manager, Financial Services


There is an old Native American Proverb that goes like this –

“Tell me a fact and I will learn.
Tell me a truth and I’ll believe.
But tell me a story and it will live in my heart forever”.

Business Intelligence has matured into a mainstream application because it takes the two-dimensional spreadsheet world and uses the data to tell a story.  Users can visualize trends, quickly picture their organization in dashboards, and use indicators to traffic-light key performance metrics.

Business Intelligence has become even more effective and predictive with the use of business analytics.  However, a chapter has been missing from the Business Intelligence story, and that chapter is “Location.”  Location analytics is the next big thing in the world of BI.  In fact, a good argument could be made that location analytics enriches business intelligence the way business intelligence enhanced spreadsheets.

What is Location Analytics?

Location analytics is the ability to draw accurate conclusions from data assembled from a variety of population and demographic data sources.  Combined with GIS (Geographic Information Systems) mapping tools, it helps organizations predict patterns and emerging demographic trends.

Location analytics “geo-enables” data.  It is more than points on a map: location analytics involves spatial layers the way an OLAP (online analytical processing) cube involves multiple dimensions.  These spatial layers can be combined or “mashed up” so that the combination marries the data into a common framework.

Location, the Great Integrator – Data Integration Management

Data layers can come from a variety of disparate sources, both internal and external.   More than 60% of data has a geo-reference component and is therefore easy to geo-enable.   Just think about how much of your data is tied to an address or location.

An example would be customer addresses layered with census data on population, income levels, and age – that’s four layers right there (see Figure 1).  Now you can get answers to questions like: “What’s the average distance of customers from your point of sale, by drive time or by distance?”  You can also add more layers as needed, and since the layers all deal with location, the data integrates seamlessly.
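The "average distance from the point of sale" question above is easy to sketch with straight-line haversine distance (drive time would require a street network). The coordinates below are illustrative only.

```python
import math

# Sketch: average straight-line distance from a store to its customers.
def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def average_distance(store, customers):
    """Mean distance in miles from store (lat, lon) to customer (lat, lon) points."""
    return sum(haversine_miles(*store, *c) for c in customers) / len(customers)
```

Layering the census attributes onto each customer point is then a join on location, as the paragraph describes.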


Real World Examples

Location analytics can tackle decision-support problems that business intelligence, void of analytics, cannot.  Here are a few examples of common problems being answered with a GIS solution.

Risk Management

GISi is currently working with a financial services company to help implement location analytics.  The first project was to pinpoint their customers and overlay those locations with major storm and natural disaster data.  This will allow them to avoid fraudulent claims and, more importantly, to reach out to customers in the area of impact to provide exceptional customer service.  The solution lets them pinpoint a specific affected house or business, so they can offer one-on-one assistance to their customers in the time of greatest need and build a stronger bond with them.

Other organizations use Location Intelligence to determine the risk exposure of assets, clients or facilities.  For example, they can quickly get answers to questions like:

  • What assets are located at 20 feet or less above sea level?
  • Which structures are within 25 miles of a nuclear plant?
  • How many foreclosures are within a five-mile radius?

This type of analysis is very difficult with standard business intelligence solutions.
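In spirit, each of those questions is a spatial filter over an asset inventory. Here is a small sketch; the assets are dicts with illustrative fields, and the distance function is a crude flat-earth approximation (about 69 miles per degree) just to keep the example short, where a real solution would run these as spatial queries in the GIS.

```python
# Sketch of risk-exposure filters over a hypothetical asset inventory.
def flat_miles(p, q):
    """Crude planar distance in miles between two (lat, lon) points."""
    return 69.0 * ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def assets_below_elevation(assets, max_feet):
    return [a["id"] for a in assets if a["elevation_ft"] <= max_feet]

def assets_within_radius(assets, center, radius_miles):
    return [a["id"] for a in assets
            if flat_miles(center, (a["lat"], a["lon"])) <= radius_miles]

assets = [
    {"id": "plant-a", "lat": 30.0, "lon": -90.0, "elevation_ft": 8},
    {"id": "branch-b", "lat": 30.3, "lon": -90.1, "elevation_ft": 42},
    {"id": "warehouse-c", "lat": 31.5, "lon": -90.0, "elevation_ft": 15},
]
low = assets_below_elevation(assets, 20)
near = assets_within_radius(assets, (30.0, -90.0), 25.0)
```

The point is less the code than the shape of the question: attribute filters and distance filters composed over the same location-keyed records.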

Site Selection

Do you want to know where to put a store, ATM, hub, or building?  Retailers and franchises have been among the biggest proponents of GIS – location is a key to success or failure.  Site selection tools help you visually analyze and determine the locations and market areas that would be most favorable, based on specific conditions and variables.

GISi, in conjunction with our partner Intalytics, has developed a custom site selection tool called SiteIntel.  It uses predictive models to estimate sales performance by location and lets the user interact with those models in a map-based environment.  A user can see Point of Sale (POS) history, demographics, site photos, aerial views, sales forecasts, and other information such as:

  • The optimal site for the next retail/restaurant/ATM location
  • What impact the competition has on sales potential
  • Optimization (number of locations a market will support)
  • Forecasted sales for each potential location, including revenue with cannibalization (net new revenue)



This type of decision support and analytics would not be possible without Location Analytics.

Get Your Users Involved 

Now for a Chinese proverb:

Tell me and I’ll forget
Show me and I’ll remember
Involve me and I will understand.

Location analytics engages the user and displays data insightfully for better decision-making in ways Business Intelligence alone never could.  It involves the user with interactive maps: users can pan, zoom, select, and visualize, and the maps are integrated with the tabular data as well.  Want to see the sales performance of a specific region?  Draw a circle on the map, and the application will calculate the sales data for the selected region.  These intuitive tools help your users better understand their data, trends, and ultimately their business.


Most people understand the intuitive and informative nature of geospatial information.  Jack Dangermond, founder and president of Esri, states it this way: “Maps are a kind of language. We have text as a language, we have music as a language, we have mathematical languages, and we have software as a language.  Maps are a language, and their power is that they communicate intuitively to people.  You can look at a map and see the context as well as the content of a situation and the actual phenomenon that is occurring.”

It is only a matter of time before location analytics becomes widely adopted in the Business Intelligence world.  Cognos, MicroStrategy, SAS, and the other BI players are integrating GIS directly into their platforms.  They understand it provides an enriched, more intuitive decision-support environment and richer visualization.

If you want your BI dashboards to tell a better story, try using location analytics.

About the author – Tim Calkins is the Market Manager, Financial Services for GISi and their subject matter expert for business intelligence.  GISi is an award-winning GIS professional services firm located in Birmingham, Ala., with offices throughout the United States. GISi has a passion for delivering customer driven location technology solutions to federal, state and local governments, and commercial organizations.