Happy New Year and a happy #MapInfoMonday!
I decided it is time to take on a challenge that was passed on to me during a couple of our User Support Forums. This question has been asked twice and both times I said that I would write an article explaining how to do it.
Here's the question: "Will you be adding a join statement to time series? So that you can colour polygons by points that are filtered by time."
I think it was phrased a bit differently the first time, but the aim is still the same: to calculate the number of points inside a number of polygons at a given time, and to use this count in a thematic map. Basically, what you see in the visualization below.
The visualization uses the lightning data that I have also used in earlier articles. Instead of showing all the actual lightning strikes, I use hexagons to count the number of lightning strikes inside each of them at a given time, and a thematic map to show this number. @Nick Hall, here's how I did it.
Let's start with what you need to make this work.
First, you need a dataset of points holding some sort of date/time information. It can also be polygon data or line data; the trick is that we will use the Within operator to see whether the data is inside a polygon in the polygon dataset.
Secondly, you need a polygon dataset. These are the polygons in which you want to count the number of occurrences. That can be administrative boundaries, postal areas, or, as in my example, hexagons. It can also be rectangular vector grids. I'd recommend using simple polygons with few nodes as that will make the spatial comparison faster.
Here you can see the input data that I am using: the point table with the lightning strikes, including the datetime column, observed, that we will use in the time series, and the polygon table, H3_Level_3, containing the hexagons. The polygon table also holds an additional column, but we will not use this column.
Creating a Time Series for the Point Layer
The first step is to create a Time Series using the point data.
From the Add To Map dropdown on the Map tab, select Time Series.
In the Layer Time Options dialog, I configure the Time Series to use the lightning table and the column holding the observation time. I also limit the extent to a single day, June 18th, rather than the entire month of June 2002, and finally, I specify a period of 1-hour intervals.
These settings result in the map you can see below. It could seem that no lightning strikes happened during this first hour of June 18th, but a closer look reveals a handful of lightning strikes near Rotterdam in the Netherlands.
Switching to the late afternoon, we can see a lot of lightning strikes happening over Denmark.
That completes the first task: we have created our initial map showing the lightning strikes as they appear on June 18th, 2002. In the next step, we will calculate the number of lightning strikes inside each of the hexagons.
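Behind the scenes, each one-hour period corresponds to a filter on the datetime column. As a sketch, based on the Filter syntax MapInfo Pro writes to a workspace, the first hour of June 18th, 2002 would look like this:

```
' A sketch of the filter behind the first one-hour period of June 18th, 2002.
' NumberToDateTime converts a number formatted as YYYYMMDDhhmmssfff.
Filter Where observed >= NumberToDateTime(20020618000000000)
    And observed < NumberToDateTime(20020618010000000)
```

This is exactly the kind of condition the Time Series updates for you every time you move to a new period.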
Counting the Points inside the Polygons
This part is a bit tricky. And it's tricky in two ways.
We will be using the Add Column statement to do the calculation. This statement can be executed from the Update Column dialog when using a join between two tables. But now the tricky part comes into play, which prevents us from using the Update Column dialog.
As I said there are two tricky things: First, we are joining our polygon dataset with the filtered layer for the points dataset. In this way, we will count the number of points for a specific time period. Secondly, we will use a dynamic approach for calculating the number of points. This will automatically recalculate the counts as the time series changes and so always reflect the number of points for the current time period.
Let's break this into a few pieces.
First, we need to find the name of the filtered table for the points layer. We can use MapBasic to get to this table name:
LayerInfo(FrontWindow(), 2, LAYER_INFO_FILTER_TABLE_ALIAS)
FrontWindow() tells MapInfo Pro to query the active window, 2 specifies to use the second layer from the top (ignoring the cosmetic layer), and LAYER_INFO_FILTER_TABLE_ALIAS asks for the name of the filtered table for this layer.
That's the table we want to count the number of records from.
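If you want to check which table the expression actually returns, you can, for instance, run it from the MapBasic window. This is a sketch that assumes your point layer is the second layer in the active map:

```
' Print the name of the filtered table behind layer 2 of the active
' map window to the Message window.
' 89 is the numeric value of LAYER_INFO_FILTER_TABLE_ALIAS.
Print LayerInfo(FrontWindow(), 2, 89)
```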
Here is the statement we will use to create a temporary column in the hexagon layer holding the number of lightning strikes at the current moment from the point layer:
Add Column "H3_Level_3" (CountLightnings Integer)
From LayerInfo(FrontWindow(), 2, LAYER_INFO_FILTER_TABLE_ALIAS)
Set To Count(*)
Where within Dynamic
A couple of comments on the statement:
- "H3_Level_3" is the name of the polygon table.
- CountLightnings is the name of the temporary column added to the polygon table. You can use any column name as long as it doesn't already exist and it meets the requirements of a column name (such as no spaces and not starting with a number).
- 2 is the number of the layer in the active map window.
- You can use the number 89 instead of LAYER_INFO_FILTER_TABLE_ALIAS.
- Make sure that you add the Dynamic keyword to the statement. If not, the count will not refresh as the Time Series changes.
I have used the SQL Window to run the statement. To see if it works, I have turned on labels for the hexagon layer using the CountLightnings column. Notice how many of the hexagons show "0". Those are the locations where no lightning strikes happened at this point in time.
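As a further sanity check, you could sum the temporary column from the SQL Window; this is a sketch, and the total should match the number of lightning strikes visible for the current period:

```
' Hypothetical sanity check: add up the counts across all hexagons
Select Sum(CountLightnings) From H3_Level_3
Browse * From Selection
```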
Let's switch to a different point in time to ensure the counts are dynamically refreshed. From the Time tab, I select a different period in the Period group. I can also use the Time Slider to switch to a different point in time. As you can see, the labels for the hexagon layer reflect the new counts.
Finally, I can add a theme to my map showing the number of lightning strikes inside the hexagons. I select Add Theme from the Map tab, select a Ranged Region template and specify the polygon table and the temporary column holding the counts. In this dialog, I also check the option to ignore Zeros as I don't want to include the hexagons without lightning strikes.
I specify some estimated ranges using a custom range. You will need to use a custom range if you want to be able to compare the colors across time.
I also embed the legend onto my map, make the thematic layer partly transparent and turn off the visibility of the points (the lightning strikes). Here's the final map.
Now that wasn't that hard, was it?
The Fine Print on the Contract
As we are pushing the limits of MapInfo Pro a bit to make this map, it comes with a couple of drawbacks too.
You will run into these when you save your fine map to a workspace and try to reopen it: MapInfo Pro will save the workspace, but you will get the error below when you try to open it again.
The problem you run into is due to the order in which MapInfo Pro saves statements to a workspace. In our case, MapInfo Pro saves the Add Column statement at the top of the workspace, before any map windows have been created. This causes the statement to fail, as it refers to the point layer in the map window.
To modify the workspace, you will need to reuse elements from the workspace but add these as new statements after the Map From statement that creates the map window, so that the map and its layers exist when the Add Column statement runs. Below you can see the existing workspace where I have highlighted some of the statements that you can use to create the new statements.
The statements that you will have to add will look like this:
Filter Where observed>=NumberToDateTime(20020618230000000) and observed<NumberToDateTime(20020619000000000)
Add Column "H3_Level_3" (CountLightnings Integer) From LayerInfo(FrontWindow(), 2, 89)
Set To Count(*) Where within Dynamic
And also remember to remove the original Add Column statement that MapInfo Pro saved near the top of the workspace.
Here you can see the modified workspace that now works.
Keep in mind that you can't let MapInfo Pro overwrite this workspace. If you do, you will have to make the changes again. I would therefore recommend that you make the workspace file read-only to prevent you from overwriting the file by mistake.
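On Windows, you can set the read-only flag either from the file's Properties dialog or from a command prompt; the workspace name here is just a placeholder:

```
attrib +R "LightningCounts.wor"
```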
Another point worth mentioning is that the more points and polygons you have in your two tables, the slower the Add Column statement will run. If there is a way to limit the number of points, it is worth doing. In my example, I could have cut down the input point dataset to only hold points from the specific day in question instead of points for the entire month. The number of points also affects the performance of the Time Series in general.
And similarly, there is no reason to have polygons outside of your area of interest.
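As a sketch of that idea, you could extract the single day into its own table before building the Time Series. The point table name and the output path are assumptions; only the observed column and the NumberToDateTime values match my example:

```
' Hypothetical: copy only the points from June 18th, 2002 into a
' smaller table and save it, so the Time Series has less data to filter
Select * From LIGHTNING
    Where observed >= NumberToDateTime(20020618000000000)
        And observed < NumberToDateTime(20020619000000000)
    Into Lightning_20020618
Commit Table Lightning_20020618 As "C:\Data\Lightning_20020618.TAB"
```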
If you have any questions, feel free to post them in the comments section below.
Peter Horsbøll Møller
Principal Presales Consultant | Distinguished Engineer
Precisely | Trust in Data