Ryan
The OGR options are at the end of the Open Table > Files of Type list.
Selecting that option opens up a raft of other file types that you can open.
I have converted national datasets to TAB format this way; GeoPackage performance is very poor otherwise.
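For anyone who prefers to script that conversion outside MapInfo, here is a minimal sketch using GDAL/OGR's Python bindings (the same GDAL/OGR machinery the OGR option is named after). The file and layer names below are placeholders, not actual OS layer names:

from osgeo import gdal

gdal.UseExceptions()

# Translate one GeoPackage layer straight to a MapInfo TAB file.
# "MapInfo File" is GDAL's driver name for TAB output.
# Equivalent CLI: ogr2ogr -f "MapInfo File" roads.tab os_download.gpkg road_link
gdal.VectorTranslate(
    "roads.tab",           # destination TAB (placeholder name)
    "os_download.gpkg",    # source GeoPackage (placeholder name)
    format="MapInfo File",
    layers=["road_link"],  # placeholder layer name
)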
------------------------------
John Ievers
CDR Group Limited
Hope Valley, United Kingdom
------------------------------
Original Message:
Sent: 03-25-2024 13:32
From: John Ievers
Subject: Ordnance Survey Data - Geopackage Problems
Hi Ryan
Ordnance Survey do pack a lot into the gpkg files. Which datasets are you using? The new NGD data contains a lot of interesting material and is certainly more attribute-"heavy" than geometry-"heavy".
I don't use the MapInfo gpkg option to open these files any more - I use the OGR option, as it performs better and has more options for saving as a MapInfo table.
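If you want to see what a gpkg actually contains before committing to opening it in MapInfo, a quick sketch with GDAL's Python bindings will list the layers and their row counts (the file name below is a placeholder):

from osgeo import ogr

ds = ogr.Open("os_ngd_download.gpkg")  # placeholder file name
for i in range(ds.GetLayerCount()):
    layer = ds.GetLayer(i)
    # Feature counts give a feel for which layers are the heavy ones.
    print(layer.GetName(), layer.GetFeatureCount())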
If that helps....
------------------------------
John Ievers
CDR Group Limited
Hope Valley, United Kingdom
------------------------------
Original Message:
Sent: 03-14-2024 06:32
From: Ryan Cook
Subject: Ordnance Survey Data - Geopackage Problems
Hi,
We've recently got access to the osdatahub premium packages where you can download data in gpkg format. Our trouble is, the data is just so large we can't seem to work with it efficiently. Even when we choose a small area of interest, rather than the whole of GB (which is really what we need), I am encountering practically unmanageable file sizes.
- It is incredibly slow to run SQL/MapBasic queries/programs against the TAB linked to the gpkg (i.e., one has simply 'opened' the gpkg in MapInfo). I have left a simple spatial query running for two days with no result. Navigating the map layer is also painfully slow.
- 'Converting' the file to native TAB (i.e., one opens the gpkg and then delinks it by Saving Copy As) seems to speed up navigation and the queries/programs run against it, but I'm ending up with 20 GB DAT files for just a county of data.
Is anybody else having difficulty with OS data?
Is there something I should be doing when handling gpkg files?
Are there any tricks to getting tables into more manageable sizes?
On that last point, even stripping the fields down takes an absolute age. In the end, it feels like something is off; we are heavy users of MapInfo and nothing on our drives comes anywhere near the 20 GB file sizes or the slow running we are experiencing with the OS data. One of our client countries is Australia, and we work effortlessly with the entire Australian road network, which comes in at under 1 GB. That a county's worth of land sites in the UK is 20 GB is peculiar. I appreciate that file sizes are not necessarily related to geographic area, and of course the number of records and object complexity come into it, but in the end what is the point of data tables that are too big to use?
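For illustration, the kind of reduction I am after would look something like this outside MapInfo, using GDAL's Python bindings (the paths, layer name, field names and bounding box below are all placeholders):

from osgeo import gdal

gdal.UseExceptions()

# Keep only two fields and clip to a bounding box while converting to TAB.
# Equivalent CLI: ogr2ogr -f "MapInfo File" -select name,theme
#   -spat 430000 110000 460000 140000 sites.tab os_download.gpkg land_site
gdal.VectorTranslate(
    "sites.tab",
    "os_download.gpkg",
    format="MapInfo File",
    layers=["land_site"],                         # placeholder layer name
    selectFields=["name", "theme"],               # placeholder field names
    spatFilter=(430000, 110000, 460000, 140000),  # placeholder BNG bounding box
)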
Any advice would be appreciated.
------------------------------
Ryan Cook
ORH LTD
------------------------------