QGIS 2.8.1 – Useful Functions and Operators – Field Calculator

Calculate eastings and northings of the centroid within a polygon layer

Calculate the area and perimeter of a polygon layer
eg $area and $perimeter

Calculate eastings and northings of a point layer
eg $x and $y

Calculate the length of a line layer
eg $length

Capitalise column values
eg upper(Sitename)
Linlithgow becomes LINLITHGOW

Camel case column values
eg title(Sitename)
LINLITHGOW becomes Linlithgow
LINLITHGOW LOCH becomes Linlithgow Loch

Lower case column values
eg lower(Sitename)

Replace this with that in a string
replace(string, replacethis, withthat)

Concatenate string a and string b
eg a || b

Division and Multiplication

"area" / 10000 – divides the area field by 10,000 (eg going from m2 to hectares; note the expression itself cannot contain commas within numbers)

Remove decimals from a field
eg floor("area") – 7954.235 becomes 7954 and 456525.325 becomes 456525
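For readers who want to sanity-check the geometry figures outside QGIS, the area and centroid calculations the Field Calculator performs can be sketched in plain Python (the triangle coordinates below are invented for illustration):

```python
# Sketch: area and centroid of a closed polygon ring, mirroring what the
# Field Calculator derives from a polygon layer. Coordinates are invented.

def ring_area(coords):
    """Signed shoelace area of a closed ring (first point == last point)."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(coords, coords[1:]):
        s += x1 * y2 - x2 * y1
    return s / 2.0

def ring_centroid(coords):
    """Centroid (easting, northing) of a closed ring."""
    a = ring_area(coords)
    cx = cy = 0.0
    for (x1, y1), (x2, y2) in zip(coords, coords[1:]):
        cross = x1 * y2 - x2 * y1
        cx += (x1 + x2) * cross
        cy += (y1 + y2) * cross
    return cx / (6.0 * a), cy / (6.0 * a)

triangle = [(0, 0), (10, 0), (0, 6), (0, 0)]
print(abs(ring_area(triangle)))   # 30.0 square units
print(ring_centroid(triangle))    # approximately (3.33, 2.0)
```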

Functions and Operators Official Notes for Field Calculator

Posted in All, Digital Mapping, QGIS 2.8.1, QGIS General | Comments Off on QGIS 2.8.1 – Useful Functions and Operators – Field Calculator

QGIS 2.8.1 Digitising against OS Mastermap or Polygon layer using the Autotrace Plugin

The following is a set of basic instructions on how to digitise against an ordnance survey basemap using a plugin called Autotrace.

Plugins are small programs that can be loaded into your QGIS application that add functionality.

Four basic things need to be set up before starting digitising:
1) The OS MasterMap basemap and the layer you are going to digitise into both need to be loaded into the project.
2) Both your layer and the MasterMap need to be selected.
3) The Digitising toolbar needs to be visible in a location you can easily get to, with the Autotrace button available on it.
4) Snapping options need to be set.

Adding the Autotrace plugin (only required once on each QGIS install). Plugins only need to be added once – after that, any new project you start will include all the plugins you have added.

Menu navigation to bring up the load plugin dialog

The plugin dialog box

Setting snapping options (required once per new QGIS project). The snapping options dialog box can be accessed via the menus – Settings / Snapping Options

Snapping options dialog

Snapping mode should be set to All Layers, Snap to should be set to Vertex and Segment, and Tolerance should be set to a reasonable figure (10 pixels works well for me).

You also want to enable topological editing and snapping to intersections.

The key principle with Autotrace is that while you are holding down the SHIFT key, QGIS will try to trace the lines underneath the cursor.
You should now be ready to start digitising. It should be noted that there is a certain knack to digitising well – it is essentially easy, but you do need a bit of instruction. The knack comes from knowing when to use Autotrace and when not to: sometimes it is good to use the SHIFT key and other times you want to break from its use.
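The snapping behaviour that Autotrace relies on can be sketched in plain Python – pick the nearest existing vertex within the tolerance, otherwise keep the clicked point as-is (the coordinates and tolerance below are illustrative, not from any real layer):

```python
import math

def snap(point, vertices, tolerance=10.0):
    """Return the nearest vertex within `tolerance` of `point`,
    or `point` itself if none is close enough (no snap)."""
    best, best_dist = None, tolerance
    for v in vertices:
        d = math.hypot(v[0] - point[0], v[1] - point[1])
        if d <= best_dist:
            best, best_dist = v, d
    return best if best is not None else point

nodes = [(100, 100), (200, 100), (200, 200)]
print(snap((104, 103), nodes))  # within tolerance: snaps to (100, 100)
print(snap((150, 150), nodes))  # too far from any node: stays (150, 150)
```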

Starting digitising
Ensure that the layer you want to use is highlighted and then press the toggle editing button.
Now hit the auto-trace button
The cursor should now change to a small purple circle and cross (not picked up in the screenshot below). Move the cursor to a location NOT on a junction between the lines of polygons. As you bring the cross hair close you will see a black cross appear on the line – this indicates that you have come within 10 pixels (or whatever your snap distance is set to) of a node.

Starting the polygon screenshot showing the black cross

Pressing the left mouse button will start your digitising. Now you are going to hold down the SHIFT key to start tracing along the polygon. This is where you need a bit of judgement. As you move the cursor along the lines while holding the SHIFT key down, the polygon line will be traced. Be careful: on occasion you will see the trace fail and incorrectly trace the wrong nodes. In such a case release the SHIFT key to get back on track.

Example of the Autotrace picking up the wrong line

When the above happens, release the SHIFT key and select a section of polygon not far from the last point. Click to snap and continue – you may be able to start using the SHIFT key again to trace.

Releasing the shift key stops Autotrace from incorrectly picking the line.

When you are happy with the boundary of the polygon right click and the attribute fields will appear.

For polygons that are separate you can force AutoTrace to switch the direction of trace by pressing the CTRL key AS WELL as the SHIFT key.

If while digitising you wish to move the window across the basemap – hold the SPACE bar down and moving the mouse will shift the window according to your mouse movements.

The above technique works for digitising against other polygons within the same layer as well.

Posted in All, GIS, QGIS 2.8.1, QGIS General | Comments Off on QGIS 2.8.1 Digitising against OS Mastermap or Polygon layer using the Autotrace Plugin

QGIS 2.8.1 Getting Shape Files into SQL Server 2008 Express R2

For digital mapping the SHP extension is the equivalent of CSV files – a significant amount of information is still held in shape files, and even if it is not, nearly every GIS package can export to shape format. It’s therefore pretty vital that you understand how to get this format into your back ends.

It turns out QGIS 2.8.1 comes with a very handy executable called ogr2ogr.exe.
On my computer this was downloaded with my installation of QGIS 2.8.1 and placed in the following directory

C:\Program Files\QGIS Wien\bin

It looks like this executable has been part of the download for some time, so if you are on a different version of QGIS I suspect the file will be on your machine, but in a directory other than QGIS Wien – whatever your version is.

If in doubt a simple search on ogr2ogr should give you the location.

From the command prompt you need to either navigate to the location of ogr2ogr.exe or place the full path into the instruction. In the following I have navigated to the appropriate directory using the change directory command at the prompt. I then input the following.

ogr2ogr -overwrite -f MSSQLSpatial "MSSQL:server=MARK-LENOVO\SQLEXPRESS;database=WLGeodatabase;trusted_connection=yes" "C:\Users\Mark\Documents\shp\polygon\n_america.shp"

On return it will start to import the information, creating a new table in your SQL Server instance within the database listed in your parameter string. It looks like it just names the table the same as the shape file; I suspect that if that name already exists as a table name in SQL Server, that table will be overwritten with the new shape file. Also note that the import process can take a fair bit of time for large files, so be patient. I tested it initially with a small import, which it did relatively quickly; I then hit it with 500 thousand records and it took a little over 2 hours. Still, good to know that it can cope.
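If you ever want to script the import, the same ogr2ogr call can be assembled and launched from Python – a sketch using the example server, database and path from above (swap in your own):

```python
import subprocess

server = r"MARK-LENOVO\SQLEXPRESS"      # example instance from the post
database = "WLGeodatabase"              # example database
shapefile = r"C:\Users\Mark\Documents\shp\polygon\n_america.shp"

cmd = [
    "ogr2ogr",
    "-overwrite",
    "-f", "MSSQLSpatial",
    f"MSSQL:server={server};database={database};trusted_connection=yes",
    shapefile,
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the import
```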

Once you have imported the information into SQL you should perform some form of spatial indexing on the table.
I have noted that layers with spatial indexing are drawn differently from non-spatially-indexed layers. Layers with spatial indexes draw in rapidly all over the district, much like spray from a can. Non-indexed layers appear on screen more slowly, as if painted from one side to the other.

Posted in All, Geographical Information Systems, QGIS 2.8.1, SQL 08 R2, SQL Server, Win 8.1 | Comments Off on QGIS 2.8.1 Getting Shape Files into SQL Server 2008 Express R2

QGIS 2.8.1 Create 20m Buffer around a Polygon and Identify Neighbouring Land

Process:
Open the project that contains the two layers you are interested in (in this example planning applications and OS MasterMap)
Use the cursor to select the layer that contains the polygon you wish to buffer
Hit the select button and use the mouse to select the polygon within that layer that is to be buffered
Then go to Vector / Geoprocessing Tools / Buffer
Ensure that you have the correct input vector layer
Ensure you have "Use only selected features" ticked
Ensure that you have the buffer distance set at 20m
Give it a file to save to and then hit OK
The buffer polygon should be added to the project – this then needs to be selected in the same way the planning application was, then the spatial query button hit (which in these screenshots is on the left towards the centre of the graphic)
Now the important part: the layer that you wish to identify the polygons from goes in the top drop down, and the reference polygon (in this case the new buffer polygon) goes in the second drop down.
Please note "Where the feature" should read INTERSECTS
Hit apply and QGIS should identify the selection, add the selection and then close.
Alter the style appropriately to make it look good.
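The selection logic can be sketched in plain Python by treating "intersects the 20m buffer" as "lies within 20m of the application boundary" – a simplification using invented parcel points rather than real MasterMap polygons:

```python
import math

def seg_dist(p, a, b):
    """Distance from point p to the line segment a-b."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def within_buffer(point, ring, distance):
    """True if the point lies within `distance` of the ring's boundary."""
    return any(seg_dist(point, a, b) <= distance
               for a, b in zip(ring, ring[1:]))

site = [(0, 0), (100, 0), (100, 100), (0, 100), (0, 0)]  # application boundary
neighbours = {"parcel_a": (110, 50), "parcel_b": (150, 50)}  # invented parcels
hits = [k for k, p in neighbours.items() if within_buffer(p, site, 20)]
print(hits)  # ['parcel_a']
```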

Opening Project

Open the dialog to start the buffering having selected target polygon

Buffer will be added to the project (below in blue with additional layer top left)

Next save the project – the next step regularly crashed for me.
Next start Spatial Query – the button with the red line half way up the middle on the left. Ensure three main things: 1) that the polygon file you want analysed is in the top drop down, 2) that the buffer polygon is in the second drop down, and 3) that, with the buffer selected, the selection item is highlighted. You can then hit Apply to start the analysis. The progress bar will move across the screen (this was the part that regularly crashed for me). If it crashes, re-open the project and go again. It is best not to touch your computer while this analysis is being undertaken.

On completion of the above – the spatial query dialog box will increase in fields and show the result of your analysis.
Press the button with the up arrow and after a short delay an additional layer will be added to the layer list (top left). In our example 54 polygons were identified as intersecting a polygon 20m from the boundary of our original application.

Here’s the finished project tidied up with the selected features added. This nicely shows the original boundary (red), the 20m buffer (light blue) and the selected OS MasterMap polygons in purple. Note that in this case pavements are quite linear, hence they form a somewhat web-like pattern all the way off screen.

Posted in All, GIS, QGIS 2.8.1 | Comments Off on QGIS 2.8.1 Create 20m Buffer around a Polygon and Identify Neighbouring Land

QGIS 2.8.1 Raster / Extraction / Clipper – Extent Quick Reference

UK National Grid – Raster Extraction Conversion guide landscape paper standard scales

This assumes that you are taking the map to the very edge of a landscape sheet; if you require a border you will need to adjust the formula. Reverse the calculations for portrait paper.
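The conversion itself is simple arithmetic – the ground extent equals the paper size multiplied by the scale denominator. A sketch using standard A-series landscape sheet sizes (my assumption; the original reference table is not reproduced here):

```python
# Ground extent covered when a landscape sheet is filled edge to edge.
# Paper sizes in metres (width, height); standard A-series dimensions.
PAPER = {"A4": (0.297, 0.210), "A3": (0.420, 0.297)}

def ground_extent(paper, scale):
    """Return (easting span, northing span) in metres for a landscape sheet."""
    w, h = PAPER[paper]
    return w * scale, h * scale

print(ground_extent("A4", 10000))  # roughly 2970m x 2100m at 1:10,000
```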

Posted in All, Geographical Information Systems, QGIS 2.8.1, QGIS General | Comments Off on QGIS 2.8.1 Raster / Extraction / Clipper – Extent Quick Reference

QGIS 2.8.1 Creating a Junction table between Polygons and Addresses (National Street Gazetteer)

Let’s imagine that we have two geometric recordsets in a database. These sets need to be related, but we are struggling to relate them using any form of composite index (eg postal address fields in both recordsets). Both recordsets do however have geometry AND we know three things: 1) the geometries in principle overlap, 2) one geometry consists of defined polygons and 3) the other geometry consists of defined points.

In the following example I have West Lothian’s section of the National Street Gazetteer, which I am trying to relate to polygons to create a junction table showing the PKIDs of the polygons related to the UPRNs of the National Street Gazetteer.

Please note all your systems should be using UPRNs to give anything with an address a geometry!!

So why might you want to do this?
This might not look like it, but this is an exercise in back-filling a table in an enterprise application.
If you have a table in a system that you know has polygon geometry and you wish to relate that table to addresses, you will need to do a spatial filter. Addresses still tend to be messy, and using UPRNs removes a significant amount of the mess that comes with matching addresses. If your polygon file is unique, this junction table will be completely unique to your operation, so if possible you want to store it in your database. Hence the junction table relating the PKIDs of the polygons to the UPRNs. By performing Join attributes by location you are able to automatically create a junction table that adds the PKID of the parent polygon as an extra column next to the UPRNs.
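The spatial join that builds the junction table boils down to a point-in-polygon test. A sketch using ray casting, with invented PKIDs, UPRNs and coordinates:

```python
def point_in_ring(point, ring):
    """Ray-casting point-in-polygon test; ring is closed (first == last)."""
    x, y = point
    inside = False
    for (x1, y1), (x2, y2) in zip(ring, ring[1:]):
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

polygons = {1: [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]}   # PKID -> ring
addresses = {9001: (5, 5), 9002: (20, 20)}                     # UPRN -> point

# Junction table: one (PKID, UPRN) row per address falling inside a polygon.
junction = [(pkid, uprn)
            for pkid, ring in polygons.items()
            for uprn, pt in addresses.items()
            if point_in_ring(pt, ring)]
print(junction)  # [(1, 9001)]
```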

Working with larger recordsets – something to look out for.
Once you have done the join you may want to strip details out of the attribute table (you only really need the PKID field and the UPRN field). My computer (and possibly yours) had difficulty with this in QGIS – it was locking up during deletion and save. I resorted to asking it to delete the fields and then going away for lunch, but another way might be to strip any surplus columns out of the National Street Gazetteer text file BEFORE import into your QGIS project. From my experience of editing 100k records, deletion of columns took about 15 minutes. Fine if you are prepared for it, but something to watch out for.

The opening project

Menu selection

This dialogue does not relate to what layer you have highlighted so can be started whenever you are in the project. Ensure the file you want to receive the attribute information is placed at the top (the red arrow).

The Junction file which has PKID and UPRN in the attribute table

Posted in All, Data Cleansing, Dev Ops, Digital Mapping, GIS, QGIS 2.8.1 | Comments Off on QGIS 2.8.1 Creating a Junction table between Polygons and Addresses (National Street Gazetteer)

QGIS 2.8.1 Joining a text file to a Polygon file based on an attribute

OK, so not the most groundbreaking task in the world, but something that most users of digital mapping systems will regularly need to do. Here are my notes.

Process Flow
1. Identify the text file you are looking to join and note down the field name that you will be using as your join.
2. Identify the polygon file you are looking to join the text file to and note down the name of the matching field to be used in the join.
3. Import the text file using the comma icon to the left
4. Select the layer to which you wish to join your new text file, right click and select Properties – within the left panel select Joins and then hit the plus button at the bottom
5. Identify the table to be joined to your target layer and ensure that the correct fields are matched.
6. Apply the join

More detailed description below

3. Importing the text file via the create layer windows dialogue

Alter the Layer Name – the default will be the filename, but it is better to give a short and to the point description
Indicate in the section called Geometry definition whether or not the file has geometry
Hit OK and the file will be imported into your project.

4 and 5 – Selecting the target layer and matching files and fields to join.
You should have identified in stage 1 which fields you are going to join, but if you haven’t, view the attributes of both the imported text and the polygon layer and identify the fields that you will be joining. It’s interesting – I liked the way this was done in ArcView 3.3, as you would naturally open both tables, highlight the fields and then press join. A very intuitive way. Don’t get me wrong, it’s easy peasy in QGIS, but not quite so intuitive.

Highlight the layer you are interested in and right click to get the menu, then select Properties. The Layer Properties window dialogue should appear similar to this:

First off, make sure Joins is highlighted in the left pane – then hit the green PLUS button at the bottom left of the dialogue, and when the Add vector join dialogue appears, select the drop down and choose the text (or polygon) file you are interested in.


Remembering the fields you are interested in, you can match them for the join – arrows 1 and 2 below. Hit OK, then Apply, and you should be left with your finished join.

6. Join Applied
Here’s a picture showing my join with the polygon attributes to the left and the text field attributes to the right

Practice and repeat until you are familiar. This is very much a basic skill in QGIS and may form the starting procedure in many projects.
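Under the hood an attribute join like this is just a keyed lookup. A sketch in Python with invented field names and values:

```python
import csv
import io

# Text file to be joined (the CSV imported at step 3); data invented.
text_data = "SiteCode,Owner\nA1,Council\nB2,Private\n"
lookup = {row["SiteCode"]: row for row in csv.DictReader(io.StringIO(text_data))}

# Polygon layer attribute table (invented), keyed on the matching field.
polygons = [{"PKID": 1, "SiteCode": "A1"}, {"PKID": 2, "SiteCode": "C3"}]

# Left join: keep every polygon, attach text columns where the key matches.
joined = [{**p, **lookup.get(p["SiteCode"], {})} for p in polygons]
print(joined)
# PKID 1 gains the Owner column; PKID 2 has no match so gains nothing
```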

Posted in All, Digital Mapping, GIS, QGIS 2.8.1, QGIS General | Comments Off on QGIS 2.8.1 Joining a text file to a Polygon file based on an attribute

QGIS 2.8.1 now with ability to create SQL Spatial Indices

A work colleague noticed that it is now possible to create spatial indices on SQL tables through QGIS 2.8.1. Useful if you need to speed up a layer but have no access to the SQL Server instance. From approximate testing it would appear that if an index already exists then spatial index creation will fail.

This can still be done through SQL Server Management Studio if need be – see post here



Posted in All, Digital Mapping, Geographical Information Systems, GIS, QGIS 2.8.1, QGIS General, SQL Server | Comments Off on QGIS 2.8.1 now with ability to create SQL Spatial Indices

Function to allow Parsing of Combined Fields

Here’s a nice function that will allow you to split up any combined field based on a given delimiter.
As an example, here is a string that we might wish to investigate.

The string
Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom

Public Function ParseDelimString(StrField As String, NumPos As Long, Optional StrDeliminator As String = ",") As String

Dim StrColumn() As String

StrColumn = Split(StrField, StrDeliminator)
If UBound(StrColumn) < NumPos Then
    ParseDelimString = ""
Else
    ParseDelimString = StrColumn(NumPos)
End If
End Function

And examples of the results from entering differing parameters
StrField : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom
Function : ParseDelimString([Description],0,'/')
Result : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge

StrField : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom
Function : ParseDelimString([Description],1,'/')
Result : West Lothian , United Kingdom

If you do not specify the optional delimiter it will assume that it is a comma.
StrField : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom
Function : ParseDelimString([Description],1)
Result : Longridge / West Lothian

Setting the number position to 0 gives you everything before the first instance of the delimiter.
StrField : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom
Function : ParseDelimString([Description],0)
Result : Site (51 Houses) (Carvill) to the west of Curling Pond Lane
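For comparison, here is an equivalent of ParseDelimString in Python (same example string; it behaves like the VBA version, except that it also strips surrounding spaces from the returned piece):

```python
def parse_delim_string(value, pos, delimiter=","):
    """Return the pos-th piece of value split on delimiter, or '' if pos is out of range."""
    parts = value.split(delimiter)
    return parts[pos].strip() if pos < len(parts) else ""

desc = ("Site (51 Houses) (Carvill) to the west of Curling Pond Lane, "
        "Longridge / West Lothian , United Kingdom")
print(parse_delim_string(desc, 0, "/"))  # everything before the slash
print(parse_delim_string(desc, 1))       # Longridge / West Lothian
print(parse_delim_string(desc, 9))       # '' – position out of range
```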

Posted in All, Data Cleansing, Database Design, VBA Code MS Access | Comments Off on Function to allow Parsing of Combined Fields

Using SQL to parse, clean and format strings

Many datasets can be somewhat confused by the time you get them. Maybe you had no control of the export from the database, or maybe you asked for the right information and it came back somewhat warped.

SQL has powerful functions that can pretty much clean things up however you would like.

We can use multiple SQL commands within an MS Access module to clean up a source by placing them consecutively within a module. Here is the structure of some of the queries that I use.

SQL that updates the Town and PostalTown fields based on a string in an aggregated PostalAdd field.
Please note that the _ sign denotes a continuation onto another line within the VB module, required to make the SQL string run correctly. This may require alteration if you are cutting and pasting from this page.

Dim SQL As String
SQL = "UPDATE Table01 SET Table01.Town = 'Barassie', Table01.PostalTown = 'TROON' " & _
"WHERE (((Table01.PostalAdd) Like '*Barassie, TROON*'));"

The following SQL converts the street field of Table01 to all capitals. This could be run like the previous SQL from within an MS Access module

Dim SQL1 As String
SQL1 = "UPDATE Table01 SET Table01.Street = StrConv([Table01].[Street],1);"

In the above code, change the trailing number parameter to select the type of alteration:
1 – all upper case
2 – all lower case
3 – proper (camel) case

This looks to the PostalAddress field of Table01 and searches from the left for a comma and returns everything to the left into a field called LeftParse

Dim SQL2 As String
SQL2 = "SELECT Left$([PostalAddress],InStr(1,[PostalAddress],',')-1) AS LeftParse FROM Table01;"

If you have a string with commas, this pattern counts the length of the string, finds the position of your search string – in this case a comma – and then returns the characters to the right of that comma.

The code below looks at the PostalAddress field of Table01, counts its length, finds the first comma and returns everything after it as a select query result in a field named RightParse. Note that InStr searches the target field from the left, so if there are multiple commas it will stop at the first one. (The extra -1 also drops the character immediately after the comma – typically a space.) You can substitute the right part of the function with a number.

Dim SQL3 As String
SQL3 = "SELECT Right$([PostalAddress],Len([PostalAddress])-InStr(1,[PostalAddress],',')-1) AS RightParse FROM Table01;"
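The same left/right parse is compact in Python using partition, which splits on the first occurrence of the separator (the address is an invented example):

```python
postal_add = "4 High Street, Linlithgow"  # invented example address

# Everything before the first comma (the Left$/InStr pattern above).
left_parse = postal_add.partition(",")[0]

# Everything after the first comma, with the following space removed
# (the Right$/Len/InStr pattern, which drops one extra character).
right_parse = postal_add.partition(",")[2].lstrip()

print(left_parse)   # 4 High Street
print(right_parse)  # Linlithgow
```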

Posted in All, Data Cleansing, SQL MS Access, VBA Code MS Access | Comments Off on Using SQL to parse, clean and format strings

SQL Saturday #388 Edinburgh

Went along to SQL Saturday BI Edition on 13th of June.
Honestly I thought it was really great. Many thanks to Jenny Stirrup for being the event organiser. If you are thinking about going next year you really should.

This was the schedule for SQL Saturday in Edinburgh

Sessions I attended
Key Note with Jon Woodward – on long term future of computing in general
Web Scraping with Python – Iain Elder of Sky Scanner
Better Data Visualisation for CRM and ERP – Adam Vero
The European Economic Crisis and the Euro-A Data Tale – Carmel Gunn
Blue Sky Thinking : SQL Azure Geospatial Mashup – Thomas Sykes
Master Data Management – Dave Lawrence

All the talks were really thought provoking and nice to hear from people who are really at the top of the game. I have already started experimenting with Web Scraping and Python.

Posted in All, Continual Professional Development, Database Design, Dev Ops | Comments Off on SQL Saturday #388 Edinburgh

QGIS 2.8.1 Updating Copyright Label in your QGIS Projects

A specific example for QGIS 2.8.1 – probably the same in other versions.

Ensure you have copyright quote correct on QGIS projects

View / Decorations / Copyright Label

Posted in All, Configuration, QGIS 2.8.1, QGIS General | Comments Off on QGIS 2.8.1 Updating Copyright Label in your QGIS Projects

Connecting to PostgreSQL 9.3 from QGIS 2.8.1 – local host

First ensure that you have both Postgres and QGIS installed on your machine.

In order for you to be able to connect to Postgres from QGIS on localhost you must ensure two things: firstly, that the PostGIS plugin has been installed on your machine, AND secondly, that you have included the postgis extension in each database that you wish to connect to. Without enabling the extension in the database you won’t be able to connect OR import shape files. Installation of PostGIS is often a default during the install of Postgres, but you can check whether this was completed correctly by using the Application Stack Builder, a small program that is installed with later versions of Postgres.

I navigated to this on the win 8.1 machine I was using by using search.

Opening application stack builder you will be presented with the following.


Expand the spatial extensions tree to identify whether you already have the PostGIS plugin installed – if not, select the appropriate plugin and you will be prompted to install it. You will need an internet connection for this. Above you can see that my plugin was already installed.

Next you need to add the PostGIS extension to each Postgres database you wish to link to from QGIS. This is done through pgAdmin.

This is something that both myself and a colleague got caught out by and it took me an hour of searching to find how to fix it.

Below I have a database called GISData which I have just created. You will note there is only one object within the expanded extensions tree. You will not be able to connect to a database that does not include PostGIS in its list of extensions

Highlight the database you want to spatially enable then go to Tools / Query Tool (Ctrl + E will do the same). In the above picture I’ve slightly jumped the gun. To add the extension to the database type:

CREATE EXTENSION postgis;

Run the query by selecting the green right arrow.
There will be a short delay and then, upon refresh of the connection, postgis should appear in the list of extensions.


You can now close the Postgres administrator and return to QGIS where you should be able to setup the connection to the database.

Parameters should be similar to below and it is useful to test the connection prior to saving.


Posted in All, Geographical Information Systems, GIS, PostGIS, Postgres, QGIS General | Comments Off on Connecting to PostgreSQL 9.3 from QGIS 2.8.1 – local host

Optimising Performance between QGIS 2.8.1 and SQL Server Express 08R2

I have just been through trying to optimise the speed of connection between SQL Server and QGIS and I have found two ways of speeding up the connection.

1) Create a geometry_columns table for the database which speeds up reading in the tables to the selection screen
2) Create spatial indices on individual tables which vastly increases the speed of draw of layers within QGIS

1) Create geometry_columns table on the database
When connecting SQL Server to QGIS, if you don’t implement a geometry_columns table QGIS will slow down, as it scans every table within the SQL Server connection to find the geometry type (point, polygon etc) and the SRID projection.

TSQL to create the geometry_columns table

CREATE TABLE [dbo].[geometry_columns](
[f_table_catalog] [varchar](128) NOT NULL,
[f_table_schema] [varchar](128) NOT NULL,
[f_table_name] [varchar](256) NOT NULL,
[f_geometry_column] [varchar](256) NOT NULL,
[coord_dimension] [int] NOT NULL,
[srid] [int] NOT NULL,
[geometry_type] [varchar](30) NOT NULL,
CONSTRAINT [geometry_columns_pk] PRIMARY KEY CLUSTERED (
[f_table_catalog] ASC,
[f_table_schema] ASC,
[f_table_name] ASC,
[f_geometry_column] ASC
)
)

Next, for each table you will need to fill out the geometry_columns table as follows:

Here’s an example where I have a database called Corporate Address Gazeteer within which I have a table of planning applications (T001). Within that table the geometry is held in the Coordinates field
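As a sketch, the registration row for that example table could be generated like this – the database, schema, table and column names echo the example above (rendered without spaces), and the geometry type and coordinate dimension are assumptions you must check against your own data:

```python
def geometry_columns_row(catalog, schema, table, geom_col, dims, srid, geom_type):
    """Build the INSERT statement registering one table in geometry_columns."""
    return (
        "INSERT INTO geometry_columns (f_table_catalog, f_table_schema, "
        "f_table_name, f_geometry_column, coord_dimension, srid, geometry_type) "
        f"VALUES ('{catalog}', '{schema}', '{table}', '{geom_col}', "
        f"{dims}, {srid}, '{geom_type}');"
    )

sql = geometry_columns_row("CorporateAddressGazeteer", "dbo", "T001",
                           "Coordinates", 2, 27700, "POLYGON")
print(sql)
```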


All tables within the connection (probably one database) will need to be recorded in this table. Provided the ‘Only look in the geometry_columns metadata table’ option is saved with the connection, when connecting to a saved connection QGIS will only show the layers listed in the geometry_columns table. Below is the check box where QGIS is told to look only in the geometry_columns table (note the server and database names are imaginary)


2) Next speed up the draw on screen of SQL Server layers by creating Spatial Indices on each of the tables and ensuring that the database as a whole has a ‘spatial_ref_sys’ table.

TSQL to create ‘spatial_ref_sys’ table

CREATE TABLE [dbo].[spatial_ref_sys](
[srid] [int] NOT NULL,
[auth_name] [varchar](256) NULL,
[auth_srid] [int] NULL,
[srtext] [varchar](2048) NULL,
[proj4text] [varchar](2048) NULL
)

Next run the following TSQL on the ‘spatial_ref_sys’ table

INSERT into spatial_ref_sys (srid, auth_name, auth_srid, proj4text, srtext) values ( 27700, 'epsg', 27700, '+proj=tmerc +lat_0=49 +lon_0=-2 +k=0.9996012717 +x_0=400000 +y_0=-100000 +ellps=airy +datum=OSGB36 +units=m +no_defs ', 'PROJCS["OSGB 1936 / British National Grid",GEOGCS["OSGB 1936",DATUM["OSGB_1936",SPHEROID["Airy 1830",6377563.396,299.3249646,AUTHORITY["EPSG","7001"]],AUTHORITY["EPSG","6277"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4277"]],PROJECTION["Transverse_Mercator"],PARAMETER["latitude_of_origin",49],PARAMETER["central_meridian",-2],PARAMETER["scale_factor",0.9996012717],PARAMETER["false_easting",400000],PARAMETER["false_northing",-100000],UNIT["metre",1,AUTHORITY["EPSG","9001"]],AUTHORITY["EPSG","27700"]]')

This was obtained from the EPSG link below – swap out your SRID if you are not in the UK.
(Warning: just be careful with a pure cut and paste of the above. The code tag in WordPress struggles to cope with all the brackets and commas, so the formatting can be a bit off here – it is safest to cut and paste from the link below.)


The great thing about running this is that you can place the projection in the sys table and users will not then need to select a new projection every time they add a new layer – an otherwise repetitive task. For the UK we only require one record within this table, 27700 – if you have multiple projections you will need an entry for each individual projection.
I have slightly altered the reference obtained from the above link, as it had placed a 9 in front of the first SRID reference – deleting it stopped the projection appearing for each user. I have changed it in the above TSQL so it is correct, but the graphic below still shows the 9 in front of the 27700.


For each table we need to create spatial indices.

Creating a spatial index on a table.

Go to the table in question and expand the table structure so that you can see the indexes on the table. Right click on Indexes and select New Index.

You should be presented with the following screen
It should be noted that you choose Spatial for the type of index. In order to be allowed to do this the table should already have a clustered index (in the picture example this already exists), and you will need to know the geometry column of the table in question. You give the spatial index a name and attach it to the geometry field in the table – in the graphic this is Coordinates.


Next you need to give a bounding box to the spatial index. The figures below are those used for West Lothian in this case:
min x = 285380
min y = 651170

max x = 314890
max y = 680340


You can now press OK and the spatial index should be added to the table

If you have done it right when you expand the index field it should look something like this


Posted in All, Configuration, QGIS General, SQL 08 R2, SQL Server, SSMS | Comments Off on Optimising Performance between QGIS 2.8.1 and SQL Server Express 08R2

Setting up a Blank SQL Server Spatially enabled Table using Microsoft SQL Server Management Studio 2008R2 Express and displaying it in QGIS 2.8.1

Programs used;

1- SQL Server 2008R2 Express
2- SQL Server Management Studio 2008R2 Express

The example uses UK national grids coordinates to create a Triangle Polygon in a SQL Server Table

I’ve previously written that while we’ve had spatially enabled SQL Server for over 5 years, I constantly come across line-of-business applications that, although using SQL Server, have not and do not intend to spatially enable the application. This is undoubtedly because of the difficulty of re-designing legacy systems actively in use, and because the benefits, although significant, are not generally requested by all but the most knowledgeable of colleagues.

While I understand this legacy-system reasoning, spatially enabled databases are the future, so it’s just a matter of when and not if an application will require alteration. Understanding it in this context makes it really a requirement to start seriously planning for its inclusion.

Developers creating new applications should always consider spatially enabling relevant tables from the start, even if it is not specced by the client or colleague. It is so much easier to spend a couple of minutes future-proofing the schema of a newborn database than hours trying to retrofit a live, in-production back end.

Firstly it’s important to understand what a geodatabase in SQL Server actually is.
Really it is a normal database with at least one table that has a field of geometry or geography type. In this example I will use desktop QGIS 2.8.1 to display the resulting geometry, but any other digital mapping package that can link to SQL Server could be used. SQL Server also has a very rudimentary mapping display, but you will need something better if you want to manipulate boundaries visually.

Many digital mapping products have plugins that will create geodatabases and tables; however, I haven’t seen one for QGIS. I really wanted to be able to create spatial SQL tables on my own, without recourse to paid tools, directly in SQL Server Management Studio. So here’s my method of creating a blank polygon table whose geometry is ready to be read and edited in QGIS, or any other digital mapping system, using just SQL Server Management Studio Express 08R2.

1. Create a new Table
2. Ensure the table has an identity Key that increments
3. Create a geometry column
4. Write a query that updates the geometry column

UPDATE T001Sites SET Coordinates=geometry::STGeomFromText('POLYGON((301804 675762,295789 663732,309581 664870,301804 675762))',27700)

You will note that there are four coordinates here (each coordinate being a pair of numbers).
The first coordinate and the last are the same; this is required by SQL Server to ensure that the polygon is closed.
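That closure rule can be sketched in Python (a hypothetical helper, not part of SQL Server): it builds the WKT string for the UPDATE above and repeats the first coordinate at the end if it is missing.

```python
def polygon_wkt(coords):
    """Build a closed WKT POLYGON string from (easting, northing) pairs.

    SQL Server requires the ring to be closed, so the first coordinate
    is repeated at the end if it is not already.
    """
    if coords[0] != coords[-1]:
        coords = coords + [coords[0]]
    ring = ", ".join(f"{x} {y}" for x, y in coords)
    return f"POLYGON(({ring}))"

# The West Lothian triangle from the UPDATE statement above
wkt = polygon_wkt([(301804, 675762), (295789, 663732), (309581, 664870)])
print(wkt)
# POLYGON((301804 675762, 295789 663732, 309581 664870, 301804 675762))
```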

The 27700 number is the Spatial Reference System Identifier (SRID) – a unique value used to unambiguously identify a map projection. 27700 is the identifier for the United Kingdom National Grid, which is the system my example coordinates refer to. The spatial reference identification system is defined by the European Petroleum Survey Group (EPSG) standard, a set of standards developed for cartography, surveying etc. and owned by the Oil and Gas Producers Group; the registry is here: http://www.epsg-registry.org/

The above coordinates display a triangle in West Lothian, near Edinburgh.

5. Set up the connection to SQL Server Instance

Ensure the “Only look in the geometry_columns metadata table” checkbox is unchecked. By default it is checked, and if the geometry_columns table does not exist you will get an error message.


6. Display the table and edit as appropriate.

Select the table then hit the Add button


And here is the SQL Server table in QGIS ready to be added to or edited.

Posted in All, Configuration, Database Design, Dev Ops, QGIS General, SQL 08 R2, SSMS | Comments Off on Setting up a Blank SQL Server Spatially enabled Table using Microsoft SQL Server Management Studio 2008R2 Express and displaying it in QGIS 2.8.1

Spatially Enabling an existing Table – which has Eastings and Northing fields UK National Grid

The following TSQL will spatially enable a SQL Server table called T1PlanApp when it has two fields called East and North holding UK National Grid references.

UPDATE T1PlanApp SET Coordinates=geometry::Point([East],[North],27700)
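Before running that UPDATE it can be worth sanity checking the East and North values. A rough sketch in Python (the extent figures are the standard GB National Grid bounds, not from the original post):

```python
def valid_osgb(east, north):
    """Rough check that an Easting/Northing pair falls inside the
    GB National Grid extent (0-700000 E, 0-1300000 N)."""
    return 0 <= east < 700000 and 0 <= north < 1300000

# The West Lothian coordinates from the earlier polygon example pass
print(valid_osgb(301804, 675762))  # True
```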

Posted in All, Database Design, Dev Ops, QGIS General, SQL 08 R2, SQL Server, SSMS | Comments Off on Spatially Enabling an existing Table – which has Eastings and Northing fields UK National Grid

Administering a Web Application some thoughts

In the last couple of years I have started administering a bought-in web application from an outside vendor. I have found this really quite educational. I will not reveal the name of the product; the vendor seems neither better nor worse than other vendors, and many of the issues I am experiencing I see repeated by other admins of other web applications.

The software consists of various pieces of middleware linking SQL Server to the .Net and Javascript files. It uses a fairly standard Microsoft stack to deliver a single page application. The big selling point was that it was widely available through browsers pretty much anywhere on the web.

We have had it up and running in anger for a couple of years now, and I have about 75 users, all internal.

The vendor has created a new version and we now need to upgrade. In discussion, the company tells me that I should not be doing this myself and that it is standard practice to get them in to upgrade the version. We already use it more as a stack technology to build our own applications, and I would want to continue this in the future. They have indicated that it will cost money to get them in to upgrade the software. This triggers alarm bells, because the existing configuration is very complicated, errors are quite regular, and error messages tend to be pretty cryptic or non-existent. I also like doing setup myself, as it teaches me the ins and outs and speeds up my learning on systems with non-trivial configuration.
A project board has been set up, senior management are on board, and, well, it has grown legs, arms and a tail.

We have not widely publicised the application to the general public and they do not access the application at present.

The application itself requires significant amounts of configuration to set up correctly, and parameters tend to be static and buried deep in very large HTML files. All this leads me to conclude that configuration in the new version is likely to be just as awkward. The company’s help desk is very good and there is extensive documentation, but the sheer obscurity and quantity of configuration parameters make the documentation the size of a phone directory: difficult and opaque.

Here’s some questions I should remind myself of when looking at New Web Applications in the future.
Are staff really so distributed that you require wide distribution?
Is the distribution internal or external? If not external, do you really need a web enabled application?
Do you have internal expertise that is interested in developing applications?
With web applications or local applications the backend database technology could well be the same. Internal local area development is likely to be far more flexible, cheap and quick, could motivate staff, and can additionally be far richer in scope. It would be possible, for instance, to develop a rich interface internally and a minimal interface externally using something like LightSwitch. This would take advantage of the strengths of both development environments.

There is a big push towards web applications; I just think IT managers need to consider their positions carefully. These applications tend not to be cheap, and they have a number of disadvantages that seem to have been pushed out of sight.

Posted in All, Configuration, Continual Professional Development, Dev Ops, General Opinion | Comments Off on Administering a Web Application some thoughts

Database Normalisation – a review of the rules to refresh my knowledge

Here is a review of the 6 forms of normalisation (yes, the 4th is called 3.5 for some reason). There are lots of summaries of these on the web, but I wanted to re-write them in my own words to ensure I understand them fully. Revisiting them after real world experience is particularly enlightening.

Normalisation rules are cumulative, ie a database or set of tables can be said to be compliant at any level from 1 through to 5. To rank at a given level it must comply with all previous forms.

That is to say that a database (or table) to be in the 2nd Normal Form must first fulfill all the criteria of the 1st Normal Form.

A database is said to be properly normalised if the first three normalised forms are inviolate.

Violations of the forms after the third may be legitimately required, and as such those forms are NOT mandatory for a database to be classified as properly normalised.

Some web explanations do not list the forms after the 3rd. I would argue there is still a lot of value in trying to adhere to these less well known forms, but I must admit to not always including them in a design, usually due to poor knowledge of the application domain.

The differing forms of normalisation start simple and become increasingly complicated.
If you have a lot of experience in creating relational backends you may have deduced these issues for yourself.

(1NF) First Normal Form – Edgar F Codd – first stated 1970
Eliminate duplicate columns from the same table.
Create separate tables for each group of related data and identify each row with a unique column or set of columns (the primary key).

(2NF) Second Normal Form – Edgar F Codd – first stated 1971
Create relationships between these new tables and their predecessors through the use of foreign keys.

(3NF) Third Normal Form – Edgar F Codd – first stated 1971
Remove columns that are not dependent on the primary key.

My take on this is that calculated columns should NOT be included in tables.

So here, imagine you have an invoices table with the following fields
InvoiceNo (PKID)
CustomerNo (FKID)
UnitPrice
Quantity
Total

According to the first and second normal forms there are no duplicated fields, and the CustomerNo foreign key shows that the relationships are in place. However, the Total figure can be calculated by multiplying UnitPrice by Quantity, and as such storing it violates the 3rd Normal Form.

The new table would resemble the following
InvoiceNo (PKID)
CustomerNo (FKID)
UnitPrice
Quantity

Total is not stored and would be calculated on the form. It is noted that sometimes you do wish to store things like totals because, for instance, unit prices may vary over time and you don’t want to lose the overall total by updating a unit price – especially if the unit price is referenced from another table. A similar argument applies to things like VAT, which can change on a yearly basis. In such a case I would argue that you still don’t break 3NF, as the value of the column is not dependent on the primary key; it is dependent on the value of the VAT rate at the given time.
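The derive-don’t-store rule can be sketched in Python with hypothetical invoice line data (all the names here are invented for illustration):

```python
# Hypothetical invoice lines keyed by InvoiceNo: (UnitPrice, Quantity) pairs
invoice_lines = {
    1001: [(2.50, 4), (10.00, 1)],
    1002: [(5.00, 3)],
}

def invoice_total(invoice_no):
    """Derive Total at read time instead of storing it in the
    invoices table, keeping the table in 3rd Normal Form."""
    return sum(price * qty for price, qty in invoice_lines[invoice_no])

print(invoice_total(1001))  # 20.0
```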

The 3.5 Normal Form or the Boyce Codd Normal Form
This is a slightly stronger version of the third normal form developed in 1974 by Raymond F. Boyce and Edgar F. Codd to address certain types of anomaly not dealt with by the 3rd Normal form as originally defined.

Only in rare cases does a 3rd Normal Form table not meet the requirements of Boyce Codd Normal Form. A 3rd Normal Form table which does not have multiple overlapping candidate keys is guaranteed to be in Boyce Codd Normal Form.

Web investigation indicates that the most concise description is;
Every determinant must be a candidate key

and that Boyce Codd Normal Form is not always possible

Nearest Shops
Person    Shop Type     Nearest Shop
Mark      Jewellers     H Samuel
Mark      Barbers       Mr Man
Natalia   Coffee Shop   Starbucks
John      Coffee Shop   Costa
John      Barbers       Mr Man
John      Jewellers     H Samuel

The issue above is that the shop type of Mr Man on record 2 could be changed to Hairdressers while the John record near the bottom was left unchanged, leaving the table inconsistent. In this case I would split the table in two, with Nearest Shop and Shop Type linked in one table, and some kind of second table linking People to Nearest Shops.
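The split described above can be sketched in Python (the dict and set here stand in for the two tables):

```python
# The single table from above: (Person, ShopType, NearestShop) rows
rows = [
    ("Mark", "Jewellers", "H Samuel"),
    ("Mark", "Barbers", "Mr Man"),
    ("Natalia", "Coffee Shop", "Starbucks"),
    ("John", "Coffee Shop", "Costa"),
    ("John", "Barbers", "Mr Man"),
    ("John", "Jewellers", "H Samuel"),
]

# Split so each shop's type is stored exactly once...
shop_type = {shop: stype for person, stype, shop in rows}
# ...and people are linked only to shops
person_shops = {(person, shop) for person, stype, shop in rows}

# Changing Mr Man's type now updates it for every person at once,
# so the inconsistency described above cannot arise
shop_type["Mr Man"] = "Hairdressers"
```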

The 4th Normal Form
A table is in the 4th Normal Form if it has no multi-valued dependencies.
Definition: Multivalued dependencies occur when the presence of one or more rows in a table implies the presence of one or more other rows in that same table.

For example, let’s say you have a shop selling coffee – let’s call it Starbucks – and you have a table that lists the permutations of coffee sold in its stores
CoffeeShop   Coffee Product   Location
Starbucks    Café Latte       Livingston
Starbucks    Americano        Livingston
Starbucks    Café Latte       Edinburgh
Starbucks    Americano        Edinburgh

If we know that all Starbucks stores sell the same products, then when a new product appears in one shop we would expect it to appear in the others; adding a Skinny Latte to the Livingston shop implies that there should be another record for Skinny Lattes in the Edinburgh shop.

To eliminate the possibility of anomalies we must place the facts about the products and the locations of the shops into two tables

Products sold
Starbucks   Café Latte
Starbucks   Americano
Starbucks   Skinny Latte

Location of Coffee shops
Starbucks   Livingston
Starbucks   Edinburgh

Alternatively, if the coffee products did vary from location to location, the original table would satisfy 4NF.
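The decomposition can be sketched in Python: the original wide table is just the join of the two smaller tables, so adding one product row implies its presence at every location automatically.

```python
# The two 4NF tables from above, as sets of rows
products = {("Starbucks", "Café Latte"), ("Starbucks", "Americano")}
locations = {("Starbucks", "Livingston"), ("Starbucks", "Edinburgh")}

def full_table():
    """Re-derive the original wide table as the join of the two tables."""
    return {(shop, prod, loc)
            for shop, prod in products
            for shop2, loc in locations if shop == shop2}

# Adding one product row now implies its presence at every location
products.add(("Starbucks", "Skinny Latte"))
print(len(full_table()))  # 6: three products at two locations
```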

The 5th Normal Form
A table is in Fifth Normal Form, or Project-Join Normal Form, if it is in Fourth Normal Form and cannot have a lossless decomposition into any number of smaller tables.

Or as I like to call it the – I know about it and try to adhere to it but sometimes I’m a bit lazy form.

I certainly try to keep my designs to the full six normalisation principles but there are instances where certain tables do not comply.

Posted in All, Continual Professional Development, Database Design, My UI Design Patterns, Personal Development | Comments Off on Database Normalisation – a review of the rules to refresh my knowledge

AutoHotKey : Navigation between Satellite Applications to improve Work Flow

A while back I wrote a post about how allowing parameters to be passed in URLs is a big benefit, increasing the speed with which you can navigate to individual records in apparently unconnected web applications.

But what do you do if you are faced with a satellite application whose vendor has not implemented this URL friendly facility? Users are left with a very jarring break in the flow of their work when they have to leave the application they are in and navigate to another, sometimes manually linking to the other application’s record form via a search field. This searching task, when multiplied many times, can be really tedious, repetitive, demotivating and time consuming, not to mention pointless.

How can we better serve our users?

The other day I came across an open source program called AutoHotKey that allows me to improve this task.


AutoHotkey is an open source project that allows the creation and compilation of simple or complicated scripts that can be used to navigate anything on a computer – desktop OR web applications. The following is something that I worked out last week to be able to navigate a web application by triggering a script from MS Access VBA. The great thing is that you can pass parameters from a database application to a middle layer and trigger a set of commands to be run.

Let us take the example of a recent problem I faced. Many councils throughout the United Kingdom have bought an application that manages the information associated with planning applications; it consists of both desktop and web components that help manage the submission of, and decision making on, development proposals. The vendor recently “upgraded” the application, and it no longer accepts planning application numbers in its URL as a method of going straight to a record. This meant that users of one of my satellite applications were dropped into a search screen and then needed to manually type a field from one application into the search field of another – a dull and repetitive task.

There follows an overview of my solution. Firstly download the following programs:

1) AutoHotkey – the scripting language and compiler itself

2) iWB2 Learner – a small program for identifying element names and ids in Internet Explorer
iWB2 Learner
iwebbrowser2 Download

My script for Autohotkey was as follows.

FindPlanningApplication.ahk (written in plain old Notepad and saved to a known location with the suffix changed to .ahk)

URL := "https://planning.westlothian.gov.uk/online-applications/"
Application = %1% ; the planning application number passed in as the first parameter

WB := ComObjCreate("InternetExplorer.Application")
WB.Visible := True
WB.Navigate(URL)
While wb.readyState != 4 || wb.document.readyState != "complete" || wb.busy ; wait for page to open
    Sleep, 10
wb.document.getElementById("simpleSearchString").value := Application
While wb.readyState != 4 || wb.document.readyState != "complete" || wb.busy
    Sleep, 10


Using iWB2 Learner to identify the element names on the web page
This video shows iWB2 Learner being used; unfortunately it does not have any sound.

VIDEO Using iWB Learner with AutoHotKey

Next you will need to trigger the AHK file – you will need design access to the program that sends the instruction. In my MS Access application the following code triggers the script above.

Private Sub Command43GoToApplicationOnlineRecord_Click()

    Dim strPlanApp As String
    Dim strAHKname As String

    strPlanApp = "LIVE/" & Me.ApplicationNo

    strAHKname = "\\server1-cluster\ahk\FindPlanningApplication.exe"
    Call Shell(strAHKname & " " & strPlanApp, vbMaximizedFocus)

End Sub

The computer that runs the AHK script need not have AutoHotkey installed; if it doesn’t, you can compile your script into an executable that does not require installation. Here I created the executable on another computer and transferred it to the \\server1-cluster\ahk location, ready to be called by the VBA.

Consecutive parameters passed to AutoHotkey are named %1%, %2% etc. In my script I pass the planning application as %1% and rename it Application immediately.

Compiling the AHK is done by moving to a computer with AutoHotkey installed, navigating in Explorer to the file, and right clicking; Compile will be an option. Note that processor architecture matters when compiling: if your target machine is 32 bit then you need to compile on a 32 bit machine, and the same goes for 64 bit.
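For testing the hand-off outside Access, the VBA Shell call above could be sketched in Python. This is illustrative only: the launch flag is off by default because the executable only exists on the network share.

```python
import subprocess

# UNC path from the VBA example above
AHK_EXE = r"\\server1-cluster\ahk\FindPlanningApplication.exe"

def open_planning_record(application_no, launch=False):
    """Build (and optionally run) the command that passes the planning
    application number to the compiled AutoHotkey script as %1%."""
    cmd = [AHK_EXE, "LIVE/" + application_no]
    if launch:
        subprocess.Popen(cmd)  # equivalent of the VBA Shell call
    return cmd

print(open_planning_record("0123/FUL/21"))
```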

Posted in All, AutoHotKey, Productivity, Programming, VBA Code MS Access | Comments Off on AutoHotKey : Navigation between Satellite Applications to improve Work Flow

VBA function to Pivot and Concatenate Child records

In situations where a parent record has a limited number of children (0 to 10 works well), you may like to list those children next to the parent, somewhat as you would with a pivoted table. Pivoting the table would, however, result in a massive table which is extremely wide. Pivoting and concatenating the fields instead keeps the resulting list within a manageable width. Take for example the following table.


So we have a table with four header records, and the children all relate to those parents. For each parent we want to obtain a list of its children.

Firstly place the following function in a module

Public Function Conc(Fieldx, Identity, Value, Source) As Variant
    ' Returns a comma separated list of Fieldx values from Source
    ' for the rows where Identity matches Value.
    Dim cnn As ADODB.Connection
    Dim rs As ADODB.Recordset
    Dim SQL As String
    Dim vFld As Variant

    Set cnn = CurrentProject.Connection
    Set rs = New ADODB.Recordset
    vFld = Null

    SQL = "SELECT [" & Fieldx & "] as Fld" & _
          " FROM [" & Source & "]" & _
          " WHERE [" & Identity & "]=" & Value

    ' open recordset.
    rs.Open SQL, cnn, adOpenForwardOnly, adLockReadOnly

    ' concatenate the field.
    Do While Not rs.EOF
        If Not IsNull(rs!Fld) Then
            vFld = vFld & ", " & rs!Fld
        End If
        rs.MoveNext
    Loop
    rs.Close

    ' remove leading comma and space.
    If Not IsNull(vFld) Then vFld = Mid(vFld, 3)

    Set cnn = Nothing
    Set rs = Nothing

    ' return concatenated string.
    Conc = vFld
End Function

Now set up a Query and call the function in an expression


This results in the appropriate list
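The same pivot-and-concatenate idea can be sketched in Python with hypothetical parent and child data (names invented for illustration):

```python
# Hypothetical child rows: (ParentID, ChildName)
children = [(1, "Anna"), (1, "Ben"), (2, "Carl"),
            (3, "Dawn"), (3, "Eve"), (3, "Finn")]

def conc(parent_id):
    """Python equivalent of the VBA Conc function above:
    a comma separated list of one parent's children."""
    return ", ".join(name for pid, name in children if pid == parent_id)

print(conc(3))  # Dawn, Eve, Finn
```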


Posted in All, VBA Code MS Access | Comments Off on VBA function to Pivot and Concatenate Child records