I had been using the 2010 WordPress standard theme as the basis for Round Up the Usual Suspects but decided it was time to upgrade. I went for the 2016 WordPress standard theme because it is so well tested, and with a relatively large back catalogue my primary concern was porting everything forward as easily as possible. I will be working on making it as user friendly as possible.
The objective here is to write a series of queries that measure the shortest distance between selected paired locations on a network, such that the geometry of the routes can be calculated and displayed on a map.
For this particular tutorial you will need QGIS 3 or higher and a version of Postgres. I am using version 11.0 here (I have upgraded since my former posts). I believe this tutorial will work with previous versions, but if you are following along now might be a good time to upgrade.
QGIS 3.4 or higher is needed because the Ordnance Survey road network geometry contains a z coordinate, which will prevent the creation of the required geometry for measurement. QGIS 3 introduced the ability to save geometry excluding the z coordinate. If you have a network without z coordinates you should not require this.
So let us first get the data. Tick the option in the top right hand corner, scroll to the bottom and submit your request, after which you will be asked a few basic questions along with the email address you wish the download to be sent to. After a few minutes you should be sent the download link through your email; follow the instructions and you should be able to get the data.
The information you are downloading is a block framework for the whole of the UK. When you unzip the download into a folder you will see multiple files. We will be using a section of the national dataset relating to Edinburgh, NT. Choose the block or selection that you are interested in; more blocks may take more time, however.
Create a new project : eg EdinburghRouting.qgz
Load in your chosen network block : eg NT_RoadLink.shp
Select the layer you just loaded in : eg NT_RoadLink.shp
and navigate to the following in the menu settings
Layer / Save As
Fill out the Save Vector Layer as … dialog box
IMPORTANT – ensure within the Geometry section
Geometry type is set to LineString
Include z-dimension is unticked
Give the new file a name : eg ntosroutingnetwork.shp
Within the layer dialog of QGIS your new layer should appear; you can now remove the NT_RoadLink shape file from the project.
Next go to your version of PostgreSQL and using a superuser account create a new database : eg edinburghrouting
I would suggest you use lower casing as well
As a superuser ensure you add the postgis and pgrouting extensions.
Next I set up the following connection between the QGIS project and PostgreSQL
Personal tastes may vary, but I like to select
Also list tables with no geometry
Allow saving/loading QGIS projects in the database
OK the selection and you should now have a connection to the new database you just created.
QGIS has an excellent DB Manager window, which we will use to load our new shape file (excluding the z coordinate) into the new database we created in PostgreSQL.
Ensuring that you have a connection to your localpostgis database hit the
Here I load the information into a new table t01roadnetwork
On pressing OK there will be a delay, after which, if things go well, you will receive the following message.
As ever it is good to check that things appear to be going well.
Add the layer to your project and determine approximately whether import was successful.
Next, back in the psql command line (or an editor), we are going to run four queries.
The first two add columns that are required by the shortest distance algorithm we shall use, the third will allow anyone to write an aggregation function to see the total cost of the route, and the last creates a topology for the road network.
alter table public.t01roadnetwork add column source integer;
alter table public.t01roadnetwork add column target integer;
alter table public.t01roadnetwork add column agg smallint default 1;
select pgr_createTopology('public.t01roadnetwork', 0.0001, 'geom', 'id');
If things go correctly you should see the database engine start to create the topology, gradually stepping through the creation process.
and on completion you should have something like the following:
A new table has been added to the edinburghrouting database, and the next step is to display the network and its vertices in QGIS.
In QGIS we should see something like
The next thing that I like to do is to label the nodes for quick identification.
Then look at the t01roadnetwork table and check that the columns are clear and present.
We are now ready to make a measurement. Here I choose the nodes 15883 and 10967
SELECT seq, id1 AS node, id2 AS edge, cost, geom, agg
FROM pgr_dijkstra(
    'SELECT id, source, target, st_length(geom) as cost FROM public.t01roadnetwork',
    15883, 10967, false, false
) as di
JOIN public.t01roadnetwork pt ON di.id2 = pt.id;
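Under the hood pgr_dijkstra runs Dijkstra's shortest path algorithm over the edge list supplied by the inner query. For intuition, here is a minimal sketch of that algorithm in plain Python over a toy edge list; the node ids and costs are invented for illustration and are not taken from the OS data.

```python
import heapq

def dijkstra(edges, start, goal):
    """Shortest path over an undirected weighted edge list.

    edges: iterable of (source, target, cost) tuples, mirroring the
    source/target/cost columns fed to pgr_dijkstra.
    Returns (total_cost, [node, ...]) or (None, []) if unreachable.
    """
    graph = {}
    for s, t, c in edges:
        graph.setdefault(s, []).append((t, c))
        graph.setdefault(t, []).append((s, c))  # undirected network

    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, c in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + c, nxt, path + [nxt]))
    return None, []

# Toy network: three possible routes between nodes 1 and 4
toy_edges = [(1, 2, 1.0), (2, 4, 5.0), (1, 3, 2.0), (3, 4, 1.0), (1, 4, 10.0)]
print(dijkstra(toy_edges, 1, 4))  # -> (3.0, [1, 3, 4])
```

pgr_dijkstra performs the same walk over the source and target columns we created earlier, at database scale, and returns one row per edge traversed.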
Now we can load this as a new layer and then improve the symbology
Doing this we get.
It should be noted that the line you see is a collection of lines. In my next post I will go through and indicate how we can amalgamate that into a single line for storage in a table.
Congratulations. If you have got this far you should be able to measure the shortest distance between any two points on a valid network by altering the numbers.
Given that on creation of a function it is a requirement to state the procedural language of calculation what happens if we try using Python?
It just so happens that the calculation part for Fahrenheit to Celsius is syntactically identical between SQL and Python, so let's try this.
CREATE FUNCTION calcfp(integer) RETURNS integer
AS 'SELECT ($1 - 32) * 5/9'
LANGUAGE plpythonu
RETURNS NULL ON NULL INPUT;
Yes, they couldn't just call Python python; it has to be plpythonu.
This is what I got. So assuming I referred to Python correctly, it would appear that I need to enable the use of Python in Postgres.
Given that I don’t know Python I am not going to install this as an extension but should you wish the following will be relevant;
It has the following highlighted instruction
Note: As of PostgreSQL 9.1, most procedural languages have been made into “extensions”, and should therefore be installed with CREATE EXTENSION not CREATE LANGUAGE. Direct use of CREATE LANGUAGE should now be confined to extension installation scripts. If you have a “bare” language in your database, perhaps as a result of an upgrade, you can convert it to an extension using CREATE EXTENSION langname FROM unpackaged
As per 007, let's try and create our own function to calculate centigrade from Fahrenheit and call it calcf.
CREATE FUNCTION calcf(integer) RETURNS integer
AS 'SELECT ($1 - 32) * 5/9'
LANGUAGE SQL
RETURNS NULL ON NULL INPUT;
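One detail worth knowing: because the function takes and returns integer, Postgres performs integer division, so results are truncated (212°F gives exactly 100, but 100°F gives 37 rather than 37.8). A rough sketch of the same calculation in Python (my own naming; note that Python's // floors where Postgres truncates, so the two match only for inputs of 32°F and above):

```python
def calcf(f: int) -> int:
    # Mirrors SELECT ($1 - 32) * 5/9 in the SQL function body.
    # Floor division matches Postgres's truncating integer division
    # for inputs of 32 and above; for lower inputs the two differ.
    return (f - 32) * 5 // 9

print(calcf(212))  # -> 100
print(calcf(100))  # -> 37 (37.8 truncated, just as the SQL version does)
```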
Next let's see if I can write the same function using Python.
As per the nature of recursion a function is a variable is a function.
Previously, in 005 and 006, we wrote functions that returned subsets of queries; they were effectively dynamic queries where I entered a parameter that was used in a select query. This meant that although the function was returning a variable, that variable was the result of a select statement.
What if we wish just to return a single value, as in, say, translating centigrade to fahrenheit or some other calculation?
In such a case you simply state that the function returns a variable, and you state the variable type.
CREATE FUNCTION add(integer, integer) RETURNS integer
AS 'SELECT $1 + $2;'
LANGUAGE SQL
IMMUTABLE
RETURNS NULL ON NULL INPUT;
There are a few interesting things here which should be borne in mind.
In 005, when last we left our intrepid explorers, we were wondering: having defined an inline table that contains the definition of the selection purely in the user defined function, how do we see what that selection is, as it might not be a presently defined object? SETOF references an object we can execute independently of the function; TABLE() does not.
Well apparently magically you can run the following.
And a notepad will open with the UDF code in it.
Bright people think of everything!
Like all platforms it is possible to create bespoke functions in Postgres
For the following I assume;
1. Postgres 9.5 is installed with the server running (syntax should be the same for other versions)
2. A database called exampledb has been created
3. In this database there exists a table called t001landparcels with some records and a field called PKID
4. You are logged into the exampledb with a username called general that has been granted CREATEDB role.
5. You are in psql
The following can be used to create a simple function
CREATE FUNCTION getrecords(int) RETURNS SETOF t001landparcels AS $$
SELECT * FROM t001landparcels WHERE pkid <= $1;
$$ LANGUAGE SQL;
If you have been careful and done this exactly in line with my initial assumptions, it should return
This tells us that the function has been created; it will now exist as a permanent addition in the same schema as the table.
We can identify the function by either listing all the functions and scrolling through
or listing the individual function
This should return your newly created function
Now to run the function. Unlike MS Access, you can't simply run the function; you need to use it in a select statement.
SELECT * FROM getrecords(2);
This should return everything you are looking for.
Now you should be able to drop the function using the following SQL
DROP FUNCTION getrecords(int);
Note how you have to define the function with its parameter. I have read (no idea whether it's true) that in version 10 of Postgres you can simply use
DROP FUNCTION getrecords;
Writing code in psql does require accuracy so getting things to work does usually involve some experimentation. I have removed much of this from my screenshots!
An alternative is as follows
CREATE FUNCTION getrecords(int) RETURNS TABLE (pkid integer, parcelname text) AS $$
SELECT pkid, parcelname FROM t001landparcels WHERE pkid <= $1;
$$ LANGUAGE SQL;
This appears to give the same answer; I am not yet clear what the difference is. Note the result would have been the same if I had defined the table with the addition of a geometry column.
Note I dropped the old getrecords function before I created this one. I am not sure what would have happened if I had tried to create one over the other.
I found this second method on Stack Overflow when investigating functions, with the following, to me slightly mysterious, quote:
This is effectively the same as using
SETOF tablename, but declares the table structure inline instead of referencing an existing object, so joins and such will still work.
Which sounds important, but I'm struggling at present to understand its meaning!
I will investigate later, but what I think he is talking about is that you don't need to have a defined object beforehand, e.g. with SETOF this is
I quite often like to have things broken down rather than all in functions, and I might prefer to have it as a SETOF of something that I can more clearly see.
Which leads me to my next question: how do you get the definition of a function? See 006.
For this you will need to have a version of the Postgres database engine installed and running, and you will need to have created a database with the PostGIS extension installed.
Login to the database you wish to create the table in
type the following
CREATE TABLE t001landparcels (PKID SERIAL PRIMARY KEY, PARCELNAME VARCHAR(50), GEOM GEOMETRY(POLYGON,27700));
Here I do this and then check on the tables with the \dt command, before inspecting the columns of the table using the \d command.
and here I open up QGIS and link to my local postgres instance and the exampledb database;
and here I connect to it and draw a polygon. If you are wondering where this is, it is Inchkeith in the Firth of Forth, an island very visible from George Street in Edinburgh. If you have flown into Edinburgh you will have flown almost over it.
and here after having digitised a single polygon I look at the contents of the table
SELECT count(*) FROM t001landparcels;
Produces the more helpful count of records in the table.
So you have an existing database in PostGres that you wish to add the PostGIS extension to.
You will need to be logged in with a username that has superuser privileges.
Here let me do this using the postgres default account that on my instance has superuser access.
Here rather than logging in at the start I am choosing to change the connection to a different database
This is done by using
Please note that the semicolon is vital in performing the instruction.
The way I have discovered to do this is by opening the database and then allocating the postgis extension to the database once it is open.
Let us first examine the tables in our blank database before the extension is enabled.
Now let us enable the extension.
CREATE EXTENSION postgis;
Let us now examine the database tables
Enabling the PostGIS extension adds this table, along with the ability to add geometry types of variables within tables.
I am just getting into PostGres and here are some rough notes for my reference.
Assuming you have a postgres admin account you want to sign in first of all and create a database
To find the command line go to search in Windows and type psql
First, ensure that your postgres engine is running
You should be presented with the following
There are default connections to the local host; keep hitting return to accept these until you reach the following:
You will now need to know your password; enter it here and press return.
I will deal with the warning message further in the post – but a lot of people experience this message so I wanted to keep it in at present.
From my initial investigations, as ever, it is a good idea to restrict users to particular privileges. At this level I am not being particularly refined; I would like to have a poweruser role that I can allocate to people, and give this a defined password.
Signing in, you can check out the roles and users as follows; on starting up a new instance you may well see something like this after typing \du
So here I try to set up a user called general and create a role called general, which I give CREATEDB rights.
I would recommend something stronger than the classic password password.
Issuing the \du command again we get to see the roles
Now we can close down and go back in but this time login as username general by altering the appropriate item when asked.
Note how the =# characters have been replaced by =>; this appears to denote the non-superuser sign in.
To identify what username you are logged in as, type \c at the prompt.
My investigations suggest that the # sign denotes that you are logged into the database as superuser.
So first of all let's get rid of that annoying warning message when you log in at psql.
I am running Postgres version 9.5; your version may vary, but you can remove the warning by editing the runpsql.bat file. Every version of postgres has this file, and it will be located in the directory equivalent to the 9.5 one for me.
Add the line
cmd.exe /c chcp 1252
as per underline and save the file
Now fire up psql as usual you should get the following
It should be noted that if you REM out the SET statements you can choose to log in with particular server / localhost / database / port and username presets, which may be useful if you find yourself constantly going into a particular database as a particular user.
Here you see that the warning note has stopped.
It should be noted that using the general username you will NOT be able to create Databases
You can see that on carrying out a successful command we usually see a repeat of the command.
To get a list of all databases in the instance type
It can be seen that despite typing the name in capitals the database has been created in lower case; this seems to be a feature of the command line. If you desperately want capitals you may need to go to the pgAdmin tool.
I think I’ll end it there for now.
Let's say we have two tables
A site parent table T001 consisting of two fields
And a planning application child table T002 consisting of three fields
We have the following records in the parent table T001
1 – Site01Green
2 – Site02Red
3 – Site03Blue
And the following related children in the planning application table T002;
1 – 1 – 2019/098
1 – 1 – 2018/032
1 – 2 – 2017/987
1 – 3 – 2015/100
1 – 3 – 2014/456
1 – 3 – 2014/657
And we would like to write a query that would create a list combining the two tables showing all the planning applications for each site listed in order.
This is the list we are aiming at
1 – 2018/032,2019/098
2 – 2017/987
3 – 2014/456,2014/657,2015/100
Using SQL Server 2017 or later (including SQL Azure) the following will work
CREATE VIEW [dbo].[View01Sitesandtheirplanapps] AS
SELECT PKIDT001,
       STRING_AGG(PlanRef, ',') WITHIN GROUP (ORDER BY PlanRef ASC) AS PlanRefs
FROM dbo.T002
GROUP BY PKIDT001;
This can then be combined in a separate view that includes the site name, should it be required. This is a tremendous example, the structure of which I will be using often.
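For comparison, the same group-and-join logic can be sketched outside the database in Python using itertools.groupby; the data literals below mirror the T002 example records listed above.

```python
from itertools import groupby
from operator import itemgetter

# (PKIDT001, PlanRef) pairs, as in the example T002 records above
t002 = [(1, "2019/098"), (1, "2018/032"), (2, "2017/987"),
        (3, "2015/100"), (3, "2014/456"), (3, "2014/657")]

# Equivalent of GROUP BY PKIDT001 with
# STRING_AGG(PlanRef, ',') WITHIN GROUP (ORDER BY PlanRef ASC):
# sorting by (key, PlanRef) gives the within-group ordering for free.
rows = sorted(t002)
result = {k: ",".join(ref for _, ref in grp)
          for k, grp in groupby(rows, key=itemgetter(0))}
print(result)
# -> {1: '2018/032,2019/098', 2: '2017/987', 3: '2014/456,2014/657,2015/100'}
```

The output matches the target list given earlier, one comma-separated string per site.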
As part of the general move towards the web I continue to investigate and learn about web development. An important aspect for any developer considering how to serve programs to clients and colleagues with as little resistance as possible is speed: users will be clicking these things potentially tens of times a minute, and waiting to go from one screen to another significantly impacts their productivity. No wonder, then, that we are hearing so many stories about dramatic improvements in site success from improving load speeds. But how to measure web site speed accurately? At work, and for desktop applications, I have resorted to downloading a stopwatch onto my Android phone, which can be quite useful if there are consistent and substantial differences in speed, but it is still a somewhat blunt and inaccurate tool.
So the other day I was again investigating how to better improve the delivery of web sites through the installation of web sites using the new Google Progressive Web Application paradigm.
I discovered that within Chrome there is an Audit feature beneath the inspect option.
To use this open the web page you are interested in measuring using Chrome ensuring that it is a standard Chrome window (and not the PWA window)
Right click and then go for inspect then select the Audits option as shown below.
We see the statistics in the top right. From my initial running of this on several of my sites, the metric on
Progressive Web App
seems to be fairly consistent in ranking sites.
Performance seems to vary every time you run it even if you are on the same page and url.
Here for example is me running the same audit literally five minutes after the last picture.
So all in all definitely an improvement in metrics, but with some of the metrics varying so much from run to run it may still be better for giving a general indication of performance over time than anything else. I have just upgraded this site to WordPress 5.0.1, although the theme is still from 2010. It should be noted my MS Access applications still transfer between forms within fractions of a second, so fast in fact that I am unable to measure them. Websites are getting better, and there are sites now that are very fast. Still some way to go, though, before they can beat the blistering speed of desktop.
I have started looking at new themes for the site but I find I like a lot about this theme and am having trouble finding anything I am quite as happy with.
A simple function that will loop through and create strings that add a number to a simple string. This string will then be used to create update queries. The same could be done in Excel or any other spreadsheet option but this stores the queries nicely and neatly into a table.
In this instance you want to have created a table called
Which has the following fields
Public Function CreateSimpleStrings()
    Dim i As Integer
    Dim rs As DAO.Recordset
    Dim db As DAO.Database
    Set db = CurrentDb
    Set rs = db.OpenRecordset("T002ResidentialSearch")
    For i = 2 To 100
        With rs
            rs.AddNew
            rs!ResidentialString = "Erection of " & i & " houses"
            rs!Houses = i
            rs.Update
        End With
    Next i
    rs.Close
    MsgBox "Complete"
End Function
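For readers outside of Access, the loop's logic can be sketched in Python; this just builds the (string, house count) pairs rather than writing DAO records, and the function name is my own invention.

```python
def create_simple_strings(start=2, stop=100):
    # Mirrors the VBA loop: one (ResidentialString, Houses) row
    # per house count from start to stop inclusive
    return [(f"Erection of {i} houses", i) for i in range(start, stop + 1)]

rows = create_simple_strings()
print(rows[0])    # -> ('Erection of 2 houses', 2)
print(len(rows))  # -> 99
```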
Here I am trying to automatically load log files into a MS Access file. Log files are actually txt files, which need their extension changed if, like me, you wish to automate their import into MS Access. The following Visual Basic Script takes a file called FS_LS.log and changes it to FS_LS.txt. (I have another function that then takes this file and imports it into an MS Access application.)
For safety this script creates a copy of the FS_LS log and renames it with a txt extension, combining hours and minutes into the title. Thus runs need to be at least one minute apart, otherwise it will throw an error.
For testing purposes I have the following line
Fso.Copyfile "C:\Users\mark\Documents\vbs\" & vardatefile, "C:\Users\mark\Documents\vbs\" & vardatefile3
If you don't have a feed to some kind of constant stream of log files, this will copy the txt back to a log file ready to be copied again (something I found useful for testing). Next you want to call this from within your MS Access application, prior to your data import function, using the previously posted function.
Dim Fso
Dim vardatefile
Dim vardatefile2
Dim vardatefile3
Dim varhoursleft
Dim varminsright
Dim varnow

varnow = FormatDateTime(now, 4)
varhoursleft = Left(varnow, 2)
'msgbox varhoursleft
varminsright = Right(varnow, 2)
'msgbox varminsright
Set Fso = WScript.CreateObject("Scripting.FileSystemObject")
vardatefile2 = "FS_LS.txt"
vardatefile3 = "FS_LS.log"
vardatefile = "FSLS-" & varhoursleft & varminsright & ".txt"
Fso.MoveFile "C:\Users\Mark\Downloads\FS_LS.log", "C:\Users\Mark\Downloads\" & vardatefile
Fso.CopyFile "C:\Users\Mark\Downloads\" & vardatefile, "C:\Users\Mark\Downloads\" & vardatefile2
'Fso.CopyFile "C:\Users\Mark\Downloads\" & vardatefile, "C:\Users\Mark\Downloads\" & vardatefile3
Set Fso = Nothing
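The part of the script most worth understanding is the filename construction from the current hour and minute. A small Python sketch of that logic (a hypothetical helper, with the time passed in so it can be tested) also shows why two runs in the same minute clash: they would produce an identical name.

```python
from datetime import datetime

def log_snapshot_name(now: datetime) -> str:
    # Mirrors the VBScript: "FSLS-" & HH & MM & ".txt"
    # Two runs inside the same minute yield the same name,
    # hence the one-minute gap noted above.
    return f"FSLS-{now:%H%M}.txt"

print(log_snapshot_name(datetime(2018, 11, 10, 9, 5)))  # -> FSLS-0905.txt
```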
A very simple function that will allow you to run any vbs script from within MS Access
Function RunVBS()
    Shell "wscript c:\users\mark\vbs\RenameFile.vbs", vbHide
End Function
PS – UPDATE AT 10 NOVEMBER 2018 – I MAY TAKE THIS POST DOWN AS MY CHROME VERSION ON WINDOWS 10 NO LONGER GIVES ME THE OPEN IN NEW WINDOW TICK AND JUST CREATES A SHORTCUT ON THE DESKTOP NOW.
THANK YOU STACK OVERFLOW UPDATE appears others have figured out how to get round this see
PPS UPDATE AT 19 APRIL 2019 – Google have re-instated open in new window Icon. Great that common sense has prevailed and an interesting example of how quickly software development changes… Below should be broadly valid again.
Much like Google Maps changing the projection to a proper globe on zoom out, I noticed Google made a very slight change to their browser application, which is a nice demonstration of how Google are constantly trying to improve their interfaces to make them more intuitive. At some point they altered the menu setup for creating web pages in application mode (I had written a post about the former method; see my post here). For those unfamiliar with this, it is a really nice way of displaying web pages in what appears to be an application-like window (ie no url bar or tabs). Comparing the two, you can see that they have simply changed the menu from Add to desktop to Create Shortcut. It's a nice demonstration of the small improvements that are continually being made to the browser.
I had noticed that all my icons that I had formerly created using the old application mode had stopped working.
Didn’t think much of it at the time but I think that must have been the date when Chrome changed over their code.
It should be noted that simply dragging the web page onto the desktop from the chrome browser window still gives the old windows shortcut what we want here is the Google shortcut. So here is how I do it.
Then select the following
Select create shortcut and a new dialog should appear
Name it and there you go
Now you get a nice icon on your desktop related to the site and what’s more when you open it up there is no address bar and within the taskbar you get the correct icon for the website see below example.
This is a really nice function that can be used to place all data from multiple CSVs (with the same structure) into a single table.
Here I use the Ordnance Survey's excellent Code-Point data set as an example; it lists each postcode in the UK along with eastings, northings and further administrative categories. Apologies to anyone outside of the UK who may not be able to access these files; I hope the demonstration is still useful. For those wishing to try, please follow the links.
After download you will see the problem: each postcode area is in a separate CSV.
After a short process to request the download, including filling out your name, you should be sent an email to download the data. This will consist of a zip file of two directories, one named DOC and one named DATA; the DATA directory contains a subdirectory called CSV, which as at May 2018 for my download consisted of 120 csv files.
Opening a single file ( in this case Edinburgh eh ) we see
I've already figured this out here, but there are 10 fields (some are blank in my example).
Here I create a table called T01CodePointCombined with 10 fields named
F1 through to F10
Note: if you don't create the table, this function is so powerful it will do it for you.
I then create a module and ensure that all the CSV files I wish to import are in a single directory here “C:\Users\Mark\Documents\CodePoint\Data\CSV\”
Public Function ImportAllFiles()
    Dim strPathFile As String, strFile As String, strPath As String
    Dim strTable As String
    Dim blnHasFieldNames As Boolean
    ' Change this next line to True if the first row in the csv file
    ' has field names
    blnHasFieldNames = False
    ' Place the real path to the folder holding the csvs in the next line
    strPath = "C:\Users\Mark\Documents\CodePoint\Data\CSV\"
    ' Replace tablename with the real name of the table into which
    ' the data are to be imported
    strTable = "T01CodePointCombined"
    strFile = Dir(strPath & "*.csv")
    Do While Len(strFile) > 0
        strPathFile = strPath & strFile
        DoCmd.TransferText _
            TransferType:=acImportDelim, _
            TableName:=strTable, _
            FileName:=strPathFile, _
            HasFieldNames:=blnHasFieldNames
        ' Uncomment the next code step if you want to delete the
        ' csv file after it has been imported
        ' Kill strPathFile
        strFile = Dir()
    Loop
    MsgBox "Finished"
End Function
Points to note: make sure all the csvs are closed when you run it. That's about it; it takes less than 5 minutes to move all the records from those 120 files into a single table within an MS Access database.
After import, if it has gone correctly, you should have in the region of 1.7 million records in T01CodePointCombined.
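For comparison, the same gather-every-CSV step can be sketched outside Access in Python with the csv module; combine_csvs is a hypothetical helper, with has_header playing the role of blnHasFieldNames.

```python
import csv
from pathlib import Path

def combine_csvs(folder, has_header=False):
    """Concatenate the rows of every .csv in folder into one list,
    mirroring the Dir()/TransferText loop above."""
    combined = []
    for path in sorted(Path(folder).glob("*.csv")):
        with open(path, newline="") as fh:
            rows = list(csv.reader(fh))
            # Skip the first row of each file when it holds field names
            combined.extend(rows[1:] if has_header else rows)
    return combined
```

From here the combined rows could be written to a single table in whatever store you prefer; the point is the one-pass loop over a directory of identically structured files.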
Pretty much as the title
Ensure you have a table called tblFiles with one field called Filename
Here I am finding all files in folderspec = “C:\Users\Mark\Documents\CodePoint\Data\CSV”
Alter as appropriate
Public Function ShowFolderList()
    Dim fs, f, f1, fc, s
    Dim rs As DAO.Recordset
    Dim folderspec
    Set rs = CurrentDb.OpenRecordset("tblFiles")
    folderspec = "C:\Users\Mark\Documents\CodePoint\Data\CSV"
    Set fs = CreateObject("Scripting.FileSystemObject")
    Set f = fs.GetFolder(folderspec)
    Set fc = f.Files
    For Each f1 In fc
        rs.AddNew
        rs.Fields("FileName") = f1.Name
        rs.Update
    Next
    Set rs = Nothing
    MsgBox "Finished"
End Function
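The equivalent of the FileSystemObject loop can be sketched in Python with pathlib; this version (my own naming) simply returns the filenames rather than writing them to tblFiles.

```python
from pathlib import Path

def show_folder_list(folderspec):
    # One name per file found in the folder, mirroring the
    # For Each f1 In fc loop over f.Files above
    return [p.name for p in Path(folderspec).iterdir() if p.is_file()]
```

Called with the CSV directory above, it would return the 120 csv filenames as a list.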