QGIS 2.8.1 now with ability to create SQL Spatial Indices

A work colleague noticed that it is now possible to create spatial indices on SQL Server tables through QGIS 2.8.1. This is useful if you need to speed up a layer but have no access to the SQL Server instance. From some rough testing it appears that if a spatial index already exists then the index creation will fail.

This can still be done through SQL Server Management Studio if need be; see the post here.
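
If you do have access to the instance, you can check whether a spatial index already exists before trying to create one through QGIS. A quick sketch against the system catalog; swap dbo.T001 for your own schema and table name.

SELECT i.name AS index_name, i.type_desc
FROM sys.indexes AS i
WHERE i.object_id = OBJECT_ID('dbo.T001')
AND i.type_desc = 'SPATIAL';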

CAGExample

CreateSpatialIndex

Posted in All, Digital Mapping, Geographical Information Systems, GIS, QGIS 2.8.1, QGIS General, SQL Server | Comments Off on QGIS 2.8.1 now with ability to create SQL Spatial Indices

Function to allow Parsing of Combined Fields

Here’s a nice function that will allow you to split up any combined field based on a given delimiter.
As an example, here is a string that we might wish to split.

The string
Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom

'Returns the item at position NumPos (zero based) after splitting StrField on the given delimiter
Public Function ParseDelimString(StrField As String, NumPos As Long, Optional StrDeliminator As String = ",") As String

Dim StrColumn() As String

StrColumn = Split(StrField, StrDeliminator)
If UBound(StrColumn) < NumPos Then
ParseDelimString = ""
Else
ParseDelimString = StrColumn(NumPos)
End If

End Function

Examples of the results from entering differing parameters:
StrField : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom
Function : ParseDelimString([Description],0,'/')
Result : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge

StrField : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom
Function : ParseDelimString([Description],1,'/')
Result : West Lothian , United Kingdom

If you do not specify the optional delimiter it will default to a comma.
StrField : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom
Function : ParseDelimString([Description],1)
Result : Longridge / West Lothian

Setting the number position to 0 gives you everything before the first instance of the delimiter, counting from the left.
StrField : Site (51 Houses) (Carvill) to the west of Curling Pond Lane, Longridge / West Lothian , United Kingdom
Function : ParseDelimString([Description],0)
Result : Site (51 Houses) (Carvill) to the west of Curling Pond Lane

Posted in All, Data Cleansing, Database Design, VBA Code MS Access | Comments Off on Function to allow Parsing of Combined Fields

Using SQL to parse, clean and format strings

Many datasets can be somewhat confused by the time you get them. Maybe you had no control of the export from the database, or maybe you asked for the right information and it came back somewhat warped.

SQL has powerful functions that can pretty much clean things up however you would like.

We can use multiple SQL commands within an MS Access module to clean up a source by placing them consecutively within the module. Here is the structure of some of the queries that I use.

UPDATE SELECTED FIELDS BASED ON A MATCHED STRING IN ANOTHER FIELD
SQL that updates the Town and PostalTown fields based on a string in an aggregated PostalAdd field.
Please note that the underscore (_) denotes a line continuation within the VBA module, required to make the SQL string run correctly. This may require alteration if you are cutting and pasting from this page.

Dim SQL As String
SQL = "UPDATE Table01 SET Table01.Town = 'Barassie', Table01.PostalTown = 'TROON' " & _
"WHERE (((Table01.PostalAdd) Like '*Barassie, TROON*'));"
DoCmd.RunSQL SQL

CONVERT A STRING FIELD TO ALL CAPITALS, ALL CAMEL CASE OR ALL LOWER CASE
The following SQL converts the Street field of Table01 to all capitals. This could be run, like the previous SQL, from within an MS Access module.

Dim SQL1 As String
SQL1 = "UPDATE Table01 SET Table01.Street = StrConv([Table01].[Street],1);"
DoCmd.RunSQL SQL1

In the above code change the trailing number parameter to select the type of alteration:
1 – ALL CAPITALS (vbUpperCase)
2 – all lower case (vbLowerCase)
3 – Proper Case, i.e. Initial Capitals On Each Word (vbProperCase)

PARSE OUT LEFT PART OF STRING BY LOCATING UNIQUE CHARACTER OR STRING
This looks at the PostalAddress field of Table01, searches from the left for a comma and returns everything to the left of it into a field called LeftParse.

Dim SQL2 As String
Dim rs2 As DAO.Recordset
SQL2 = "SELECT Left$([PostalAddress],InStr(1,[PostalAddress],',')-1) AS LeftParse FROM Table01;"
Set rs2 = CurrentDb.OpenRecordset(SQL2) 'a SELECT is opened as a recordset; DoCmd.RunSQL only runs action queries

PARSE OUT RIGHT PART OF STRING BY LOCATING UNIQUE CHARACTER OR STRING
If you have a string containing commas this function will count the length of the string, count the number of characters up to your unique string – in this case a comma – and then return all characters to the right of that comma.

The code below looks at the PostalAddress field of Table01, counts its length, finds the first comma (searching from the left) and returns everything after it as a select query result in a field named RightParse; the additional -1 also drops the space that typically follows the comma. If there are multiple commas it stops at the first one it finds. You can also substitute a literal number for the character-count expression.

Dim SQL3 As String
Dim rs3 As DAO.Recordset
SQL3 = "SELECT Right$([PostalAddress],Len([PostalAddress])-InStr(1,[PostalAddress],',')-1) AS RightParse FROM Table01;"
Set rs3 = CurrentDb.OpenRecordset(SQL3)

Posted in All, Data Cleansing, SQL MS Access, VBA Code MS Access | Comments Off on Using SQL to parse, clean and format strings

SQL Saturday #388 Edinburgh

SQLSAT388_web
Went along to SQL Saturday BI Edition on 13th of June.
Honestly I thought it was really great. Many thanks to Jenny Stirrup for being the event organiser. If you are thinking about going next year you really should.

This was the schedule for SQL Saturday in Edinburgh
http://www.sqlsaturday.com/388/sessions/schedule.aspx

Sessions I attended
Key Note with Jon Woodward – on long term future of computing in general
Web Scraping with Python – Iain Elder of Sky Scanner
Better Data Visualisation for CRM and ERP – Adam Vero
The European Economic Crisis and the Euro-A Data Tale – Carmel Gunn
Blue Sky Thinking : SQL Azure Geospatial Mashup – Thomas Sykes
Master Data Management – Dave Lawrence

All the talks were really thought-provoking and it was nice to hear from people who are at the top of their game. I have already started experimenting with web scraping and Python.

Posted in All, Continual Professional Development, Database Design, Dev Ops | Comments Off on SQL Saturday #388 Edinburgh

Updating Copyright Label in your QGIS Projects

Specific example in QGIS 2.8.1; probably the same in other versions.

Ensure you have the copyright notice correct on your QGIS projects:

View / Decorations / Copyright Label

Posted in All, Configuration, QGIS 2.8.1, QGIS General | Comments Off on Updating Copyright Label in your QGIS Projects

Connecting to PostgreSQL 9.3 from QGIS 2.8.1 – local host

First ensure that you have both Postgres and QGIS installed on your machine.

In order for you to be able to connect to Postgres from QGIS on localhost you must ensure two things: firstly that the PostGIS plugin has been installed on your machine, AND secondly that you have added the postgis extension to each database that you wish to connect to. Without enabling the extension in the database you won’t be able to connect OR import shape files. Installation of PostGIS is often a default during the install of Postgres, but you can check whether this was completed correctly by using the Application Stack Builder, a small program that is installed with later versions of Postgres.

I navigated to this on the Windows 8.1 machine I was using via search.

Opening Application Stack Builder you will be presented with the following:

ApplicationStackBuilder

Expand the spatial extensions tree to identify whether you already have the PostGIS plugin installed; if not, select the appropriate plugin and you will be prompted to install it. You will need an internet connection for this. Above you can see that my plugin was already installed.

Next you need to add the PostGIS extension to each Postgres database you wish to link to from QGIS. This is done through pgAdmin.

This is something that both I and a colleague got caught out by, and it took me an hour of searching to find out how to fix it.

Below I have a database called GISData which I have just created. You will note there is only one object within the expanded extensions tree. You will not be able to connect to a database that does not include postgis in its list of extensions.
CreateExtension

Highlight the database you want to spatially enable then go to Tools / Query Tool (Ctrl + E will do the same). In the above picture I’ve slightly jumped the gun. To add the extension to the database type:

CREATE EXTENSION postgis

Run the query by selecting the green right arrow.
There will be a short delay, and upon refreshing the connection postgis should appear in the list of extensions.
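
If you prefer to confirm from the query tool rather than the object tree, either of the following will do (the second errors if PostGIS is missing); just a quick sketch.

-- list the extensions installed in the current database
SELECT extname, extversion FROM pg_extension;

-- or ask PostGIS itself for its version
SELECT PostGIS_Full_Version();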

CreateExtensionCreated

You can now close the Postgres administrator and return to QGIS where you should be able to setup the connection to the database.

Parameters should be similar to below and it is useful to test the connection prior to saving.

SettingupthePostGISconnection

Posted in All, Geographical Information Systems, GIS, PostGIS, Postgres, QGIS General | Comments Off on Connecting to PostgreSQL 9.3 from QGIS 2.8.1 – local host

Optimising Performance between QGIS 2.8.1 and SQL Server Express 08R2

I have just been through trying to optimise the speed of the connection between SQL Server and QGIS and have found two ways of speeding it up:

1) Create a geometry_columns table for the database which speeds up reading in the tables to the selection screen
2) Create spatial indices on individual tables which vastly increases the speed of draw of layers within QGIS

1) Create geometry_columns table on the database
When connecting QGIS to SQL Server, if you don’t implement a geometry_columns table QGIS will slow down as it scans every table within the SQL Server connection to find the geometry type (point, polygon etc.) and the SRID of the projection.

TSQL to create the geometry_columns table

CREATE TABLE [dbo].[geometry_columns](
[f_table_catalog] [varchar](128) NOT NULL,
[f_table_schema] [varchar](128) NOT NULL,
[f_table_name] [varchar](256) NOT NULL,
[f_geometry_column] [varchar](256) NOT NULL,
[coord_dimension] [int] NOT NULL,
[srid] [int] NOT NULL,
[geometry_type] [varchar](30) NOT NULL,
CONSTRAINT [geometry_columns_pk] PRIMARY KEY CLUSTERED
(
[f_table_catalog] ASC,
[f_table_schema] ASC,
[f_table_name] ASC,
[f_geometry_column] ASC
))

Next, for each table you will need to fill out the geometry_columns table as follows.

Here’s an example where I have a database called Corporate Address Gazeteer within which I have a table of planning applications (T001). Within that table the geometry is held in the Coordinates field.

CreateGeometery_ColumnsTable
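
In T-SQL the equivalent insert looks something like the following; the catalog, table, geometry type and SRID shown are just the values from my example, so substitute your own (and add one row per spatial table).

INSERT INTO [dbo].[geometry_columns]
([f_table_catalog], [f_table_schema], [f_table_name], [f_geometry_column], [coord_dimension], [srid], [geometry_type])
VALUES
('Corporate Address Gazeteer', 'dbo', 'T001', 'Coordinates', 2, 27700, 'POINT');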

All tables within the connection (probably one database) will need to be recorded in this table. Provided the ‘Only look in the geometry_columns metadata table’ option is saved with the connection, QGIS will only show those layers listed in the geometry_columns table. Below is the check box where QGIS is told only to look in the geometry_columns table (note the server and database names are imaginary).

NewMSSQLConnection

2) Next speed up the draw on screen of SQL Server layers by creating Spatial Indices on each of the tables and ensuring that the database as a whole has a ‘spatial_ref_sys’ table.

TSQL to create ‘spatial_ref_sys’ table

CREATE TABLE [dbo].[spatial_ref_sys](
[srid] [int] NOT NULL,
[auth_name] [varchar](256) NULL,
[auth_srid] [int] NULL,
[srtext] [varchar](2048) NULL,
[proj4text] [varchar](2048) NULL
) ON [PRIMARY]

Next run the following TSQL on the ‘spatial_ref_sys’ table

INSERT into spatial_ref_sys (srid, auth_name, auth_srid, proj4text, srtext) values ( 27700, 'epsg', 27700, '+proj=tmerc +lat_0=49 +lon_0=-2 +k=0.9996012717 +x_0=400000 +y_0=-100000 +ellps=airy +datum=OSGB36 +units=m +no_defs ', 'PROJCS["OSGB 1936 / British National Grid",GEOGCS["OSGB 1936",DATUM["OSGB_1936",SPHEROID["Airy 1830",6377563.396,299.3249646,AUTHORITY["EPSG","7001"]],AUTHORITY["EPSG","6277"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],
UNIT["degree",0.01745329251994328,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4277"]],UNIT["metre",1,AUTHORITY["EPSG","9001"]],
PROJECTION["Transverse_Mercator"],PARAMETER["latitude_of_origin",49],PARAMETER["central_meridian",-2],
PARAMETER["scale_factor",0.9996012717],PARAMETER["false_easting",400000],PARAMETER["false_northing",-100000],
AUTHORITY["EPSG","27700"],AXIS["Easting",EAST],AXIS["Northing",NORTH]]');

This was obtained from the EPSG link below; swap out your SRID if you are not from the UK.
(Warning: just be careful with a pure cut and paste of the above. The code tag in WordPress struggles to cope with all the brackets and commas, and the formatting is a bit off here, so it is best to cut and paste from the link below.)

http://spatialreference.org/ref/epsg/27700/postgis/

The great thing about running this is that you can place the projection in the sys table and users will not then need to select a new projection every time they add a new layer, an otherwise repetitive task. For the UK we only require one record within this table (27700); if you have multiple projections you will need an entry for each individual projection.
I have slightly altered the reference obtained from the above link as it had placed a 9 in front of the first SRID reference, which stopped the projection appearing for each user; deleting it fixed that. I have changed it in the above TSQL so it is correct, but the graphic below still shows the 9 in front of the 27700.

CreateSpatialIndexPart6

For each table we need to create spatial indices.

Creating a spatial index on a table.

Go to the table in question and expand the table structure so that you can see the indexes on the table, right click on Indexes and select New Index.

You should be presented with the following screen.
It should be noted that you choose Spatial for the type of index, and in order for you to be allowed to do this the table must already have a clustered primary key (in the picture example this is already in existence). You will also need to know the geometry column of the table in question: you name the spatial index and attach it to the geometry field of the table. In the graphic this is Coordinates.

CreateSpatialIndexPart3

Next you need to give a bounding box to the spatial index. The figures below are those used for West Lothian in this case:
min x = 285380
min y = 651170

max x = 314890
max y = 680340

CreateSpatialIndexPart4

You can now press OK and the spatial index should be added to the table.
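
The same index can also be created in T-SQL rather than through the dialog. A sketch, assuming the table and geometry column from my earlier example (T001 with a Coordinates field) and the West Lothian bounding box above:

CREATE SPATIAL INDEX [SIdx_T001_Coordinates]
ON [dbo].[T001] ([Coordinates])
USING GEOMETRY_GRID
WITH (BOUNDING_BOX = (XMIN = 285380, YMIN = 651170, XMAX = 314890, YMAX = 680340));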

If you have done it right when you expand the index field it should look something like this

CreateSpatialIndexPart5

Posted in All, Configuration, QGIS General, SQL 08 R2, SQL Server, SSMS | Comments Off on Optimising Performance between QGIS 2.8.1 and SQL Server Express 08R2

Setting up a Blank SQL Server Spatially enabled Table using Microsoft SQL Server Management Studio 2008R2 Express and displaying it in QGIS 2.8.1

Programs used;

1- SQL Server 2008R2 Express
2- SQL Server Management Studio 2008R2 Express
3- QGIS

The example uses UK national grids coordinates to create a Triangle Polygon in a SQL Server Table

I’ve previously written that while we’ve had spatially enabled SQL Server for over 5 years I constantly come across line-of-business applications that, although using SQL Server, have not been and are not intended to be spatially enabled. This is undoubtedly because of the difficulty of re-designing legacy systems actively in use, and because the benefits, although significant, are not generally requested by all but the most knowledgeable of colleagues.

While I understand this legacy system reasoning, spatially enabled databases are the future, so it’s just a matter of when, not if, an application will require alteration. Understood in this context it really is a requirement to start seriously planning for their inclusion.

Developers creating new applications, however, should always consider spatially enabling relevant tables from the start even if it is not specified by the client or colleague. It is so much easier to spend a couple of minutes future-proofing the schema of a newborn database than hours trying to retrofit a live, in-production back end.

Firstly it’s important to understand what a geodatabase in SQL Server actually is.
Really it is a normal database which has at least one table with a field of geometry or geography value type. In this example I will use desktop QGIS 2.8.1 to display the resulting geometry, but any other digital mapping package that can link to SQL Server could be used. SQL Server also has a very rudimentary mapping display, but you will need something better if you want to manipulate boundaries visually.

Many digital mapping products have plugins that will create geodatabases and tables; however I haven’t seen one for QGIS. I really wanted to be able to create spatial SQL tables on my own, directly in SQL Server Management Studio, without recourse to paid tools. So here’s my method of creating a blank polygon table whose geometry is ready to be read and edited in QGIS, or any other digital mapping system, using just SQL Server Management Studio Express 08R2.

Steps
1. Create a new Table
2. Ensure the table has an identity Key that increments
3. Create a geometry column
4. Write a query that updates the geometry column

UPDATE T001Sites SET Coordinates=geometry::STGeomFromText('POLYGON((301804 675762,295789 663732,309581 664870,301804 675762))',27700)

You will note that there are four coordinates here (each coordinate being a pair of numbers).
The first coordinate and the last are the same; this is required to ensure that the polygon is closed.

The 27700 number is the Spatial Reference System Identifier (SRID); it is a unique value used to unambiguously identify the projection. 27700 is the identifier for the United Kingdom National Grid and is the system my example coordinates refer to. The spatial reference identification system is defined by the European Petroleum Survey Group (EPSG) standard, a set of standards developed for cartography, surveying etc. and now owned by the Oil and Gas Producers Group; list here: http://www.epsg-registry.org/

The above coordinates display a triangle in West Lothian near Edinburgh
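
Putting steps 1 to 3 together, a minimal sketch of the table itself might look like this. The table and geometry column names match the update statement above; the PKID and SiteName columns are just illustrative.

CREATE TABLE [dbo].[T001Sites](
[PKID] [int] IDENTITY(1,1) NOT NULL,
[SiteName] [varchar](150) NULL,
[Coordinates] [geometry] NULL,
CONSTRAINT [PK_T001Sites] PRIMARY KEY CLUSTERED ([PKID] ASC)
);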

5. Set up the connection to SQL Server Instance

Ensure the checkbox marked “Only look in the geometry_columns metadata table” is unchecked. By default this is checked, and if the geometry_columns table does not exist you will get an error message.

QGIS-SSMS-Connection

6. Display the table and edit as appropriate.

Select the table then hit the Add button

QGIS-SSMS-TableReadyforDisplay

And here is the SQL Server table in QGIS ready to be added to or edited.
QGISshowingSQLServerPolygon

Posted in All, Configuration, Database Design, Dev Ops, QGIS General, SQL 08 R2, SSMS | Comments Off on Setting up a Blank SQL Server Spatially enabled Table using Microsoft SQL Server Management Studio 2008R2 Express and displaying it in QGIS 2.8.1

Spatially Enabling an existing Table – which has Eastings and Northing fields UK National Grid

The following TSQL will spatially enable a SQL Server table called T1PlanApp, which has two fields called East and North holding UK National Grid references, by writing the result into a geometry column called Coordinates.

UPDATE T1PlanApp SET Coordinates=geometry::Point([East],[North],27700)
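
If the table does not yet have a geometry column to write into, you would first need to add one; a one-line sketch, assuming the column name used above:

ALTER TABLE T1PlanApp ADD Coordinates geometry NULL;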

Posted in All, Database Design, Dev Ops, QGIS General, SQL 08 R2, SQL Server, SSMS | Comments Off on Spatially Enabling an existing Table – which has Eastings and Northing fields UK National Grid

Administering a Web Application some thoughts

In the last couple of years I have started administering a bought-in web application from an outside vendor. I have found this really quite educational. I will not reveal the name of the product; the vendor seems neither better nor worse than other vendors, and many of the issues I am experiencing I see repeated by other admins of other web applications.

The software consists of various pieces of middleware linking SQL Server and the .Net and Javascript files. It uses a fairly standard Microsoft stack to deliver a single page application. The big selling point was that it was widely available through browsers pretty much anywhere on the web.

We have had it up and running in anger for a couple of years now and I have about 75 users, all internal.

The vendor has created a new version and we now need to update. In discussion with the company they tell me that I should not be doing this myself and that it is standard practice to get them in to upgrade the version. We already use it more as a stack technology to build our own applications, and I would want to continue this in the future. They have indicated that it will cost money to get them in to upgrade the software. This is triggering alarm bells because the existing configuration is very complicated, errors are quite regular and error messages tend to be pretty cryptic or non-existent. I also like doing setup myself as it teaches me the ins and outs and speeds up my learning for systems which have non-trivial configuration.
A project board has been set up, senior management are on board and, well, it’s grown legs, arms and tails.

We have not widely publicised the application to the general public and they do not access the application at present.

The application itself requires significant amounts of configuration to set up correctly, and parameters tend to be static and buried deep in very large HTML files. All this leads me to conclude that configuration in the new version is likely to be just as awkward. The company’s help desk is very good and there is extensive documentation, but the sheer obscurity and quantity of configuration parameters makes the documentation the size of a phone directory, difficult and opaque.

Here are some questions I should remind myself of when looking at new web applications in the future.
Are staff really so distributed that you require wide distribution?
Is it an internal or external distribution? If not external, do you really need a web-enabled application?
Do you have internal expertise that is interested in developing applications?
With web applications or local applications the backend database technology could well be the same. Internal local area development is likely to be far more flexible, cheap and quick, could motivate staff, and can be far richer in its scope. It would be possible, for instance, to develop a rich interface internally and a minimal interface externally using something like Lightswitch. This would take advantage of the strengths of both development environments.

There is a big push towards web applications. I just think IT managers need to carefully consider their positions: these applications tend not to be cheap and have a number of disadvantages that seem to have been pushed out of sight.

Posted in All, Configuration, Continual Professional Development, Dev Ops, General Opinion | Comments Off on Administering a Web Application some thoughts

Database Normalisation – a review of the rules to refresh my knowledge

Here is a review of the 6 forms of normalisation (yes, the 4th is called 3.5 for some reason). There are lots of summaries of these on the web, but I wanted to re-write them in my own words to ensure I understand them fully. Revisiting them after real-world experience is particularly enlightening.

Normalisation rules are cumulative, i.e. a database or set of tables can be said to be 1 through to 5 compliant. To rank at the highest level it must comply with all previous forms.

That is to say that a database (or table) to be in the 2nd Normal Form must first fulfill all the criteria of the 1st Normal Form.

A database is said to be properly normalised if the first three normalised forms are inviolate.

Violations of normalisation forms subsequent to the third normal form may be legitimately required and as such are NOT absolutely required for a database to be classified as properly normalised.

Some web explanations do not list the forms after the third. I would argue there is still a lot of value in trying to adhere to these less well known forms, but I must admit to not always including them in a design, usually due to poor knowledge of the application domain.

The differing forms of normalisation start simple and become increasingly complicated.
If you have a lot of experience in creating relational backends you may have deduced these issues for yourself.

(1NF) First Normal Form – Edgar F Codd – first stated 1970
Eliminate duplicate columns from the same table.
Create separate tables for each group of related data and identify each row with a unique column or set of columns (the primary key).

(2NF) Second Normal Form – Edgar F Codd – first stated 1971
Remove subsets of data that apply to multiple rows of a table and place them in separate tables. Create relationships between these new tables and their predecessors through the use of foreign keys.

(3NF) Third Normal Form – Edgar F Codd – first stated 1971
Remove columns that are not dependent on the primary key.

My example take on this is that calculated columns should NOT be included in tables

So here imagine you have an invoices table with the following
InvoiceNo (PKID)
CustomerNo (FKID)
DateofPurchase
UnitPrice
Quantity
Total

So according to the first and second normalisation forms there are no duplicated columns and the CustomerNo foreign key shows that there is a relationship. However, the Total figure can be calculated by multiplying unit price by quantity, and as such it violates the 3rd Normal Form.

The new table would resemble the following
InvoiceNo (PKID)
CustomerNo (FKID)
DateofPurchase
UnitPrice
Quantity

Total is not stored and would be calculated on the form. It is noted that sometimes you do wish to store things like totals because, for instance, unit prices may vary over time and you don’t want to lose the overall total by updating the unit price, especially if the unit price is referenced from another table. A similar argument relates to things like VAT, which can be changed on a yearly basis. In such a case I would argue that you still don’t break 3NF, as the value of the column is not dependent on the primary key; it is dependent on the value of the VAT rate at the time.
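
In SQL the derived value is simply computed when needed rather than stored; a minimal sketch, assuming the invoice table above is called Invoices:

SELECT InvoiceNo, CustomerNo, DateofPurchase, UnitPrice, Quantity,
UnitPrice * Quantity AS Total
FROM Invoices;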

The 3.5 Normal Form or the Boyce Codd Normal Form
This is a slightly stronger version of the third normal form developed in 1974 by Raymond F. Boyce and Edgar F. Codd to address certain types of anomaly not dealt with by the 3rd Normal form as originally defined.

Only in rare cases does a 3rd Normal Form table not meet the requirements of Boyce Codd Normal Form. A 3rd Normal Form table which does not have multiple overlapping candidate keys is guaranteed to be in Boyce Codd Normal Form.

Web investigation indicates that the most concise description is:
Every determinant must be a candidate key

and that Boyce Codd Normal Form is not always achievable.

Nearest Shops
Person | Shop Type | Nearest Shop
Mark | Jewellers | H Samuel
Mark | Barbers | Mr Man
Natalia | Coffee Shop | Starbucks
John | Coffee Shop | Costa
John | Barbers | Mr Man
John | Jewellers | H Samuel

The issue above is that the shop type on record 2 (Mr Man) could be changed to Hairdressers, but the second last record for John would not change, and the table would become inconsistent. In this case I would split this table into two, with Nearest Shop and Shop Type being linked, and some kind of table linking people to nearest shops.

The 4th Normal Form
A table is in the 4th Normal Form if it has no multi-valued dependencies.
Definition: Multivalued dependencies occur when the presence of one or more rows in a table implies the presence of one or more other rows in that same table.

For example, let’s say you have a shop selling coffee – let’s call it Starbucks – and you have a table that lists the permutations of coffee sold in its stores.
CoffeeShop | Coffee Product | Location
Starbucks | Café Latte | Livingston
Starbucks | Americano | Livingston
Starbucks | Café Latte | Edinburgh
Starbucks | Americano | Edinburgh

If we know that all Starbucks shops sell the same products, then a new product in one shop implies that product should also be in the other shops; so adding a Skinny Latte to the Livingston shop implies that there should be another record for Skinny Lattes in the Edinburgh shop.

To eliminate the possibility of anomalies we must place the facts about the products and the location of the shops into two tables

Products
Starbucks Café Latte
Starbucks Americano
Starbucks Skinny Latte

Location of Coffee shops
Starbucks Livingston
Starbucks Edinburgh

Alternatively, if the coffee types did vary from location to location, the original table would satisfy 4NF.

The 5th Normal Form
A table is in Fifth Normal Form, or Project-Join Normal Form, if it is in the Fourth Normal Form and it cannot be non-trivially decomposed into any number of smaller tables without loss of information.

Or, as I like to call it, the “I know about it and try to adhere to it but sometimes I’m a bit lazy” form.

I certainly try to keep my designs to the full six normalisation principles but there are instances where certain tables do not comply.

Posted in All, Continual Professional Development, Database Design, My UI Design Patterns, Personal Development | Comments Off on Database Normalisation – a review of the rules to refresh my knowledge

AutoHotKey : Navigation between Satellite Applications to improve Work Flow

A while back I wrote a post about how allowing parameters to be passed to URLs is a big benefit in increasing the speed with which you can navigate to individual records in apparently non-connected web applications.

But what do you do if you are faced with a satellite application whose vendor has not implemented this URL-friendly facility? Users are left with a very jarring break in the flow of their work when they have to leave the application they are in and navigate to another application, sometimes having to manually reach the other application’s record form via a search field. This searching task, when multiplied many times, can be really tedious, repetitive, demotivating and time consuming, not to mention pointless.

How can we better serve our users?

The other day I came across an open source program called AutoHotKey that allows me to improve this task.

AutoHotKey

AutoHotkey is an open source project that allows the creation and compilation of simple or complicated scripts that can be used to automate anything on a computer – that means desktop OR web applications. The following is something that I worked out last week to be able to navigate a web application by triggering a script from MS Access VBA. The great thing is that you can pass parameters from a database application to a middle layer and trigger a set of commands to be run.

Let us take the example of a recent problem I faced. Many councils throughout the United Kingdom have bought an application from a company that manages the information associated with making planning applications; it consists of both desktop and web applications that help manage the submission and decision making associated with development. The vendor recently “upgraded” the application, with the result that it no longer accepts planning application numbers in its URL as a method of going straight to a record. This meant that users of one of my satellite applications were dropped into a search screen and then needed to manually type a field from one application into the search field of another. QED: a dull and repetitive task.

There follows an overview of my solution. Firstly download the following programs:
1)AutoHotKey

AutoHotKey

2) iWB2 Learner – a small program for identifying element names and IDs in Internet Explorer.
iWB2 Learner
iwebbrowser2 Download

My script for Autohotkey was as follows.

FindPlanningApplication.ahk (written in plain old notepad and saved to a known location with the suffix changed to ahk)
=====================
; %1% is the first command line parameter, here the planning application number passed in from Access
APPLICATION = %1%

URL := "https://planning.westlothian.gov.uk/online-applications/"

; open Internet Explorer via COM and browse to the public search page
WB := ComObjCreate("InternetExplorer.Application")
WB.Visible := True
WB.Navigate(URL)
While wb.readyState != 4 || wb.document.readyState != "complete" || wb.busy ; wait for page to open
Sleep, 10
; fill in the search box and click the search button (on this page it is the INPUT element at index 4)
wb.document.getElementById("simpleSearchString").value := Application
wb.document.getElementsByTagName("INPUT")[4].Click()
While wb.readyState != 4 || wb.document.readyState != "complete" || wb.busy ; wait for the results page
Sleep, 10

return
===================

Using iWB2 Learner to identify the element names on the web page
This video shows iWB2 Learner being used; it unfortunately does not have any sound.

VIDEO Using iWB Learner with AutoHotKey

—-
Next you will need to trigger the AHK script. You will need design access to the program that is sending the instruction to do this. In my MS Access application I have the following code that triggers the script above.

Private Sub Command43GoToApplicationOnlineRecord_Click()

Dim strPlanApp As String
Dim strAHKname As String

strPlanApp = "LIVE/" & Me.ApplicationNo

strAHKname = "\\server1-cluster\ahk\FindPlanningApplication.exe"
Call Shell(strAHKname & " " & strPlanApp, vbMaximizedFocus)

End Sub

Notes:
The computer that holds the AHK script need not have AutoHotKey installed; if it doesn’t, you can compile your script into an executable that will not require installation. Here I created the executable on another computer and transferred it to the \\server1-cluster\ahk location ready to be called by the VBA.

Consecutive parameters passed to AutoHotkey are named %1%, %2% etc. In my script I pass the planning application number as %1% and rename it APPLICATION immediately.

Compiling the AHK is done by moving to a computer with AHK installed, navigating in Explorer to the file, then right clicking it; Compile will be an option. Note the processor architecture is important when compiling: if your target machine is 32-bit then you need to compile on a 32-bit machine, and the same goes for 64-bit.

Posted in All, AutoHotKey, Productivity, Programming, VBA Code MS Access | Comments Off on AutoHotKey : Navigation between Satellite Applications to improve Work Flow

VBA function to Pivot and Concatenate Child records

There are situations where a parent record has a limited number of children (0 to 10 works well) and you would like to list those children next to the parent, somewhat as you would with a pivoted table. Pivoting the table would, however, result in a massive table which is extremely wide. Pivoting and concatenating the fields keeps the result within a manageable width. Take for example the following table.

ParentTable

So we have a table with four parent records, and the children all relate to those parents. For each parent we want to obtain a list of its children.

Firstly place the following function in a module

'Requires a reference to the Microsoft ActiveX Data Objects (ADO) library
Public Function Conc(Fieldx, Identity, Value, Source) As Variant
Dim cnn As ADODB.Connection
Dim rs As ADODB.Recordset
Dim SQL As String
Dim vFld As Variant

Set cnn = CurrentProject.Connection
Set rs = New ADODB.Recordset
vFld = Null

SQL = "SELECT [" & Fieldx & "] as Fld" & _
" FROM [" & Source & "]" & _
" WHERE [" & Identity & "]=" & Value

' open recordset.
rs.Open SQL, cnn, adOpenForwardOnly, adLockReadOnly

' concatenate the field.
Do While Not rs.EOF
If Not IsNull(rs!Fld) Then
vFld = vFld & ", " & rs!Fld
End If
rs.MoveNext
Loop
' remove leading comma and space.
vFld = Mid(vFld, 3)

Set cnn = Nothing
Set rs = Nothing

' return concatenated string.
Conc = vFld
End Function

Now set up a Query and call the function in an expression

QueryChildrenDesign
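
In SQL view the query boils down to something like the following; the table and field names here (TblParents, TblChildren, PKID, ParentID, ChildName) are purely illustrative and will differ in your own database.

SELECT TblParents.PKID, TblParents.ParentName,
Conc("ChildName","ParentID",[TblParents].[PKID],"TblChildren") AS ListOfChildren
FROM TblParents;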

This results in the appropriate list

GroupChildrenbyParents

Posted in All, VBA Code MS Access | Comments Off on VBA function to Pivot and Concatenate Child records

Save Record before Event

If users are editing or creating a record and there is an option on the form to print it out, when they press it they will expect the information that they have just entered to appear in the print preview. Normally MS Access does not save the information to the database until the record is exited or it is specifically instructed to. This can lead to user confusion when they enter a record, hit print preview expecting to see the record, and see a blank preview.

Placing the following code before the call to the print preview will ensure that the record is saved prior to the print preview being triggered, resulting in the correct information being displayed in the preview. An easy fix – you should always take the easy wins!

Note that order is important: put this before the event you wish to trigger (it is quite ubiquitous in my code).

DoCmd.DoMenuItem acFormBar, acRecordsMenu, acSaveRecord, , acMenuVer70

It’s the constant small touches that make great applications.

Posted in All, Database Design, MS Access, VBA Code MS Access | Comments Off on Save Record before Event

Appropriate User Interfaces

I have three guitars, 10 windsurfers and 4 bikes (5 if you count a static exercise bike). I have three regularly used computers (tablet, desktop and laptop) – four if you count my phone. My bikes are not duplicates: one is for racing, one is for training, one is for commuting and the final one is for mountain biking. I often use my commuting bike instead of my car. I try my hand at DIY and have got to the point of building a cupboard containing drawers only for tools; I have so many tools I cannot list them all from memory.

I get the impression that companies are pushing for all applications to be moved to the web. Fine, but if you have invested in an amazing building with an amazing local area network, why push absolutely everything to the web? Milk your assets; often the longer you leave it the more stable the web environment will be, as it still continues to experience massive change. The web has its advantages, but flexible, cheap, very rich user interface design doesn’t seem to be one of them. The benefit of database driven applications today is that the backend is totally independent of the user interface. Developing an internal local area based UI can be totally independent of building a web based user interface.

Carefully consider, and don’t be afraid to use, the full range of tools available to you: get the backend right and then work on a very, very usable front end. Users want quick, usable designs that fit their needs; they really don’t care about the technology.

Posted in All, General Opinion, Productivity, Project Management | Comments Off on Appropriate User Interfaces

Forcing or Limiting Columns Using Cross Tab Queries

MS Access has a version of SQL that can be edited to enforce specific columns being shown in queries.

Within the query editor navigate to the SQL View

Normally when a cross-tab query is run MS Access will generate columns for every value that exists. This is often very useful, but if there are no values there may be gaps in the scale. Additionally you may wish to limit the number of columns that are displayed in the query, or force a fixed set to tally with a report. Some reports will crash if certain columns are not found.

Generally the last line of SQL code within a query will follow a similar syntax to the following;

PIVOT [QueryName].YEAR;

where YEAR is an actual field name that will differ in your circumstances.

Altering this to the following will for instance generate years 2000 to 2010 with no gaps even if there are no values in certain columns.

PIVOT [QueryName].YEAR in ("2000","2001","2002","2003","2004","2005","2006","2007","2008","2009","2010");
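For context, a full crosstab query using this technique might look something like the following; the Amount and CostCode names are just illustrative, and only the PIVOT ... In (...) clause is the part that forces the columns.

TRANSFORM Sum([QueryName].Amount) AS SumOfAmount
SELECT [QueryName].CostCode
FROM [QueryName]
GROUP BY [QueryName].CostCode
PIVOT [QueryName].YEAR In ("2000","2001","2002","2003","2004","2005","2006","2007","2008","2009","2010");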

Posted in All, Database Design, MS Access, SQL MS Access | Comments Off on Forcing or Limiting Columns Using Cross Tab Queries

Using a Crosstab Query to produce Yearly Monthly or Daily breakdowns on a child table

Let’s take an example where you have two tables with a one-to-many relationship. In my example I have a table of cost codes and a table of costs, with the table of costs being a child of the table of cost codes. How do you create a query that will show you the costs by year by cost code?

Cross-tab query.

CrosstabYearlyReceiptsQueryDesign

This will give

CrosstabYearlyReceiptsResult

To alter the column headings choose the differing alternatives as follows:

For years = Format([DateActualSet],"yyyy")

For months = Format([DateActualSet],"yyyy-mm")
*Please note the need to use a dash as a separator between yyyy and mm, otherwise sorting may go off.

For days = Format([DateActualSet],"dd")

And for financial periods you can shift the year boundary to a specific month by using the following:

For financial periods = Year(DateAdd("m",-3,[DateActualSet])) – note in this case the year that appears in the column heading is the start of the period; -3 denotes a year running April 2015 to March 2016, which appears in the column heading simply as 2015.

This format would work well, for example, for stores against receipts.
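
As a sketch, the monthly version of this crosstab in SQL view might look like the following; the table and field names (T001CostCodes, T002Costs, PKID, CostCodeID, CostCode, Amount) are purely illustrative, and only DateActualSet comes from the expressions above.

TRANSFORM Sum(T002Costs.Amount) AS SumOfAmount
SELECT T001CostCodes.CostCode
FROM T001CostCodes INNER JOIN T002Costs ON T001CostCodes.PKID = T002Costs.CostCodeID
GROUP BY T001CostCodes.CostCode
PIVOT Format([DateActualSet],"yyyy-mm");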

Posted in All, Database Design, MS Access, VBA Code MS Access | Comments Off on Using a Crosstab Query to produce Yearly Monthly or Daily breakdowns on a child table

The Handover – where the dragons lie

woman-see-dragons-everywhere-2

The handover – often someone figures out that something needs to be done – maybe they have realised an important omission or maybe there is a problem that only comes to light in a process when you start designing.

For whatever reason you can’t do the work yourself and you need to hand the work to someone else. This I consider to be a dangerous task; the threat of information loss is immense. Here are some points I try to follow to minimise this ever present risk.

Give it to an identifiable person – I always find it curious how top management consider it important that they know each other and feel comfortable in each other’s presence; they carefully pick their colleagues, treat them with respect by ensuring accurate names are used, and generally talk to each other constantly, often informally and flexibly. However I have seen systems implemented by management which actively discourage these same forms of informal communication and actually discourage personal responsibility.

Give that person responsibility for the whole task and allow them the flexibility to tackle it how they see fit. People adjust processes so that they are more productive and will use their personal skills in the workplace, often when they were not hired on that basis. A regular example of this is individuals who can speak multiple languages. I have seen several situations where a customer from another nation arrives and someone in the team happens to speak their language; in a flexible environment that person often gets involved in the task simply because good communication is the first step in tackling a problem. I was in the Premier Inn in London and was pleased to note that the staff all had badges showing not just their names but flags for the languages they could speak. It was nice to see waiters with multiple flags on them. It gave you a small bit of background on the person and encouraged customers of particular nationalities to seek out the right staff.

Give it to someone who is familiar with the domain (the scope of work); complicated tasks need not just technical skill but domain-specific skill. Most people know how a booking system works, as nearly everyone is familiar with booking things online. But would you expect a general programmer to be familiar with the process of registering births, marriages and deaths, or submitting tax returns, for instance? If someone doesn’t have domain-specific skills, give them extra time.

Close physical proximity is a bonus – so called water cooler conversations are often about work, even if they may stray into what might be considered gossip. Even then, gossip may inform you of particular tender subjects that allow you to better respect your colleagues and better conduct yourself in conversation to get the optimum work between individuals. It also allows for constant review of process and the answering of questions when they arise. In e-mails I prefer to ask single questions at a time; I find that if you store up questions in a massive e-mail the first two questions are answered and the others are just omitted. As far as I can tell this is a human trait. It is my experience that when I sit close to individuals I ask questions as they arise, which when answered usually completely negate the need for further questions. This uses comparative advantage, which is always a bonus.

Concentrate on the outcomes, not the technology that they are using; especially if there are no costs involved, let them use what they want. Databases are databases, spreadsheets are spreadsheets; if they prefer the Dvorak keyboard to the QWERTY keyboard, good for them. For some, spreadsheets suit better than databases; others will always choose databases. Does it really matter? I have seen some really excellent spreadsheets for tasks where I would have preferred to use databases. You could argue one over the other for particular tasks and maybe even prove that one was better than the other on some kind of mathematical basis. But would SQL Server really be better for your granny than a spreadsheet? No.

Ensure that the individual will have a continual link to the system post development; they should recognise that the stability and continuation of the system is as important as simply pushing something out the door.

At least initially continually check up on the individual and ensure that things are progressing as intended. This requires time at the beginning but you should be able to wind it down after that.

Handovers are fraught with danger; don’t let them be fraught with danger for you.

Posted in All, Configuration, Dev Ops, General Opinion | Comments Off on The Handover – where the dragons lie

Business Continuity

I frequently see business continuity cited by management as a reason to disallow staff responsibility for their business systems, the belief being that individuals will become too central to the operation of the organisation and somehow irreplaceable.

For the record I would like to state that I consider this to be failed logic.

Parents with children will encourage them to take responsibility by trusting them with tasks. Why is this important? Because one day the parents won’t be there to support their children, who will have to stand on their own two feet, and it is better for the offspring to make mistakes with the parents there to catch them than to wait until the parents are gone and then make the same mistakes. Many parents consider that their children do things in inefficient ways; nonetheless they continually trust them. The world continues to progress and the human race continues to be the most successful species on the planet (hands down). This is the mechanism of passing knowledge down from one generation to another. So giving responsibility to and trusting children, far from decreasing continuity, has a long history of ensuring it.

If anyone uses business continuity as a reason to prevent individuals from being given responsibility you might want to highlight the above point to them.

Posted in All, Continual Professional Development, Dev Ops, General Opinion | Comments Off on Business Continuity

International Domain Names

I was helping to set up a domain for a friend the other day and I wondered whether they would like a domain in Cyrillic, which led to the question: was that even possible?

I try to be aware and present but things pass me by sometimes, particularly because I don’t really watch television. I have already posted about Cyrillic not being supported in some Visual Basic editors; well, it turns out Cyrillic was not supported in URLs until the 6th of May 2010.

It would be interesting to see the take-up of Cyrillic URLs. I note that Yandex’s Cyrillic domain яндекс.рф redirects to a Latin version.

One interesting URL is that of the Russian President:
президент.рф

Some quick research has revealed that the DNS convention is old (no real surprise there) and only supports the 26 Latin characters A through Z, the numbers and the dash. When a non-Latin URL is placed in the address line it is re-encoded into a system called Punycode, which is a way of representing a domain name written in a non-DNS character set within the DNS character set. These domains are called Internationalised Domain Names (IDNs). If you wish to display IDNs properly within the address bar you need to go to Chrome settings and select the appropriate language relating to the character set of the IDN; the URL should then appear correctly, otherwise you get a strange Punycode translation that might be mistaken for some weird non-base-ten numbering system.

Doubtless Yandex did not consider this attractive, as many people’s first reaction to that kind of URL would be to think that they had been redirected to a dodgy website.

Posted in All, Configuration, Русский | Comments Off on International Domain Names