If you use SQL Server 2016 or later (even the free Express Edition!) as the backend for your GDPR-relevant data, the automatic creation and management of the processing records required by the GDPR (DSGVO) is not child's play, but it is still comparatively easy to handle.

For those customers whose DSGVO-related data I am currently processing, I use SQL Server 2017 Express everywhere.

For each DSGVO-relevant table, I additionally use a so-called Temporal Table. Although this increases the required storage space, that should hardly matter these days. It is very easy to handle overall. If you set the two mandatory additional date columns to "Hidden", you do not even have to change the existing program.
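A minimal sketch of such a system-versioned temporal table with hidden period columns (table and column names are made up for illustration):

```sql
-- Sketch: system-versioned temporal table; the two period columns are HIDDEN,
-- so an existing "SELECT *" from the old program does not see them.
CREATE TABLE dbo.Customer
(
    CustomerID INT IDENTITY(1,1) PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL,
    ValidFrom  DATETIME2 GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
    ValidTo    DATETIME2 GENERATED ALWAYS AS ROW END   HIDDEN NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomerHistory));
```

Because the period columns are hidden, they only appear when you name them explicitly in the select list.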

Cumulative Update 3 (CU3) for Microsoft SQL Server 2017 was also released as part of the January 3, 2018, SQL Server Security Bulletin ADV180002 (see KB 4058562 for more information). Because of this, you may already have CU3 installed as part of that security release. If you try to install CU3 after ADV180002, you may receive the following message:

There are no SQL Server instances or shared features that can be updated on this computer

This indicates that CU3 is already installed and that no further action is required.

With my current main customer, I am currently working on converting an MS Access backend (bound forms) to SQL Server 2017 Express (unbound forms) and will probably do some things directly with ADO ... I tested the following (unsorted); I thought it might be interesting ...

My procedure was: with the SQL Server Migration Assistant 7.6 for Access (see the separate blog entry here), copy the customer's original data into its own (temporary) xx_IMPORT database. Create an empty copy of the _IMPORT DB. Edit this empty database with VS until it suits your needs (including a Rowversion <aka Timestamp> column for all tables). I declared this file the baseline version and made a backup. Then the data is copied over via the change script of the SSDT. This filling process can be repeated as often as desired with newer data ...
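Adding the Rowversion column mentioned above is a one-liner per table; a sketch (the table name is an example):

```sql
-- Sketch: add a rowversion column (the successor of the old "timestamp" type)
-- so SSDT change scripts and optimistic concurrency can detect row changes.
ALTER TABLE dbo.Article ADD RowVer ROWVERSION;
```

The column is maintained automatically by the server on every insert and update.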

-----------------------------------

To test the performance I use the new SQL Server 2017 Query Store. The Query Store is very simple to set up and operate. I find it interesting that you can also call the free SentryOne Plan Explorer directly from SSMS (17.4) and link it with the Query Store plans. But I also like the possibility of visually comparing two execution plans directly.

As far as I understand it, the Extended Events and the whole debug rubbish (which was always very suspicious to me and which I never really used) are now probably not so urgently <if ever> needed anymore. What I miss is a tip on how and when to best adjust the default settings of the Query Store, as I have no up-to-date production data (over time) ... I think that for smaller databases a shorter interval makes sense, but also "only" one week of historical data. A very good English introduction to the Query Store can be found here (https://www.youtube.com/watch?v=HZIkOGDpq3E).
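A sketch of enabling the Query Store with such settings (a shorter aggregation interval and roughly one week of history; the database name and the concrete values are my assumptions, not a recommendation):

```sql
-- Sketch: enable the Query Store with a 15-minute aggregation interval
-- and about one week of retained history.
ALTER DATABASE [MyDatabase] SET QUERY_STORE = ON;
ALTER DATABASE [MyDatabase] SET QUERY_STORE
(
    OPERATION_MODE = READ_WRITE,
    INTERVAL_LENGTH_MINUTES = 15,
    CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 7)
);
```

Note that INTERVAL_LENGTH_MINUTES only accepts a fixed set of values (1, 5, 10, 15, 30, 60, 1440).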

----------------------------------

I will probably also use the new Temporal Tables in two places, because the customer asked me for a change history for articles and customers ... (Especially since Temporal Tables, if you bear the restrictions in mind, are really very easy to implement and use.)
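Querying such a change history is a single clause on the SELECT; a sketch (table and column names invented):

```sql
-- Sketch: point-in-time view of a temporal table ...
SELECT *
FROM dbo.Article
FOR SYSTEM_TIME AS OF '2018-01-01T00:00:00';

-- ... and the full change history of one row over a period.
SELECT *
FROM dbo.Article
FOR SYSTEM_TIME BETWEEN '2018-01-01' AND '2018-02-01'
WHERE ArticleID = 42;
```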

Convert to Code creates a VBA wrapper for SQL code, which is extremely useful for stored procedures called from VBA.

For Convert to Code I created the custom language "VBA" with the following settings:

Escape character: "
Copy code with less than 60 chars as a single line
Code before: Dim sql As String
             sql = "
Code after: "
Line terminator: " _ & "

-------------------------------------

Two things that are still missing in the Express Edition are automatic compression of backups and a dedicated scheduler. The Windows Task Scheduler is a horror. However, there are PowerShell scripts that make it easier to set up. Information can be found in my SSIS blog.

Regarding compressed backups: as a comparison, I once backed up the same database uncompressed in the Express Edition and compressed in the Dev Edition. The result is rather interesting, I think: AdventureWorks2012 backup, uncompressed: 194,523 KB; 7-Zip of the BAK: 24,866 KB.
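For reference, this is what the compressed backup looks like on an edition that supports it (the target path is an example):

```sql
-- Backup compression works in Standard/Enterprise/Developer, but not in Express;
-- there, an external compressor like 7-Zip on the .bak file is the workaround.
BACKUP DATABASE [AdventureWorks2012]
TO DISK = N'C:\Backup\AdventureWorks2012.bak'
WITH COMPRESSION, INIT;
```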

As a test, yesterday I successfully ran a Python script with the Express Edition. If I manage to get the Launchpad running twice (also for the Dev instance), then it works with the Dev Edition too ... (I have both the Developer and the Express Edition installed on my system.)

The Launchpad service for the instance used must be running, Python and R support must be installed, and external scripts must be enabled ...
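Enabling external scripts and a minimal test can be sketched like this (requires Machine Learning Services and the running Launchpad mentioned above):

```sql
-- Sketch: allow external scripts on this instance ...
EXEC sp_configure 'external scripts enabled', 1;
RECONFIGURE;

-- ... then run a trivial Python script server-side as a smoke test.
EXEC sp_execute_external_script
     @language = N'Python',
     @script   = N'print("Hello from Python inside SQL Server")';
```

On SQL Server 2017 the setting may additionally require a service restart before it takes effect.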

What is generally not well known: if, in the SSMS Object Explorer, you right-click the server node (at the top) and choose Restart, then (after a separate confirmation in an extra window) the SQL Server (or Express) service is restarted without having to open the Configuration Manager separately ... I find it super handy.

New in version SSMS 17.4 (right-click on a database, under Tasks) is the so-called "Vulnerability Assessment", which generates a vulnerability list. Well, for whoever needs it <shrug> ...

----------

The Import / Export Wizard, however, included since 17.3, is much more practical, especially because, in conjunction with the Data Tools of Visual Studio (2015 or 2017), it enables you to generate a script from the wizard and then process it with VS + SSDT. As you can also run SSIS packages under the Express Edition (as explained in a separate blog post), the whole thing runs really smoothly ...

The idea behind it: using an individual ADO connection string per user in VBA is simpler than passing computer\user as a parameter each time. That way you have the same "computer\user" information even when working not with Trusted_Connection but with "normalized" login strings, without transmitting an explicit user-name parameter each time ... Additionally, it eases debugging a lot, as you know exactly which user it is, etc. ...
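One possible way to realize this (a hypothetical sketch, not necessarily how it is done here): carry "computer\user" in the Application Name attribute of the per-user connection string, so the server can read it back without any extra parameter:

```sql
-- Hypothetical sketch: the VBA client builds a connection string such as
--   ...;Application Name=MyApp|PC01\jdoe;
-- The server side can then recover this information at any time:
SELECT APP_NAME()    AS AppInfo,    -- whatever the client put into Application Name
       SUSER_SNAME() AS LoginName;  -- the (possibly shared) login actually used
```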

Contrary to some statements, it is quite possible to legally manage SSIS packages on a SQL Server Express edition:

You develop a package and deploy it to the SQL Server Express and run the package using a stored procedure.

For SSIS, one generally must distinguish between package development, package deployment, and package execution.

All necessary scripts are stored in the subdirectory SSIS of my OneDrive directory. In a further subdirectory there are other important links on SSIS and PowerShell. As the entire SSIS directory is very small, I recommend downloading the directory as a whole. For easier handling, all scripts have an additional .txt extension ...

SSIS Development

SSIS packages are developed with Visual Studio. Minimum requirements for development are:

Beginning with SQL Server 2012, you can deploy existing SSIS packages in SQL Server to the special catalog database named SSISDB and start the packages from there with your own stored procedures. Minimum requirements for this are:

a) One-time: SQL script "SQL SQL Server set to CLR enabled.sql", because for the following script the server must have CLR enabled = Yes.
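The core of such a script is roughly this (a sketch; the named script may do more):

```sql
-- Sketch: enable CLR integration, which SSISDB and its startup SP require.
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
```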

b) For each new package: a PowerShell script that creates the catalog database on the SQL Server Express and deploys the packages into the catalog database: "PS SSIS Catalog and Project Deployment with PowerShell ___ As_Admin_Execute.ps1" (run as Admin; explanations inline).

c) On larger editions of SQL Server there is a checkbox when you create an SSIS catalog that says "SQL Server startup", which is missing in the Express Edition. Therefore, one-time (SSISDB must already exist): SQL script "SQL SET SQL Server Enable automatic execution of SSIS Startup SP.sql" to set this checkbox to ON via SQL.

Helpful: SQL script "SQL SET XP_CommandShell_Ole Automation Procedures_Start.sql" to allow xp_cmdshell (not strictly necessary).
Helpful: an SQL script that helps you run the SPs.
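What such a helper script roughly does (a sketch; both options are advanced options and carry security implications, so only enable them if you need them):

```sql
-- Sketch: enable xp_cmdshell and Ole Automation Procedures.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
EXEC sp_configure 'Ole Automation Procedures', 1;
RECONFIGURE;
```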

SSIS execution

The packages located in the SSISDB database are executed via stored procedures. Because SQL Server Express does not have its own scheduler, it is recommended to use the Task Scheduler built into Windows for automatic tasks. To schedule things automatically, it is recommended to run PowerShell scripts from the Task Scheduler ... I find a link (on the desktop) to the Task Scheduler helpful:
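Starting a deployed package via the SSISDB stored procedures can be sketched like this (folder, project, and package names are placeholders):

```sql
-- Sketch: create and start an execution for a package deployed to SSISDB.
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name     = N'MyFolder',
     @project_name    = N'MyProject',
     @package_name    = N'MyPackage.dtsx',
     @use32bitruntime = 0,
     @execution_id    = @execution_id OUTPUT;

EXEC SSISDB.catalog.start_execution @execution_id;
```

Wrapped in a small script, this is exactly what the Task Scheduler can fire on a schedule.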