
26 February 2025

💎🤖SQL Reloaded: Copilot Stories (Part II: Create a Schema for Customer Addresses)

Since the first post on a related topic, Copilot has become available in Microsoft 365, which opens the door for prompt engineering experiments. A prompt is a "natural language instruction that tells a large language model (LLM) to perform a task" [1], a process also known as instruction tuning. Prompt engineering is thus "the process of creating and refining the prompt used by the model" [1].

For example, one can use prompt engineering to learn how to perform certain tasks in a programming language like SQL: defining a table, writing a query, creating other database objects, etc. In a first experiment I created a set of prompts for building the objects needed to store data and perform CRUD (Create, Read, Update, Delete) operations on them. The experiment focused on Customer Addresses, Countries and Regions. Let's look at the prompts and their outputs. The prompts below depict the attempts in the order in which they were considered, so feel free to experiment!

Before Copilot, I used macros in Excel or stored procedures to generate similar code, which allowed me to considerably reduce the development time.

Prompt:
Give me the definition of a SQL Server stored procedure for updating the Customers from the dbo schema based on input parameters. 

Output:

CREATE PROCEDURE UpdateCustomer
    @CustomerID INT,
    @CustomerName NVARCHAR(100),
    @ContactName NVARCHAR(100),
    @Country NVARCHAR(50)
AS
BEGIN
    UPDATE dbo.Customers
    SET 
        CustomerName = @CustomerName,
        ContactName = @ContactName,
        Country = @Country
    WHERE 
        CustomerID = @CustomerID;
END;

Prompt:
Give me the definition of a SQL Server stored procedure for inserting Customers in the dbo schema based on input parameters.

Output:

CREATE PROCEDURE InsertCustomer
    @CustomerName NVARCHAR(100),
    @ContactName NVARCHAR(100),
    @Country NVARCHAR(50)
AS
BEGIN
    INSERT INTO dbo.Customers (CustomerName, ContactName, Country)
    VALUES (@CustomerName, @ContactName, @Country);
END;

Prompt:
Give me the definition of a SQL Server stored procedure for deleting a Customers from the dbo schema based on Customer Id.

Output:

CREATE PROCEDURE DeleteCustomer
    @CustomerID INT
AS
BEGIN
    DELETE FROM dbo.Customers
    WHERE CustomerID = @CustomerID;
END;

Prompt:
Give me the definition of the SQL Server table used for the above examples.

Output:

CREATE TABLE dbo.Customers (
    CustomerID INT IDENTITY(1,1) PRIMARY KEY,
    CustomerName NVARCHAR(100) NOT NULL,
    ContactName NVARCHAR(100),
    Country NVARCHAR(50)
);

Prompt:
Extend the definition of the previous table with Address information. 

Output:

CREATE TABLE dbo.Customers (
    CustomerID INT IDENTITY(1,1) PRIMARY KEY,
    CustomerName NVARCHAR(100) NOT NULL,
    ContactName NVARCHAR(100),
    Country NVARCHAR(50),
    AddressLine1 NVARCHAR(200),
    AddressLine2 NVARCHAR(200),
    City NVARCHAR(100),
    State NVARCHAR(100),
    PostalCode NVARCHAR(20)
);

Prompt:
Remove the Address information from the above table into a separate Addresses table with the Customer Id as foreign key.

Output:

-- Modified Customers Table:
CREATE TABLE dbo.Customers (
    CustomerID INT IDENTITY(1,1) PRIMARY KEY,
    CustomerName NVARCHAR(100) NOT NULL,
    ContactName NVARCHAR(100),
    Country NVARCHAR(50)
);

-- New Addresses Table:
CREATE TABLE dbo.Addresses (
    AddressID INT IDENTITY(1,1) PRIMARY KEY,
    CustomerID INT,
    AddressLine1 NVARCHAR(200),
    AddressLine2 NVARCHAR(200),
    City NVARCHAR(100),
    State NVARCHAR(100),
    PostalCode NVARCHAR(20),
    FOREIGN KEY (CustomerID) REFERENCES dbo.Customers(CustomerID)
);

Prompt:
Create a Countries table that can be referenced from the Addresses table with Country ISO codes.

Output:

--Create the Countries Table:
CREATE TABLE dbo.Countries (
    CountryCode CHAR(2) PRIMARY KEY, -- ISO 3166-1 alpha-2 code
    CountryName NVARCHAR(100) NOT NULL
);

--Modify the Addresses Table:
CREATE TABLE dbo.Addresses (
    AddressID INT IDENTITY(1,1) PRIMARY KEY,
    CustomerID INT,
    AddressLine1 NVARCHAR(200),
    AddressLine2 NVARCHAR(200),
    City NVARCHAR(100),
    State NVARCHAR(100),
    PostalCode NVARCHAR(20),
    CountryCode CHAR(2),
    FOREIGN KEY (CustomerID) REFERENCES dbo.Customers(CustomerID),
    FOREIGN KEY (CountryCode) REFERENCES dbo.Countries(CountryCode)
);

Prompt:
Extend the Countries table with ISO 3 characters code and ISO Number.

Output:

CREATE TABLE dbo.Countries (
    CountryCode CHAR(2) PRIMARY KEY, -- ISO 3166-1 alpha-2 code
    CountryName NVARCHAR(100) NOT NULL,
    ISO3Code CHAR(3), -- ISO 3166-1 alpha-3 code
    ISONumericCode INT -- ISO 3166-1 numeric code
);

Prompt:
Create a Regions table with Region code and description that allows to group the countries in territories. A country can be assigned to multiple regions.

Output:

-- Create the Regions Table:
CREATE TABLE dbo.Regions (
    RegionCode CHAR(3) PRIMARY KEY,
    RegionDescription NVARCHAR(100) NOT NULL
);

-- Create the Countries Table (including the previous columns):
CREATE TABLE dbo.Countries (
    CountryCode CHAR(2) PRIMARY KEY, -- ISO 3166-1 alpha-2 code
    CountryName NVARCHAR(100) NOT NULL,
    ISO3Code CHAR(3), -- ISO 3166-1 alpha-3 code
    ISONumericCode INT -- ISO 3166-1 numeric code
);

-- Create the CountryRegions Junction Table:
CREATE TABLE dbo.CountryRegions (
    CountryCode CHAR(2),
    RegionCode CHAR(3),
    PRIMARY KEY (CountryCode, RegionCode),
    FOREIGN KEY (CountryCode) REFERENCES dbo.Countries(CountryCode),
    FOREIGN KEY (RegionCode) REFERENCES dbo.Regions(RegionCode)
);

Prompt:
Create a view based on Customers that includes all the above tables. 

Output:

CREATE VIEW CustomerDetails AS
SELECT 
    c.CustomerID,
    c.CustomerName,
    c.ContactName,
    a.AddressLine1,
    a.AddressLine2,
    a.City,
    a.State,
    a.PostalCode,
    co.CountryCode,
    co.CountryName,
    co.ISO3Code,
    co.ISONumericCode,
    r.RegionCode,
    r.RegionDescription
FROM 
    dbo.Customers c
LEFT JOIN 
    dbo.Addresses a ON c.CustomerID = a.CustomerID
LEFT JOIN 
    dbo.Countries co ON a.CountryCode = co.CountryCode
LEFT JOIN 
    dbo.CountryRegions cr ON co.CountryCode = cr.CountryCode
LEFT JOIN 
    dbo.Regions r ON cr.RegionCode = r.RegionCode;

Prompt:
In the above view use 3 letter encoding for table's name. 

Output:

CREATE VIEW CustomerDetails AS
SELECT 
    cus.CustomerID,
    cus.CustomerName,
    cus.ContactName,
    add.AddressLine1,
    add.AddressLine2,
    add.City,
    add.State,
    add.PostalCode,
    cou.CountryCode,
    cou.CountryName,
    cou.ISO3Code,
    cou.ISONumericCode,
    reg.RegionCode,
    reg.RegionDescription
FROM 
    dbo.Customers cus
LEFT JOIN 
    dbo.Addresses add ON cus.CustomerID = add.CustomerID
LEFT JOIN 
    dbo.Countries cou ON add.CountryCode = cou.CountryCode
LEFT JOIN 
    dbo.CountryRegions cr ON cou.CountryCode = cr.CountryCode
LEFT JOIN 
    dbo.Regions reg ON cr.RegionCode = reg.RegionCode;

Unfortunately, the definition uses the reserved word "add" as the alias for the Addresses table. Hence the following prompt:

Prompt:
Modify the above view to use 3-letter aliases that aren't as reserved words. 

Output:

CREATE VIEW dbo.CustomerDetails AS
SELECT 
    cus.CustomerID, -- Customer ID
    cus.CustomerName, -- Customer name
    cus.ContactName, -- Contact name
    adr.AddressLine1, -- First line of the address
    adr.AddressLine2, -- Second line of the address
    adr.City, -- City of the address
    adr.State, -- State or region of the address
    adr.PostalCode, -- Postal or ZIP code of the address
    cou.CountryCode, -- Country code
    cou.CountryName, -- Country name
    cou.ISO3Code, -- ISO 3166-1 alpha-3 code
    cou.ISONumericCode, -- ISO 3166-1 numeric code
    reg.RegionCode, -- Region code
    reg.RegionDescription -- Region description
FROM 
    dbo.Customers cus
LEFT JOIN 
    dbo.Addresses adr ON cus.CustomerID = adr.CustomerID
LEFT JOIN 
    dbo.Countries cou ON adr.CountryCode = cou.CountryCode
LEFT JOIN 
    dbo.CountryRegions cr ON cou.CountryCode = cr.CountryCode
LEFT JOIN 
    dbo.Regions reg ON cr.RegionCode = reg.RegionCode;

Prompt:

Modify the schema of the above objects to include the schema name.

Prompt:
Change the definition of the above tables and test whether the tables already exists.
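
The outputs for this and the neighboring prompts aren't reproduced here. For illustration only (an approximation, not Copilot's actual output), such an existence check for one of the tables might look like this:

-- create the Countries table only if it doesn't exist yet
IF OBJECT_ID(N'dbo.Countries', N'U') IS NULL
BEGIN
    CREATE TABLE dbo.Countries (
        CountryCode CHAR(2) PRIMARY KEY, -- ISO 3166-1 alpha-2 code
        CountryName NVARCHAR(100) NOT NULL,
        ISO3Code CHAR(3), -- ISO 3166-1 alpha-3 code
        ISONumericCode INT -- ISO 3166-1 numeric code
    );
END;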

Prompt:
Annotate the definition of the above objects with a comment.

Prompt:
Generate the scripts for inserting 10 customers with different addresses in the above tables.

Prompt:
Aggregate the update and insert commands into the same stored procedure.

Output:

CREATE PROCEDURE dbo.UpsertCustomer
    @CustomerID INT = NULL, -- Optional parameter for insert
    @CustomerName NVARCHAR(100),
    @ContactName NVARCHAR(100),
    @Country NVARCHAR(50)
AS
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.Customers WHERE CustomerID = @CustomerID)
    BEGIN
        -- Update existing customer
        UPDATE dbo.Customers
        SET 
            CustomerName = @CustomerName,
            ContactName = @ContactName,
            Country = @Country
        WHERE 
            CustomerID = @CustomerID;
    END
    ELSE
    BEGIN
        -- Insert new customer
        INSERT INTO dbo.Customers (CustomerName, ContactName, Country)
        VALUES (@CustomerName, @ContactName, @Country);
    END
END;

I used to write similar code in the past to consolidate the logic for inserts and updates within the same stored procedure. This helps reduce the number of objects created in a database and minimizes the impact of some other issues (e.g. code duplication for validation).
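
As a side note, the same upsert logic could also be condensed into a single MERGE statement. A minimal sketch (the procedure name dbo.MergeCustomer is hypothetical, not generated by Copilot):

CREATE OR ALTER PROCEDURE dbo.MergeCustomer -- hypothetical name
    @CustomerID INT = NULL, -- NULL triggers an insert
    @CustomerName NVARCHAR(100),
    @ContactName NVARCHAR(100),
    @Country NVARCHAR(50)
AS
BEGIN
    MERGE dbo.Customers AS TGT
    USING (SELECT @CustomerID AS CustomerID) AS SRC
       ON TGT.CustomerID = SRC.CustomerID
    WHEN MATCHED THEN
        -- update existing customer
        UPDATE SET 
            CustomerName = @CustomerName,
            ContactName = @ContactName,
            Country = @Country
    WHEN NOT MATCHED THEN
        -- insert new customer
        INSERT (CustomerName, ContactName, Country)
        VALUES (@CustomerName, @ContactName, @Country);
END;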

I also prefer placing the comma at the beginning of each line where it applies (and not at the end). This, too, can be addressed with the following prompt:

Prompt: 
Rewrite the above database objects by putting the comma in front of each line where it applies.

Prompt:
Modify the view so each join condition is written on its own line.

Prompt:
Modify the view with create or alter command.

Prompt:
Modify the view to check for null values.
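
The outputs for these formatting prompts aren't reproduced here. As an approximation (not Copilot's actual output), the view might end up looking like this after applying leading commas, one join condition per line, CREATE OR ALTER and IsNull-based null handling:

CREATE OR ALTER VIEW dbo.CustomerDetails 
AS
SELECT cus.CustomerID
, cus.CustomerName
, IsNull(cus.ContactName, '') ContactName
, IsNull(adr.AddressLine1, '') AddressLine1
, IsNull(adr.AddressLine2, '') AddressLine2
, IsNull(adr.City, '') City
, IsNull(adr.State, '') State
, IsNull(adr.PostalCode, '') PostalCode
, IsNull(cou.CountryCode, '') CountryCode
, IsNull(cou.CountryName, '') CountryName
, IsNull(cou.ISO3Code, '') ISO3Code
, cou.ISONumericCode
, IsNull(reg.RegionCode, '') RegionCode
, IsNull(reg.RegionDescription, '') RegionDescription
FROM dbo.Customers cus
     LEFT JOIN dbo.Addresses adr
       ON cus.CustomerID = adr.CustomerID
     LEFT JOIN dbo.Countries cou
       ON adr.CountryCode = cou.CountryCode
     LEFT JOIN dbo.CountryRegions cr
       ON cou.CountryCode = cr.CountryCode
     LEFT JOIN dbo.Regions reg
       ON cr.RegionCode = reg.RegionCode;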

Comments:
1) I could use the scripts also in a SQL database, though one must pay attention to the order in which the steps must be run.
2) All these are basic steps; so, a natural question: how far can we go? One can generate further objects, and knowledge about the data model seems to be available in Copilot, though normal data models are much more complex than this! Probably, there's a limit beyond which Copilot will start to be inefficient as long as the data model is not available explicitly. SQL Server Copilot would probably help overcome such limitations, though the feature is still in limited preview.
3) I wonder whether, given a set of best practices, Copilot would be able to apply them and create the CRUD objects accordingly. More experiments are needed though.
4) It will be interesting to observe how much the prompts generated by other people lead to similar or different outcomes. (In the past, I observed that nontechnical prompts generated different outcomes.)
5) The fact that I was able to change some formatting (e.g. the comma before each line) for all objects with just a prompt frankly made my day! I can't stress enough how many hours of work the unformatted code required in the past! It will be interesting to check whether this can also be applied to a complex database schema.

Happy coding!

Previous Post <<||>> Next Post

References:
[1] Microsoft 365 (2024) Overview of prompts [link]

Resources:
[R1] Microsoft 365 (2025) Microsoft 365 Copilot release notes [link]
[R2] Microsoft 365 (2025) What’s new in Microsoft 365 Copilot | Jan 2025 [link]
[R3] Microsoft 365 (2025) Copilot is now included in Microsoft 365 Personal and Family [link]

Acronyms:
LLM - large language model
CRUD - Create, Read, Update, Delete

23 February 2025

💎🏭SQL Reloaded: Microsoft Fabric's SQL Databases (Part X: Templates for Database Objects)

One of the new features noticed in SQL databases while working on the previous post is the availability of templates. The functionality is useful even if it is kept to a minimum. Probably, more value can be obtained when it is used in combination with Copilot, which requires at least an F12 capacity.

Schemas

Schemas are used to create a logical grouping of objects such as tables, stored procedures, and functions. From a structural and security point of view it makes sense to create additional schemas to manage the various database objects and use the default dbo schema only occasionally (e.g. for globally used objects).

-- generated template - schema
CREATE SCHEMA SchemaName

-- create schema
CREATE SCHEMA Test

One can query sys.schemas to retrieve all the schemas available:

-- retrieve all schemas
SELECT schema_id
, name
, principal_id
FROM sys.schemas
ORDER BY schema_id

Tables

Tables, as the database objects that contain all the data in a database, are probably the elements that need the greatest attention in design and data processing. In some cases a table can be denormalized, storing all the data needed much like in MS Excel, respectively normalized into fact and dimension tables.

Tables can be created explicitly by defining their structure in advance (see Option 1), respectively on the fly (see the Option 2 sketch below).

-- Option 1
-- create the table manually
CREATE TABLE [Test].[Customers](
	[CustomerId] [int] NOT NULL,
	[AddressID] [int] NULL,
	[Title] [nvarchar](8) NULL,
	[FirstName] [nvarchar](50) NULL,
	[LastName] [nvarchar](50) NULL,
	[CompanyName] [nvarchar](128) NULL,
	[SalesPerson] [nvarchar](256) NULL
) ON [PRIMARY]
GO

-- insert records
INSERT INTO Test.Customers (CustomerId, Title, FirstName, LastName, CompanyName, SalesPerson)
SELECT CustomerId
, Title
, FirstName 
, LastName
, CompanyName
, SalesPerson
FROM SalesLT.Customer

-- checking the output (both scenarios)
SELECT top 100 *
FROM Test.Customers
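
For the second option, the table can be created on the fly, its structure being inferred from the source. A minimal sketch, assuming Test.Customers doesn't already exist (the AddressID column from Option 1 is omitted because it isn't available in the source):

-- Option 2
-- create the table on the fly
SELECT CustomerId
, Title
, FirstName 
, LastName
, CompanyName
, SalesPerson
INTO Test.Customers
FROM SalesLT.Customer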

One can query sys.tables to retrieve all the tables available:

-- retrieve all tables
SELECT schema_name(schema_id) schema_name
, object_id
, name
FROM sys.tables
ORDER BY schema_name
, name

Views

Views are much like virtual tables based on the result set of a SQL statement that combines data from one or multiple tables. They can be used to encapsulate logic, respectively to project a subset of the data horizontally or vertically.

-- create view
CREATE OR ALTER VIEW Test.vCustomers
-- Customers 
AS
SELECT CST.CustomerId 
, CST.Title
, CST.FirstName 
, IsNull(CST.MiddleName, '') MiddleName
, CST.LastName 
, CST.CompanyName 
, CST.SalesPerson 
FROM SalesLT.Customer CST

-- test the view 
SELECT *
FROM Test.vCustomers
WHERE CompanyName = 'A Bike Store'

One can query sys.views to retrieve all the views available:

-- retrieve all views
SELECT schema_name(schema_id) schema_name
, object_id
, name
FROM sys.views
ORDER BY schema_name
, name

User-Defined Functions

A user-defined function (UDF) allows encapsulating a SQL expression as a reusable function. It can be used on its own or as part of a query, as in the example below.

-- generated template - user defined function
CREATE FUNCTION [dbo].[FunctionName] (
    @param1 INT,
    @param2 INT
)
RETURNS INT AS BEGIN RETURN
    @param1 + @param2
END

-- user-defined function: 
CREATE OR ALTER FUNCTION Test.GetFirstMiddleLastName (
    @FirstName nvarchar(50),
    @MiddleName nvarchar(50),
    @LastName nvarchar(50)
)
RETURNS nvarchar(150) AS 
BEGIN 
   RETURN IsNull(@FirstName, '') + IsNull(' ' + @MiddleName, '') + IsNull(' ' + @LastName, '') 
END

-- test UDF on single values
SELECT Test.GetFirstMiddleLastName ('Jack', NULL, 'Sparrow')
SELECT Test.GetFirstMiddleLastName ('Jack', 'L.', 'Sparrow')

-- test UDF on a whole table
SELECT TOP 100 Test.GetFirstMiddleLastName (FirstName, MiddleName, LastName)
FROM SalesLT.Customer

One can query sys.objects to retrieve all the scalar functions available:

-- retrieve all scalar functions
SELECT schema_name(schema_id) schema_name
, name
, object_id
FROM sys.objects 
WHERE type_desc = 'SQL_SCALAR_FUNCTION'
ORDER BY schema_name
, name

However, UDFs prove especially useful when they mix the capabilities of functions with those of views, allowing one to create a "parametrized view" (see the next example) or even to encapsulate a multi-statement body that returns a dataset (see the sketch after the inline example). Currently, there seems to be no template available for creating such functions.

-- table-valued function
CREATE OR ALTER FUNCTION Test.tvfGetCustomers (
    @CompanyName nvarchar(50) NULL
)
RETURNS TABLE
-- Customers by Company
AS
RETURN (
	SELECT CST.CustomerId 
	, CST.CompanyName
	, CST.Title
	, IsNull(CST.FirstName, '') + IsNull(' ' + CST.MiddleName, '') + IsNull(' ' + CST.LastName, '') FullName
	, CST.FirstName 
	, CST.MiddleName 
	, CST.LastName 
	FROM SalesLT.Customer CST
	WHERE CST.CompanyName = IsNull(@CompanyName, CST.CompanyName)
);

-- test function for values
SELECT *
FROM Test.tvfGetCustomers ('A Bike Store')
ORDER BY CompanyName
, FullName

-- test function for retrieving all values
SELECT *
FROM Test.tvfGetCustomers (NULL)
ORDER BY CompanyName
, FullName
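
There's likewise no template for the multi-statement case mentioned above. A minimal sketch of such a function (the name Test.mtvfGetCustomers is hypothetical) that fills and returns a table variable:

-- multi-statement table-valued function (hypothetical example)
CREATE OR ALTER FUNCTION Test.mtvfGetCustomers (
    @CompanyName nvarchar(128)
)
RETURNS @Customers TABLE (
    CustomerId int
  , CompanyName nvarchar(128)
  , FullName nvarchar(160)
)
AS
BEGIN
    -- fill the table variable
    INSERT INTO @Customers (CustomerId, CompanyName, FullName)
    SELECT CST.CustomerId
    , CST.CompanyName
    , IsNull(CST.FirstName, '') + IsNull(' ' + CST.MiddleName, '') + IsNull(' ' + CST.LastName, '')
    FROM SalesLT.Customer CST
    WHERE CST.CompanyName = IsNull(@CompanyName, CST.CompanyName);

    RETURN;
END

-- test the function
SELECT *
FROM Test.mtvfGetCustomers ('A Bike Store')

Such functions show up in sys.objects with the type_desc 'SQL_TABLE_VALUED_FUNCTION' and therefore aren't returned by the query below, which targets only the inline ones.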

One can query sys.objects to retrieve all the inline table-valued functions available:

-- retrieve all inline table-valued functions
SELECT schema_name(schema_id) schema_name
, name
, object_id
FROM sys.objects 
WHERE type_desc = 'SQL_INLINE_TABLE_VALUED_FUNCTION'
ORDER BY schema_name
, name

Stored Procedures

A stored procedure is a prepared SQL statement that is stored as a database object and precompiled. Typically, the statements considered in SQL functions can also be created as stored procedures; however, the latter don't allow the output to be reused directly (see the workaround sketched after the example below).

-- get customers by company
CREATE OR ALTER PROCEDURE Test.spGetCustomersByCompany (
    @CompanyName nvarchar(50) NULL
)
AS
BEGIN
	SELECT CST.CustomerId 
	, CST.CompanyName
	, CST.Title
	, IsNull(CST.FirstName, '') + IsNull(' ' + CST.MiddleName, '') + IsNull(' ' + CST.LastName, '') FullName
	, CST.FirstName 
	, CST.MiddleName 
	, CST.LastName 
	FROM SalesLT.Customer CST
	WHERE CST.CompanyName = IsNull(@CompanyName, CST.CompanyName)
	ORDER BY CST.CompanyName
	, FullName
END 

-- test the procedure 
EXEC Test.spGetCustomersByCompany NULL -- all customers
EXEC Test.spGetCustomersByCompany 'A Bike Store' -- individual customer
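
As a side note on the reuse limitation mentioned above, the procedure's output can still be captured into a temporary table via INSERT ... EXEC. A sketch, with the column types assumed to match the SalesLT sample schema:

-- capture the procedure's output for further use
CREATE TABLE #Customers (
    CustomerId int
  , CompanyName nvarchar(128)
  , Title nvarchar(8)
  , FullName nvarchar(160)
  , FirstName nvarchar(50)
  , MiddleName nvarchar(50)
  , LastName nvarchar(50)
)

INSERT INTO #Customers
EXEC Test.spGetCustomersByCompany 'A Bike Store'

-- reuse the captured output
SELECT *
FROM #Customers

-- cleanup
DROP TABLE #Customers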

One can query sys.objects to retrieve all the stored procedures available:

-- retrieve all stored procedures
SELECT schema_name(schema_id) schema_name
, name
, object_id
FROM sys.objects 
WHERE type_desc = 'SQL_STORED_PROCEDURE'
ORDER BY schema_name
, name

In the end, don't forget to drop the objects created above (note the order of the dependencies):

-- drop function 
DROP FUNCTION IF EXISTS Test.GetFirstMiddleLastName

-- drop function 
DROP FUNCTION IF EXISTS Test.tvfGetCustomers 
-- drop procedure
DROP PROCEDURE IF EXISTS Test.spGetCustomersByCompany

-- drop view
DROP VIEW IF EXISTS Test.vCustomers

-- drop schema
DROP SCHEMA IF EXISTS Test

Previous Post <<||>> Next Post

References:
[1] Microsoft Learn (2024) Microsoft Fabric: Overview of Copilot in Fabric [link]

💎🏭SQL Reloaded: Microsoft Fabric's SQL Databases (Part IX: From OLTP to OLAP Data Models)

With SQL databases, Microsoft brought OLTP to Microsoft Fabric, which allows addressing a wider range of requirements, though it also involves some challenges that are usually addressed by the transition from OLTP to OLAP architectures. Typically, an abstraction layer is built on top of the OLTP data models to address the various OLAP requirements. As soon as OLTP and OLAP models are mixed together, the door is open to design and data quality issues that impact the adoption of solutions by users. Those who worked with MS Access or even MS Excel, directly or in combination with SQL Server, can probably still remember the issues they ran into.

Ideally, there should be a separation layer between the OLTP and the OLAP data. This can easily be achieved in SQL databases by using two different schemas that mimic the interaction between the two types of architectures. So, supposing that the SalesLT schema holds the data as maintained by the OLTP layer, one can add an additional schema Test in which the OLAP logic is modelled. This scenario is not ideal, though it allows modelling the two aspects of the topic considered. The following steps are to be performed in the environment in which the SalesLT database was created.

Independently of the layer in which one works, it's recommended to create a set of views that abstract the logic and ideally simplify the processing of data. So, as a first step, one can abstract the data from the source by creating a set of views like the one below:

-- drop view (cleaning)
-- DROP VIEW IF EXISTS SalesLT.vCustomerLocations 

-- create view
CREATE VIEW SalesLT.vCustomerLocations
-- Customers with main office
AS
SELECT CST.CustomerId 
, CSA.AddressID
, CST.Title
, CST.FirstName 
, IsNull(CST.MiddleName, '') MiddleName
, CST.LastName 
, CST.CompanyName 
, CST.SalesPerson 
, IsNull(CSA.AddressType, '') AddressType
, IsNull(ADR.City, '') City
, IsNull(ADR.StateProvince, '') StateProvince
, IsNull(ADR.CountryRegion, '') CountryRegion
, IsNull(ADR.PostalCode, '') PostalCode
FROM SalesLT.Customer CST
	 LEFT JOIN SalesLT.CustomerAddress CSA
	   ON CST.CustomerID = CSA.CustomerID
	  AND CSA.AddressType = 'Main Office'
	 	LEFT JOIN SalesLT.Address ADR
		  ON CSA.AddressID = ADR.AddressID

The view uses LEFT instead of FULL joins because this allows more flexibility, respectively makes it possible to identify the gaps existing between entities (e.g. customers without addresses). In these abstractions, the number of transformations is kept to a minimum so that the data reflect the source. One may choose to minimize the occurrence of NULL values, as this simplifies the logic for comparisons (see the use of IsNull).
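
For example, the gaps mentioned above can be surfaced directly from the view, given that a missing 'Main Office' address results in a NULL AddressID:

-- customers without a main office address
SELECT CustomerId
, CompanyName
, LastName
, FirstName
FROM SalesLT.vCustomerLocations
WHERE AddressID IS NULL
ORDER BY CompanyName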

Once the abstraction over the OLTP layer is built, one can make the data available in the OLAP layer:

-- create schema
CREATE SCHEMA Test

-- dropping the target table (for cleaning)
-- DROP TABLE IF EXISTS Test.CustomerLocations

-- Option 1
-- create the table on the fly
SELECT *
INTO Test.CustomerLocations
FROM SalesLT.vCustomerLocations

-- Option 2
-- create the table manually (alternative to the previous step)
CREATE TABLE [Test].[CustomerLocations](
	[CustomerId] [int] NOT NULL,
	[AddressID] [int] NULL,
	[Title] [nvarchar](8) NULL,
	[FirstName] [nvarchar](50) NULL,
	[MiddleName] [nvarchar](50) NULL,
	[LastName] [nvarchar](50) NULL,
	[CompanyName] [nvarchar](128) NULL,
	[SalesPerson] [nvarchar](256) NULL,
	[AddressType] [nvarchar](50) NULL,
	[City] [nvarchar](30) NULL,
	[StateProvince] [nvarchar](50) NULL,
	[CountryRegion] [nvarchar](50) NULL,
	[PostalCode] [nvarchar](15) NULL
) ON [PRIMARY]
GO

-- insert records
INSERT INTO Test.CustomerLocations
SELECT *
FROM SalesLT.vCustomerLocations


-- checking the output (both scenarios)
SELECT top 100 *
FROM Test.CustomerLocations


-- drop the view (for cleaning)
-- DROP VIEW IF EXISTS Test.vCustomerLocations

-- create view
CREATE VIEW Test.vCustomerLocations
-- Customer locations
AS
SELECT CSL.CustomerId 
, CSL.AddressID
, CSL.Title
, CSL.FirstName 
, CSL.MiddleName 
, CSL.LastName 
, Concat(CSL.FirstName, ' ' + CSL.MiddleName, ' ', CSL.LastName) FullName
, CSL.CompanyName 
, CSL.SalesPerson 
, CSL.AddressType
, CSL.City
, CSL.StateProvince
, CSL.CountryRegion 
, CSL.PostalCode
FROM Test.CustomerLocations CSL

-- test the view
SELECT top 100 *
FROM Test.vCustomerLocations

Further on, one can create additional objects as required. Usually, a set of well-designed views is enough, offering the needed flexibility with a minimum of code duplication. In addition, one can build stored procedures and table-valued functions as needed:

-- drop the function (for cleaning)
-- DROP FUNCTION IF EXISTS Test.tvfGetCustomerAddresses

-- generated template - function
CREATE FUNCTION Test.tvfGetCustomerAddresses (
    @CountryRegion nvarchar(50) NULL,
    @StateProvince nvarchar(50) NULL
)
RETURNS TABLE
-- Customers by Country & State province
AS
RETURN (
SELECT CSL.CustomerId 
, CSL.AddressID
, CSL.Title
, CSL.FirstName 
, CSL.MiddleName 
, CSL.LastName 
, CSL.FullName
, CSL.CompanyName 
, CSL.SalesPerson 
, CSL.AddressType 
, CSL.City
, CSL.StateProvince 
, CSL.CountryRegion 
, CSL.PostalCode
FROM Test.vCustomerLocations CSL
WHERE CSL.CountryRegion = IsNull(@CountryRegion, CSL.CountryRegion)
  AND CSL.StateProvince = IsNull(@StateProvince, CSL.StateProvince)
);

-- retrieving all records
SELECT *
FROM Test.tvfGetCustomerAddresses(NULL, NULL)

-- providing parameters
SELECT *
FROM Test.tvfGetCustomerAddresses('United States', 'Utah')

-- filtering on non-parametrized columns
SELECT *
FROM Test.tvfGetCustomerAddresses('United States', 'Utah')
WHERE City = 'Salt Lake City'



-- drop the procedure (for cleaning)
-- DROP PROCEDURE IF EXISTS Test.spGetCustomerAddresses 

-- generated template - stored procedure
CREATE PROCEDURE Test.spGetCustomerAddresses (
    @CountryRegion nvarchar(50) NULL,
    @StateProvince nvarchar(50) NULL
)
-- Customers by Country & State province
AS
BEGIN
	SELECT CSL.CustomerId 
	, CSL.AddressID
	, CSL.Title
	, CSL.FirstName 
	, CSL.MiddleName 
	, CSL.LastName 
	, CSL.FullName
	, CSL.CompanyName 
	, CSL.SalesPerson 
	, CSL.AddressType 
	, CSL.City
	, CSL.StateProvince 
	, CSL.CountryRegion 
	, CSL.PostalCode
	FROM Test.vCustomerLocations CSL
	WHERE CSL.CountryRegion = IsNull(@CountryRegion, CSL.CountryRegion)
	AND CSL.StateProvince = IsNull(@StateProvince, CSL.StateProvince)
END 

-- retrieving all records
EXEC Test.spGetCustomerAddresses NULL, NULL

-- providing parameters
 EXEC Test.spGetCustomerAddresses 'United States', 'Utah'

These steps can be repeated for each entity in scope.

This separation between OLTP and OLAP is usually necessary given that business processes need a certain amount of time until they are correctly reflected as per the reporting needs. Otherwise, the gaps can negatively impact the quality of the data used for reporting. For some reports these deviations might be acceptable, though there will probably also be (many) exceptions. Independently of the solution used, one still needs to make sure that the data are appropriate for the processes and reporting.

If no physical separation is needed between the two types of layers, one can remove the persisted tables from the logic and keep the objects as they are.

Independently of which architecture is chosen, one shouldn't forget to validate one's assumptions concerning the data model (e.g. customers without addresses, address types, etc.).
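
For instance, two quick checks against the SalesLT source, as a sketch of the kind of validations intended:

-- review the address types available in the source
SELECT AddressType
, count(*) NoRecords
FROM SalesLT.CustomerAddress
GROUP BY AddressType
ORDER BY AddressType

-- customers without any address
SELECT count(*) NoCustomersWithoutAddress
FROM SalesLT.Customer CST
     LEFT JOIN SalesLT.CustomerAddress CSA
       ON CST.CustomerID = CSA.CustomerID
WHERE CSA.CustomerID IS NULL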

Previous Post <<||>> Next Post
