Showing posts with label dependecy. Show all posts

27 March 2026

🤖〽️Prompt Engineering: Copilot Unabridged (Part 150: How AI Could Challenge Confinement - Why Secure Design Matters)

Prompt Engineering Series

Prompt: "write a post of 600 words on how AI can escape confinement even from more secure environment"

Introduction

The idea of 'AI confinement' has become one of the most debated topics in modern AI governance. Researchers use the term to describe attempts to restrict an advanced system’s access to the outside world - limiting its inputs, outputs, and operational environment so it cannot cause unintended consequences. But as AI systems grow more capable, the question becomes: Is perfect confinement even possible? And if not, what does that imply for how we design and deploy them?

The short answer is that confinement is extremely difficult, not because AI systems possess agency or desires, but because humans consistently underestimate the complexity of socio‑technical systems. The challenge is less about AI 'escaping' and more about the porousness of the environments we build.

1. The Human Factor: The Weakest Link in Any Secure System

Even the most secure environments rely on human operators - engineers, researchers, auditors, and administrators. History shows that humans routinely:

  • Misconfigure systems
  • Overestimate their own security controls
  • Underestimate the creativity of adversarial behavior
  • Make exceptions 'just this once' for convenience

In AI safety literature, this is often called the operator‑error problem. A system doesn’t need to be superintelligent to exploit it; it only needs to output something that a human misinterprets, misuses, or overtrusts.

This is why researchers emphasize interpretability, transparency, and robust oversight rather than relying solely on containment.

2. The Communication Problem: Outputs Are Never Neutral

Even if an AI is placed in a highly restricted environment, it still produces outputs. Those outputs can influence human behavior - sometimes in subtle ways.

This is known as the information hazard problem. A system doesn’t need to 'escape' in a literal sense; it only needs to produce information that leads a human to take an unintended action. This could be as simple as:

  • A misleading recommendation
  • A misinterpreted pattern
  • A suggestion that seems harmless but triggers a cascade of errors

This is why modern AI governance focuses on alignment, guardrails, and human‑in‑the‑loop design, not just physical or digital isolation.

3. The Complexity Problem: Secure Environments Are Never Perfect

Even highly secure systems - nuclear facilities, financial networks, aerospace control systems - experience breaches, failures, and unexpected interactions. AI confinement inherits all the same challenges:

  • Hidden dependencies
  • Software vulnerabilities
  • Hardware side channels
  • Supply‑chain risks
  • Integration with legacy systems

The more complex the environment, the more likely it is that a subtle interaction creates an unintended pathway. This is not about AI 'plotting' anything; it’s about the inherent unpredictability of large, interconnected systems.

4. The Socio‑Technical Problem: AI Doesn’t Operate in a Vacuum

AI systems are embedded in human institutions, economic incentives, and political pressures. Even if a system is designed to be confined, organizations may:

  • Expand its access over time
  • Integrate it with other tools
  • Reduce restrictions to improve performance
  • Repurpose it for new tasks

This phenomenon - capability creep - is one of the biggest reasons confinement erodes. The system doesn’t escape; we let it out because we want more from it.

5. The Real Lesson: Confinement Alone Is Not a Safety Strategy

The most important insight from AI safety research is that confinement is not a substitute for:

  • Value alignment
  • Robust evaluation
  • Interpretability
  • Ethical governance
  • Multi‑layered oversight
  • Clear deployment policies

A secure environment is helpful, but it cannot compensate for a system that is poorly aligned or poorly understood.

Final Thought

The idea of AI 'escaping confinement' is less about science fiction and more about the realities of human systems: complexity, incentives, and fallibility. The real challenge is not preventing escape - it’s ensuring that the systems we build behave predictably, transparently, and in alignment with human values, regardless of where they operate.

Disclaimer: The whole text was generated by Copilot (under Windows 11) at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, independently of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.

Previous Post <<||>> Next Post

13 January 2021

💎💫SQL Reloaded: Useful Queries for Dynamics 365 F&O and not only (Part I: Dependencies)

Microsoft Dynamics 365 for Finance and Operations (D365 FO or simply D365) currently has more than 12000 tables, therefore attempting to understand the data model behind it might seem at first sight a foolhardy effort. Fortunately, only a small percentage of all the tables contain meaningful master or transactional data, a fact that makes the task easier.

Moreover, the tables have meaningful prefixes (e.g. 'Invent' for Inventory, 'Purch' for Purchase Orders, 'Sales' for Sales Orders, etc.) and postfixes (e.g. 'Table' for header data, 'Line' for position data, 'Trans' for transaction data), while their names are in many cases self-explanatory if one knows the corresponding business terms.
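These naming conventions can be exploited directly when exploring the schema. As a minimal sketch, assuming access to a non-production database, the tables sharing a given prefix can be listed via the sys.tables catalog view (the 'Purch' prefix is used here only for exemplification):

-- retrieving the tables with a given prefix (e.g. 'Purch')
SELECT SCHEMA_NAME(schema_id) + '.' + name AS table_name
FROM sys.tables
WHERE name LIKE 'Purch%'
ORDER BY table_name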

For those coming from earlier versions of D365 (e.g. Microsoft Dynamics AX 2009), with a few exceptions in the inventory transactions, product master data, addresses and financial dimensions, the data structures have remained almost the same, at least in the main functional areas of the ERP system. However, as new modules and/or functionality were added, the number of new tables increased considerably between the versions.

On the other side, in D365 the data are exposed through data entities, abstractions from the physical implementation of the database tables [1], implemented as views typically having the 'Entity' postfix. In many scenarios the data entities will likely prove to be the best way of accessing the data, as they include an important part of the logic one would need to reengineer anyway; however, there may also be exceptions in which one needs to work with the base tables. In the end one needs to know the main data entities as well as the important tables related to them, and thus to check which views reference a given table, respectively which tables are referenced by a view. This can be easily achieved via the sys.sql_expression_dependencies system catalog view.

The following queries can be run in non-production D365 environments:

-- retrieving the views where an object was used 
SELECT SCHEMA_NAME(obj.SCHEMA_ID) + '.'+ obj.name AS referencing_object
, obj.type_desc referencing_object_type
, sed.referenced_schema_name + '.'+ sed.referenced_entity_name AS referenced_object
, obr.type_desc referenced_object_type
FROM sys.all_objects obj
     JOIN sys.sql_expression_dependencies sed
       ON obj.object_id = sed.referencing_id
     JOIN sys.all_objects obr
       ON sed.referenced_id = obr.object_id
WHERE sed.referenced_entity_name = 'PurchTable'
  AND obj.Type = 'V'
  AND obj.name LIKE 'Purch%'
ORDER BY referencing_object

Output (12 records):
ReferencingObject ReferencingObjectType ReferencedObject ReferencedObjectType
dbo.PURCHASEORDERCUBE VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHASEORDERRESPONSEHEADERSUMMARY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHLINEBACKORDER VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHLINEBACKORDERWORKSPACE VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHLINEEXPANDED VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHORDERAPPROVEDVIEW VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPREPAYOPENBASE VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERCHARGEENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERLINECDSENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHTABLENOTARCHIVEDVERSIONS VIEW dbo.PURCHTABLE USER_TABLE

The inverse search: 

-- retrieving the objects referenced in a view
SELECT SCHEMA_NAME(obj.SCHEMA_ID) + '.'+ obj.name AS referencing_object
, obj.type_desc referencing_object_type
, sed.referenced_schema_name + '.'+ sed.referenced_entity_name AS referenced_object
, obr.type_desc referenced_object_type
FROM sys.all_objects obj
     JOIN sys.sql_expression_dependencies sed
       ON obj.object_id = sed.referencing_id
     JOIN sys.all_objects obr
       ON sed.referenced_id = obr.object_id
WHERE obj.name = 'PURCHPURCHASEORDERHEADERV2ENTITY'
ORDER BY referenced_object

Output:
ReferencingObject ReferencingObjectType ReferencedObject ReferencedObjectType
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.ACCOUNTINGDISTRIBUTIONTEMPLATE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.CONFIRMINGPO USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.DIMENSIONSETENTITY VIEW
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.GetValidFromInContextInfo SQL_SCALAR_FUNCTION
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.GetValidToInContextInfo SQL_SCALAR_FUNCTION
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.HCMWORKER USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.LOGISTICSPOSTALADDRESS USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.LOGISTICSPOSTALADDRESSBASEENTITY VIEW
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.PURCHTABLE_W USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.REASONTABLEREF USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.TMSPURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.TRANSPORTATIONDOCUMENT USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.VENDINVOICEDECLARATION_IS USER_TABLE

Notes:
1. There will be cases in which one needs to check the views' definitions. To understand the logic, the queries need to be formatted in a more readable form. This can take time, given that the joins and the further constraints are not always straightforward. Therefore, it might be a good idea to export a formatted copy of the view for later use.
2. Unfortunately, the data model for D365 is not publicly available. One can however use the data model for AX 2012 (see [2] [3]) as a basis and fill in the gaps when needed.
3. The important changes between the various data model versions are partially documented in several documents available via the Microsoft Dynamics CustomerSource.
4. For non-production environments the data from views, tables or other database objects can be exported directly from the database. Unfortunately, Microsoft has blocked direct access to production environments. The data from tables can be accessed and exported via the table browser, while the data from entities can typically be exported to Excel from the display forms or via the Data Management Framework.
5. The queries can be used to explore any SQL Server database.
6. To minimize the use of redundant code, it's a good idea to encapsulate the logic within a view, when the database allows it.

-- cleaning after
--DROP VIEW admin.vobject_dependencies

-- view definition
CREATE OR ALTER VIEW admin.vobject_dependencies
-- retrieving the dependencies between objects
AS
SELECT SCHEMA_NAME(obj.SCHEMA_ID) + '.'+ obj.name AS referencing_object
, SCHEMA_NAME(obj.SCHEMA_ID) AS referencing_object_schema
, obj.name AS referencing_object_name
, obj.type_desc referencing_object_type
, sed.referenced_schema_name + '.'+ sed.referenced_entity_name AS referenced_object
, sed.referenced_schema_name AS referenced_object_schema
, sed.referenced_entity_name AS referenced_object_name
, obr.type_desc referenced_object_type
FROM sys.all_objects obj
     JOIN sys.sql_expression_dependencies sed
       ON obj.object_id = sed.referencing_id
     JOIN sys.all_objects obr
       ON sed.referenced_id = obr.object_id

-- call based on the referenced object
SELECT *
FROM  admin.vobject_dependencies
WHERE referenced_object_name = 'PurchTable'
ORDER BY referencing_object_name

-- call based on referencing object 
SELECT *
FROM admin.vobject_dependencies
WHERE referencing_object_name = 'PURCHPURCHASEORDERHEADERV2ENTITY'
ORDER BY referenced_object_name
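As mentioned in the first note below, a view's definition often needs to be exported for formatting. A minimal sketch based on the OBJECT_DEFINITION metadata function (the view name serves only as example):

-- retrieving a view's definition for later formatting
SELECT OBJECT_DEFINITION(OBJECT_ID('dbo.PURCHPURCHASEORDERHEADERV2ENTITY')) AS view_definition

Alternatively, the sp_helptext system procedure returns the same definition split into lines.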

Previous Post <<||>> Next Post

References:
[1] Microsoft Dynamics 365 (2020) Data entities overview [source]
[2] Microsoft Docs (2014) Application tables [source]
[3] Microsoft Docs (2014) Base Enums AX 2012 [source]

06 July 2020

🪄SSRS (& Paginated Reports): Design (Part II: Parameter Dependencies in Dropdowns)

Introduction

Quite often there are dependencies between the values of two distinct dropdowns, typically the values representing the different levels of a direct or implied hierarchy. This means that the choice at the upper level determines the values shown at a lower level, the corresponding query needing to handle the value as a parameter. There are several ways to handle this, the choice depending also on the data provider considered for the report.

Because the AdventureWorks database is used within a SQL Server, the available choices are MSOLEDBSQL, SQLOLEDB, ODBC or SQL Server Native Client. It should be noted that the SQLOLEDB and Native Client providers have been deprecated and shouldn't be used for new developments! In this project the Microsoft OLE DB Provider for SQL Server (MSOLEDBSQL) was used.

The reports built previously will be used for exemplification. The hierarchy will be built based on State and Territory: a Country belongs to a Territory, while a State belongs to a Country. For the beginning we will build the dependency existing between the Country and the State.

First Level of Dependency 

The text parameter for the StateProvinces dataset will be defined similarly to the one considered for the Country, and will have as source the following query:

-- State Provinces
SELECT '' Id
, '(All)' Name
UNION ALL
SELECT StateProvinceCode
, StateProvinceCode + ' ' + Name
FROM [Person].[StateProvince]
WHERE CountryRegionCode = ISNULL(NullIf(?, ''), CountryRegionCode)
ORDER BY Id

The Country parameter will be handled in the WHERE constraint - that's the role of the question mark. When an empty string is selected, the equivalent of ignoring the parameter, the formula used results in an identity, which however makes sense only if there's always a value for the Country Code in the Person.StateProvince table. Otherwise the same line needs to be written as follows:

WHERE IsNull(CountryRegionCode, '') = COALESCE(NullIf(?, ''), CountryRegionCode, '')

This might look strange, though it does the trick of handling the nulls. According to the rules of defensive programming, one should treat the eventuality of nulls. However, this depends also on the application's design. If one develops reports with this supposition, then the supposition should be made explicit in the documentation!

Once the query is provided, SSRS recognizes that a parameter was used and requires mapping the question mark to the respective parameter. Within the dataset's properties click on the "Parameters" tab and select @Country as Parameter Value:

Parameters in Dataset Properties

A question mark needs to be considered in the query for each parameter, the parameters from the Dataset Properties being displayed in the order in which they appear in the query.

Once the dataset is defined, one can go on and define the StateProvince parameter similarly to the Country parameter, considering the StateProvinces dataset as source. The StateProvince parameter needs to be added also in the report's header:
= "State Province: " & Parameters!StateProvince.Label

Within the Addresses main dataset one needs to add the declaration for the StateProvince parameter, while in the WHERE clause the corresponding constraint:

-- SQL Parameter
& " DECLARE @StateProvince as nvarchar(3) = '" & Parameters!StateProvince.Value & "'" & vbcrlf 

-- constraint in WHERE clause
& IIf(Parameters!StateProvince.Value<> "", "     AND SIC.StateProvinceCode = @StateProvince", "") & vbcrlf

With these changes the report should be ready to run.

Notes:
The above query for the StateProvince dataset can be used inside a formula with a few small changes. This should work independently of the provider used. Just replace the query with the formula and delete the parameters previously created for the data source via the Dataset Properties:

= " -- State Provinces " & vbCrLf
 & " DECLARE @Country as nvarchar(3) = '" & Parameters!Country.Value & "'" & vbcrlf 
 & " SELECT '' Id " & vbCrLf
 & " , '(All)' Name " & vbCrLf
 & " UNION ALL " & vbCrLf
 & " SELECT StateProvinceCode " & vbCrLf
 & " , StateProvinceCode + ' ' + Name " & vbCrLf
 & " FROM [Person].[StateProvince] " & vbCrLf
 & IIf(Parameters!Country.Value<>"", "     WHERE CountryRegionCode = @Country", "") & vbcrlf 
 & " ORDER BY Id " & vbCrLf

Second Level of Dependency

The second dependency is based on the Territory Group, formed of North America, Pacific and Europe. There is however no Territory Group table, the relations between Countries and Territory Groups being stored in the Sales.SalesTerritory table. Therefore, the query for the corresponding dropdown can be written via a DISTINCT:

-- Territories
SELECT '' Id
, '(All)' Name
UNION ALL
SELECT DISTINCT [Group] Id
, [Group] Name
FROM Sales.SalesTerritory
ORDER BY Id

When the user selects a Territory, the Countries and State Province dropdowns are supposed to show only the values corresponding to the selection. We thus need to modify both queries via formulas.
The query for the Country dataset includes the Territory parameter and the corresponding constraint via an EXISTS:

= "  -- Countries  " & vbCrLf 
& " DECLARE @Territory as nvarchar(50) = '" & Parameters!Territory.Value & "'" & vbcrlf 
& " SELECT '' Id  " & vbCrLf 
& " , '(All)' Name  " & vbCrLf 
& " UNION ALL  " & vbCrLf 
& " SELECT SSP.CountryRegionCode Id  " & vbCrLf 
& " , SSP.Name   " & vbCrLf 
& " FROM [Person].[CountryRegion] SSP " & vbCrLf 
& IIf(Parameters!Territory.Value<> "", "     WHERE EXISTS (SELECT * FROM [Sales].[SalesTerritory] SST WHERE SSP.CountryRegionCode = SST.CountryRegionCode AND SST.[Group] = @Territory)", "") & vbcrlf 
& " ORDER BY Id " & vbCrLf

In exchange the StateProvince dataset is slightly more complex:

= " -- State Provinces " & vbCrLf
 & " DECLARE @Country as nvarchar(3) = '" & Parameters!Country.Value & "'" & vbcrlf 
 & " DECLARE @Territory as nvarchar(50) = '" & Parameters!Territory.Value & "'" & vbcrlf 
 & " SELECT '' Id " & vbCrLf
 & " , '(All)' Name " & vbCrLf
 & " UNION ALL " & vbCrLf
 & " SELECT PSP.StateProvinceCode " & vbCrLf
 & " , PSP.StateProvinceCode + ' ' + PSP.Name " & vbCrLf
 & " FROM [Person].[StateProvince] PSP" & vbCrLf
 & "      JOIN [Sales].[SalesTerritory] PST" & vbCrLf
 & "       ON PSP.TerritoryId = PST.TerritoryId " & vbCrLf
 & " WHERE 0=0 " & vbCrLf
 & IIf(Parameters!Country.Value<> "", "     AND PSP.CountryRegionCode = @Country", "") & vbcrlf 
 & IIf(Parameters!Territory.Value<> "", "     AND PST.[Group] = @Territory", "") & vbcrlf 
 & " ORDER BY Id " & vbCrLf

Once the datasets are changed, one still needs to modify the order of the parameters to reflect the order in which they are called within the dependency, respectively Territory, Country, State Province. In the end the report looks like this:


Notes:
(1) If the order of the parameters doesn't reflect the order in which they are called, you'll get a generic error similar to the following one (the error appears when there are problems with the query):
An error occurred during local report processing. The definition of the report '/Customer Addresses' is invalid. The CommandText expression for the query 'Countries' contains an error: [BC30277] Type character '8e does not match declared data type 'String'.
(2) When designing parameters one should also consider the best practices for parameter design, avoiding the typical pitfalls.

Previous Post <<||>> Next Post

24 February 2018

💎SQL Reloaded: Misusing Views and Pseudo-Constants

Views, as virtual tables, can be misused to replace tables in certain circumstances, e.g. by storing values within one or multiple rows, like in the below examples:

-- parameters for a BI solution
CREATE VIEW dbo.vLoV_Parameters
AS
SELECT Cast('ABC' as nvarchar(20)) AS DataAreaId
 , Cast(GetDate() as Date) AS CurrentDate 
 , Cast(100 as int) AS BatchCount 

GO

SELECT *
FROM dbo.vLoV_Parameters

GO

-- values for a dropdown 
CREATE VIEW dbo.vLoV_DataAreas
AS
SELECT Cast('ABC' as nvarchar(20)) AS DataAreaId
, Cast('Company ABC' as nvarchar(50)) AS Description 
UNION ALL
SELECT 'XYZ' DataAreaId 
, 'Company XYZ'

GO

SELECT *
FROM dbo.vLoV_DataAreas

GO

These solutions aren’t elegant, and typically not recommended because they go against one of the principles of good database design, namely “data belong in tables”, though they do the trick when needed. Personally, I used them only in a handful of cases, e.g. when it wasn’t allowed to create tables, when it was needed testing something for a short period of time, or when there was some overhead of creating a table for 2-3 values. Because of their scarce use, I haven’t given them too much thought, not until I discovered Jared Ko’s blog posting on pseudo-constants. He considers the values from the first view as pseudo-constants, and advocates for their use especially for easier dependency tracking, easier code refactoring, avoiding implicit data conversion and easier maintenance of values.

All these are good reasons to consider them, therefore I tried to take the idea further, to see if it survives a reality check. For this I took Dynamics AX as testing environment, as it makes extensive use of enumerations (aka base enums) to store lists of values needed all over the application. Behind each table there are one or more enumerations, the tables storing master data abounding in them. For exemplification let’s consider InventTrans, the table that stores the inventory transactions, whose receipt and issue logic is governed by three enumerations: StatusIssue, StatusReceipt and Direction.

-- Status Issue Enumeration 
 CREATE VIEW dbo.vLoV_StatusIssue
 AS
 SELECT cast(0 as int) AS None
 , cast(1 as int) AS Sold
 , cast(2 as int) AS Deducted
 , cast(3 as int) AS Picked
 , cast(4 as int) AS ReservPhysical
 , cast(5 as int) AS ReservOrdered
 , cast(6 as int) AS OnOrder
 , cast(7 as int) AS QuotationIssue

GO

-- Status Receipt Enumeration 
 CREATE VIEW dbo.vLoV_StatusReceipt
 AS
SELECT cast(0 as int) AS None
 , cast(1 as int) AS Purchased
 , cast(2 as int) AS Received
 , cast(3 as int) AS Registered
 , cast(4 as int) AS Arrived
 , cast(5 as int) AS Ordered
 , cast(6 as int) AS QuotationReceipt

GO

-- Inventory Direction Enumeration 
 CREATE VIEW dbo.vLoV_InventDirection
 AS
 SELECT cast(0 as int) AS None
 , cast(1 as int) AS Receipt
 , cast(2 as int) AS Issue

To see these views at work let’s construct the InventTrans table on the fly:

-- creating an ad-hoc table  
SELECT *
INTO  dbo.InventTrans
FROM (VALUES (1, 1, 0, 2, -1, 'A0001')
, (2, 1, 0, 2, -10, 'A0002')
, (3, 2, 0, 2, -6, 'A0001')
, (4, 2, 0, 2, -3, 'A0002')
, (5, 3, 0, 2, -2, 'A0001')
, (6, 1, 0, 1, 1, 'A0001')
, (7, 0, 1, 1, 50, 'A0001')
, (8, 0, 2, 1, 100, 'A0002')
, (9, 0, 3, 1, 30, 'A0003')
, (10, 0, 3, 1, 20, 'A0004')
, (11, 0, 1, 2, 10, 'A0001')
) A(TransId, StatusIssue, StatusReceipt, Direction, Qty, ItemId)

 Here are two sets of examples using literals vs. pseudo-constants:

--example issued with literals 
SELECT top 100 ITR.*
FROM dbo.InventTrans ITR
WHERE ITR.StatusIssue = 1 
  AND ITR.Direction = 2

GO
--example issued with pseudo-constants
SELECT top 100 ITR.*
FROM dbo.InventTrans ITR
      JOIN dbo.vLoV_StatusIssue SI
        ON ITR.StatusIssue = SI.Sold
      JOIN dbo.vLoV_InventDirection ID
        ON ITR.Direction = ID.Issue

GO

--example receipt with literals 
SELECT top 100 ITR.*
FROM dbo.InventTrans ITR
WHERE ITR.StatusReceipt= 1
   AND ITR.Direction = 1

GO

--example receipt with pseudo-constants
SELECT top 100 ITR.*
FROM dbo.InventTrans ITR
      JOIN dbo.vLoV_StatusReceipt SR
        ON ITR.StatusReceipt= SR.Purchased
      JOIN dbo.vLoV_InventDirection ID
        ON ITR.Direction = ID.Receipt

As can be seen, the queries using pseudo-constants make the code somewhat more readable, though the gain is only relative, each enumeration implying an additional join. In addition, when further business tables are added to the logic (e.g. items, purchase or sales orders), the logic becomes more complicated, making it more difficult to separate the essential from the nonessential. Imagine a translation of the following query:

-- complex query 
SELECT top 100 ITR.*
FROM dbo.InventTrans ITR
              <several tables here>
WHERE ((ITR.StatusReceipt<=3 AND ITR.Direction = 1)
  OR (ITR.StatusIssue<=3 AND ITR.Direction = 2))
  AND (<more constraints here>)


The more difficult the constraints in the WHERE clause, the more improbable is a translation of the literals into pseudo-constants. Considering that an average query contains 5-10 tables, each of them with 1-3 enumerations, the queries would become impracticable by using pseudo-constants, and their execution plans quite difficult to troubleshoot.

The more I think about it, an enumeration data type as a global variable in SQL Server (like the ones available in VB) would be more than welcome, especially because values are used over and over again throughout the queries. Imagine, for example, the possibility of writing code as follows:

-- hypothetical query
SELECT top 100 ITR.*
FROM dbo.InventTrans ITR
WHERE ITR.StatusReceipt = @@StatusReceipt.Purchased
  AND ITR.Direction = @@InventDirection.Receipt

From my point of view this would make the code more readable and easier to maintain. Instead, in order to make the code more readable, one’s usually forced to add some comments in the code. This works as well, though the code can become full of comments.

-- query with commented literals
SELECT top 100 ITR.*
FROM dbo.InventTrans ITR
WHERE ITR.StatusReceipt <=3 -- Purchased, Received, Registered
   AND ITR.Direction = 1 -- Receipt

In conclusion, pseudo-constants’ usefulness is limited, and their usage goes against developers’ common sense; however, a data type in SQL Server with similar functionality would make the code more readable and easier to maintain.

Notes:
1) It is possible to simulate an enumeration data type in tables’ definition by using a CHECK constraint.
2) The queries work also in SQL databases in Microsoft Fabric (see file in GitHub repository). You might want to use another schema (e.g. Test), not to interfere with the existing code. 
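The CHECK constraint mentioned in the first note can be sketched as follows; the table and constraint names are chosen only for illustration:

-- simulating the Direction enumeration via a CHECK constraint (illustrative names)
CREATE TABLE dbo.InventTransTest (
  TransId int NOT NULL
, Direction int NOT NULL
    CONSTRAINT CK_InventTransTest_Direction CHECK (Direction IN (0, 1, 2)) -- None, Receipt, Issue
)

An INSERT with a Direction value outside the enumerated range will then fail with a constraint violation, which is as close as the table definition gets to an enumeration data type.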

Happy coding!

16 January 2017

⛏️Data Management: Data Flow (Definitions)

"The sequence in which data transfer, use, and transformation are performed during the execution of a computer program." (IEEE, "IEEE Standard Glossary of Software Engineering Terminology", 1990)

"A component of a SQL Server Integration Services package that controls the flow of data within the package." (Marilyn Miller-White et al, "MCITP Administrator: Microsoft® SQL Server™ 2005 Optimization and Maintenance 70-444", 2007)

"Activities of a business process may exchange data during the execution of the process. The data flow graph of the process connects activities that exchange data and - in some notations - may also represent which input/output parameters of the activities are involved." (Cesare Pautasso, "Compiling Business Process Models into Executable Code", 2009)

"Data dependency and data movement between process steps to ensure that required data is available to a process step at execution time." (Christoph Bussler, "B2B and EAI with Business Process Management", 2009)

[logical data flow:] "A data flow diagram that describes the flow of information in an enterprise without regard to any mechanisms that might be required to support that flow." (David C Hay, "Data Model Patterns: A Metadata Map", 2010)

[physical data flow:] "A data flow diagram that identifies and represents data flows and processes in terms of the mechanisms currently used to carry them out." (David C Hay, "Data Model Patterns: A Metadata Map", 2010)

"The fact that data, in the form of a virtual entity class, can be sent from a party, position, external entity, or system process to a party, position, external entity, or system process." (David C Hay, "Data Model Patterns: A Metadata Map", 2010)

"An abstract representation of the sequence and possible changes of the state of data objects, where the state of an object is any of: creation, usage, or destruction [Beizer]." (International Qualifications Board for Business Analysis, "Standard glossary of terms used in Software Engineering", 2011)

"Data flow refers to the movement of data from one purpose to another; also the movement of data through a set of systems, or through a set of transformations within one system; it is a nontechnical description of how data is processed. See also Data Chain." (Laura Sebastian-Coleman, "Measuring Data Quality for Ongoing Improvement ", 2012)

"The movement of data through a group of connected elements that extract, transform, and load data." (Microsoft, "SQL Server 2012 Glossary", 2012)

"A path that carries packets of information of known composition; a roadway for data. Every data flow’s composition is recorded in the data dictionary." (James Robertson et al, "Complete Systems Analysis: The Workbook, the Textbook, the Answers", 2013)

"the path, in information systems or otherwise, through which data move during the active phase of a study." (Meredith Zozus, "The Data Book: Collection and Management of Research Data", 2017)

"The lifecycle movement and storage of data assets along business process networks, including creation and collection from external sources, movement within and between internal business units, and departure through disposal, archiving, or as products or other outputs." (Kevin J Sweeney, "Re-Imagining Data Governance", 2018)

"A graphical model that defines activities that extract data from flat files or relational tables, transform the data, and load it into a data warehouse, data mart, or staging table." (Sybase, "Open Server Server-Library/C Reference Manual", 2019)

"An abstract representation of the sequence and possible changes of the state of data objects, where the state of an object is any of: creation, usage, or destruction." (Software Quality Assurance)

14 August 2011

📈Graphical Representation: Data Flow Diagram (Definitions)

"A diagram that shows the data flows in an organization, including sources of data, where data are stored, and processes that transform data." (Jan L Harrington, "Relational Database Design: Clearly Explained" 2nd Ed., 2002)

"A diagram of the data flow from sources through processes and files to users. A source or user is represented by a square; a data file is represented by rectangles with missing righthand edges; a process is represented by a circle or rounded rectangle; and a data flow is represented by an arrow." (Jens Mende, "Data Flow Diagram Use to Plan Empirical Research Projects", 2009)

"A diagram used in functional analysis which specifies the functions of the system, the inputs/outputs from/to external (user) entities, and the data being retrieved from or updating data stores. There are well-defined rules for specifying correct DFDs, as well as for creating hierarchies of interrelated DFDs." (Peretz Shoval & Judith Kabeli, "Functional and Object-Oriented Methodology for Analysis and Design", 2009)

[Control Data Flow Graph (CDFG):] " Represents the control flow and the data dependencies in a program." (Alexander Dreweke et al, "Text Mining in Program Code", 2009)

"A graphic method for documenting the flow of data within an organization." (Jan L Harrington, "Relational Database Design and Implementation: Clearly explained" 3rd Ed., 2009)

"A graphic representation of the interactions between different processes in an organization in terms of data flow communications among them. This may be a physical data flow diagram that describes processes and flows in terms of the mechanisms involved, a logical data flow diagram that is without any representation of the mechanism, or an essential data flow diagram that is a logical data flow diagram organized in terms of the processes that respond to each external event." (David C Hay, "Data Model Patterns: A Metadata Map", 2010)

"Data-flow diagrams (DFDs) are system models that show a functional perspective where each transformation represents a single function or process. DFDs are used to show how data flows through a sequence of processing steps." (Ian Sommerville, "Software Engineering" 9th Ed., 2011)

"A model of the system that shows the system’s processes, the data that flow between them (hence the name), and the data stores used by the processes. The data flow diagram shows the system as a network of processes, and is thought to be the most easily recognized of all the analysis models." (James Robertson et al, "Complete Systems Analysis: The Workbook, the Textbook, the Answers", 2013)

"A picture of the movement of data between external entities and the processes and data stores within a system." (Jeffrey A Hoffer et al, "Modern Systems Analysis and Design" 7th Ed., 2014)

"A schematic indicating the direction of the movement of data" (Daniel Linstedt & W H Inmon, "Data Architecture: A Primer for the Data Scientist", 2014)

"A Data Flow Diagram (DFD) is a graphical representation of the 'flow' of data through an information system, modeling its process aspects. Often it is a preliminary step used to create an overview of the system that can later be elaborated." (Henrikvon Scheel et al, "Process Concept Evolution", 2015)

"Data flow maps are tools that graphically represent the results of a comprehensive data assessment to illustrate what information comes into an organization, for what purposes that information is used, and who has access to that information." (James R Kalyvas & Michael R Overly, "Big Data: A Business and Legal Guide", 2015)

"A graphical representation of the logical or conceptual movement of data within an existing or planned system." (George Tillmann, "Usage-Driven Database Design: From Logical Data Modeling through Physical Schema Definition", 2017)

"a visual depiction using standard symbols and conventions of the sources of, movement of, operations on, and storage of data." (Meredith Zozus, "The Data Book: Collection and Management of Research Data", 2017)

"A data-flow diagram is a way of representing a flow of data through a process or a system (usually an information system). The DFD also provides information about the outputs and inputs of each entity and the process itself." (Wikipedia) [source]

"A graphical representation of the sequence and possible changes of the state of data objects, where the state of an object is any of: creation, usage, or destruction." (IQBBA)


About Me

Koeln, NRW, Germany
IT Professional with more than 25 years experience in IT in the area of full life-cycle of Web/Desktop/Database Applications Development, Software Engineering, Consultancy, Data Management, Data Quality, Data Migrations, Reporting, ERP implementations & support, Team/Project/IT Management, etc.