Showing posts with label automation.

01 January 2025

💎🏭SQL Reloaded: Microsoft Fabric's SQL Databases (Part IV: Automatic Tuning Options) [new feature]

Automatic tuning in SQL databases, respectively in Azure SQL databases, is a fully managed performance service that uses built-in intelligence to continuously monitor the queries executed and automatically improve their performance [1]. At least in SQL databases, the target is to fully automate index maintenance by providing automated tuning, verification of performance gains, rollback and self-correction, respectively tuning history. The future sounds promising, though the question is what's already available. 

The documentation references several objects made available already for this feature. sys.database_automatic_tuning_options returns the tuning options available for the database:

-- SQL databases - automatic tuning options
SELECT ATO.name
, ATO.desired_state
, ATO.desired_state_desc
, ATO.actual_state
, ATO.actual_state_desc
, ATO.reason
, ATO.reason_desc
FROM sys.database_automatic_tuning_options ATO
ORDER BY ATO.name
Output:
name | desired_state | desired_state_desc | actual_state | actual_state_desc | reason | reason_desc
CREATE_INDEX | 2 | DEFAULT | 1 | ON | 3 | INHERITED_FROM_SERVER
DROP_INDEX | 2 | DEFAULT | 1 | ON | 3 | INHERITED_FROM_SERVER
FORCE_LAST_GOOD_PLAN | 2 | DEFAULT | 1 | ON | 3 | INHERITED_FROM_SERVER
MAINTAIN_INDEX | 2 | DEFAULT | 0 | OFF | 3 | INHERITED_FROM_SERVER

For further information see the automatic tuning options in the documentation [2], respectively [1] for an overview of the feature. 

There are also a sys.database_automatic_tuning_mode and a sys.database_automatic_tuning_configurations view, though on the SQL database instance only the first has records.

-- SQL databases - automatic tuning mode
SELECT ATM.desired_state
, ATM.desired_state_desc
, ATM.actual_state
, ATM.actual_state_desc
FROM sys.database_automatic_tuning_mode ATM
Output:
desired_state | desired_state_desc | actual_state | actual_state_desc
1 | INHERIT | 3 | AUTO
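
The remaining view, sys.database_automatic_tuning_configurations, can be checked in the same way (a minimal sketch using SELECT *, since the view returned no records on the tested instance and no column list is assumed):

-- SQL databases - automatic tuning configurations (no records on the tested instance)
SELECT *
FROM sys.database_automatic_tuning_configurations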

Attempting to modify the above settings at database level via the ALTER DATABASE command leads to the following error message: 

-- disabling database properties
ALTER DATABASE [AdventureWorks01]
SET AUTOMATIC_TUNING ( FORCE_LAST_GOOD_PLAN = OFF);
Output:
"Msg 16202, Level 16, State 162, Line 1, Keyword or statement option 'SET' is not supported on the 'Microsoft Fabric' platform."

At least for the moment, there seem to be no features available for changing these settings. 

The automatic index tab report from the Performance Dashboard for SQL database shows the history and status of automatically created indexes [5]. See Home toolbar in the Query Editor window >> Performance summary >> Automatic indexes. 

What's not clear is how the database engine balances index coverage against performance. How long does it take until the engine identifies the first missing index scenario and creates the index, respectively between the drop of an unused index and the recurrence of the same scenario that led to the index's creation? Moreover, what happens with index fragmentation? The documentation doesn't seem to provide the answers, which probably only hands-on experience can provide. Even if AI-based features are used in index maintenance, it's still hard to grasp what lies behind the various features. 

Are DBAs comfortable enough to relinquish control over index maintenance? It will be interesting to see their feedback. Probably, more control over what the engine does is needed, as sometimes 2-3 major exceptions are enough to make a solution unfeasible. Usually the devil lies in the details. 

Migrating from SQL Server or Azure SQL to SQL databases requires some degree of reengineering, probably with more effort and redesign for the first scenario, given the functionality gap. 

Taking a look at the indexes already available in the AdventureWorks database, there seem to be no new indexes created since the database's creation (see the definition of the referenced object):

-- sys metadata - index columns
SELECT IND.db_name
, IND.schema_name
, IND.table_name
, IND.index_name
, IND.index_type
, IND.principal_type
, IND.auto_created
FROM meta.vIndexes IND
WHERE IND.schema_name = 'SalesLT'
  AND IND.auto_created = 1
ORDER BY IND.table_name
, IND.index_name
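
For databases without the meta.vIndexes helper view, a similar check can be sketched directly against the catalog views (assuming the auto_created flag documented for sys.indexes in Azure SQL Database behaves the same here):

-- sys metadata - automatically created indexes (sketch without the helper view)
SELECT SCHEMA_NAME(OBJ.schema_id) [schema_name]
, OBJ.name table_name
, IND.name index_name
, IND.type_desc index_type
, IND.auto_created
FROM sys.indexes IND
     JOIN sys.objects OBJ
       ON IND.object_id = OBJ.object_id
WHERE SCHEMA_NAME(OBJ.schema_id) = 'SalesLT'
  AND IND.auto_created = 1
ORDER BY OBJ.name
, IND.name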

Stay tuned! More tests on the way... 

Happy coding! 

Previous Post <<||>> Next Post

References:
[1] Microsoft Learn (2024) Automatic tuning [link]
[2] Microsoft Learn (2024) Performance Dashboard for SQL database in Microsoft Fabric [link]
[3] Microsoft Learn (2024) ALTER DATABASE SET options (Transact-SQL) [link]
[4] Microsoft Learn (2024) Enable automatic tuning in the Azure portal to monitor queries and improve workload performance [link]

01 September 2024

🗄️Data Management: Data Governance (Part I: No Guild of Heroes)

Data Management Series

Data governance appeared as a topic around the 1980s, though it gained popularity in the early 2000s [1]. Twenty years later, organizations still miss the mark, respectively fail to understand and implement it in a consistent manner. As usual, the reasons for failure are multiple, and they vary from misunderstanding what governance is all about to poor implementation of methodologies and inadequate management or leadership. 

Moreover, methodologies tend to idealize the various aspects, though what organizations need is not idealism but pragmatism. For example, data governance is not about heroes and heroism [2], which can give the impression that heroic actions are involved, and that's not the case! Actions for the sake of action don't necessarily lead to change by themselves. Organizations, and especially big ones, are in general good at creating meaningless action without results, especially when people preoccupy themselves, miss or ignore the mark. 

People do talk to each other, though they try to solve their own problems and optimize their own areas without necessarily thinking about the bigger picture. The problem is not necessarily communication or a lack of insight into the business issues; people do communicate and know the issues, even without a business impact assessment. The challenge is usually in convincing the upper management that the effort needs to be consolidated and supported, respectively that the needed resources are made available. 

Probably, one of the issues with data governance is the attempt at creating another structure in the organization focused on quality, which has good chances to fail, and unfortunately does fail. Many issues appear when the structure gains weight and becomes a separate entity instead of being the backbone of the organization. 

As soon as organizations separate data governance from the key users, management and the other important decision-makers in the organization, it takes on a life of its own that has good chances of diverging from the initial construct. Then, organizations need "alignment" and probably other big words to coordinate the effort. Such constructs can work, but they are suboptimal because the forces will always pull in different directions.

Making each manager and the upper management responsible for governance is probably the way to go, though they’ll need the time for it. In theory, this can be achieved when many of the issues are solved at the lower level, when automation and further aspects allow them to supervise things, rather than hiding behind every issue. 

When too much micromanagement is involved, people tend to busy themselves with topics rather than solve the issues they are confronted with. The actual actors need to be empowered to take decisions and optimize their work when needed. Kaizen, the philosophy of continuous improvement, proved that it works when applied correctly. They’ll need the knowledge, skills, time and support to do it, though. One of the dangers is however that this becomes a full-time responsibility, which tends to create a separate entity again.

The challenge for organizations lies probably in the friction between where they are and what they must do to move forward toward the various objectives. Moving in small rapid steps is probably the way to go, though each person must be aware when something doesn’t work as expected and react. That’s probably the most important aspect. 

So, the more functions are created that diverge from the actual organization, the higher the chances for failure. Unfortunately, failure becomes visible only in the later phases, and thus self-awareness, self-control and other similar “qualities” are needed, like small actors that keep the system in check and react whenever needed. Ideally, the employees are the best resources to react whenever something doesn’t work as per design. 

Previous Post <<||>> Next Post 

Resources:
[1] Wikipedia (2023) Data Management [link]
[2] Tiankai Feng (2023) How to Turn Your Data Team Into Governance Heroes [link]


10 December 2023

💫🧮☯ERP: Microsoft Dynamics 365's Invoice Capture (The Good, the Bad and the Ugly)

Enterprise Resource Planning
ERP Systems

At the last meeting of the Microsoft Dynamics Meetup Germany group, about 20 Microsoft experts were invited to present in two minutes their favorite new feature from the Microsoft ecosystem. None of them, though, mentioned Invoice Capture, which I think deserves its place on the list. 

Invoice Capture is a Power Apps-based application deeply integrated with Dynamics 365 which allows the semi-automatic processing of Vendor invoices received over the various channels (Outlook, SharePoint, OneDrive) or via manual upload. The Power App listens on the configured channels, imports the documents as they arrive and uses optical character recognition capabilities to extract the standard textual information needed to create a Vendor invoice record with header, lines and further information. In a first phase the Accounting Clerk classifies, reviews, corrects and transfers the Invoice to Dynamics 365, from where the Invoice follows the standard process, being enriched, posted to the Subledger and further booked to the General Ledger. Of course, several changes were also made in Dynamics 365, especially in what concerns the parametrization and the Invoices' automatic processing.

The Good: Invoice Capture attempts to provide end-to-end invoice automation and, probably with further changes to cover at least the most common scenarios, it will be able to do so. Since it was released as a minimal viable product (MVP), besides bug fixing several features were added - the search for Vendors and their automatic synchronization, the entry of Cost Center and Department Code financial dimensions upfront in the Power App, the support for multiple tax codes and for custom fields, just to mention the most important ones. However, more changes are needed to provide customers more flexibility in automating the process and in handling other complex scenarios. 

Through automation and further features like continuous learning from manual input and previous value retention, Invoice Capture decreases the volume of manual work and shortens the financial cycle time, making the overall process more efficient. Moreover, the invoices are available almost as soon as they come into Dynamics 365, allowing a better overview and thus better spend control. Features like Invoice approval via workflows and external automation features can offer further opportunities for improvement. Last but not least, Invoice Capture allows achieving a paperless AP, supporting organizations' efforts on their road to digital transformation. 

The Bad: It's natural in Software Development to start with an MVP and build upon it, however the gap between the MVP and what customers need involves certain challenges when evaluating and implementing the feature(s). Some hiccups are inherent, as a piece of software needs time to stabilize and mature, however with better transparency and communication about the roadmap the process would have been a better experience. On the other side, Microsoft was quite helpful in the process, welcoming the feedback and integrating it in the plan, and in time even provided more transparency. However, there still seem to be many unknowns, especially in what concerns the integration with old and new features from the roadmap (e.g. e-Invoicing, recurring Vendor invoices).

The truth is that customers have different needs; their processes have degrees of complexity that may go way beyond the features provided by the MVP and subsequent versions. Some customers were happy with the MVP, some had to compromise, while others maybe went for alternatives. 

The Ugly: It's time-consuming to evaluate and implement a new feature, to fill the gaps and find alternatives, especially when the organizational setup is not optimal. However, all these are normal challenges in the life of an ERP consultant.

Despite the current and maybe future challenges, Invoice Capture can become in time an important product on the Microsoft roadmap.

Previous Post <<||>> Next Post

01 February 2021

📦Data Migrations (DM): Quality Assurance (Part II: Quality Acceptance Criteria II)

Data Migration
Data Migrations Series

Auditability

Auditability is the degree to which the solution allows checking the data for their accuracy, or for their quality in general, respectively the degree to which the DM solution and processes can be audited regarding compliance, security and other types of requirements. All these aspects are important in case an external sign-off from an auditor is mandatory. 

Automation

Automation is the degree to which the activities within a DM can be automated. Ideally all the processes or activities should be automated, though other requirements might be impacted negatively. Therefore, one needs to find the right balance between the various requirements. 

Cohesion

Cohesion is the degree to which the tasks performed by the solution, respectively during the migration, are related to each other. Given the dependencies existing between the data, their processing and further project-related activities, DMs imply a high degree of cohesion that needs to be addressed by design. 

Complexity 

Complexity is the degree to which a solution is difficult to understand given the various processing layers and dependencies existing within the data. The complexity of a DM revolves mainly around the data structures and the transformations needed to translate the data between the various data models. 

Compliance 

Compliance is the degree to which a solution conforms to the internal or external regulations that apply. A differentiation should be made between mandatory requirements, recommendations and other requirements.

Consistency 

Consistency is the degree to which data conform to an equivalent set of data; in this case, the entities considered for the DM need to be consistent with each other. A record referenced in any entity of the migration needs to be considered, respectively made available in the target system(s), either by parametrization or migration. 

During each iteration the data need to remain consistent, so they can facilitate the troubleshooting. The data are usually reimported between iterations or during the same iteration, typically to reflect the changes that occurred in the source systems, or for other purposes. 
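
Such consistency checks can be scripted and rerun before each load; here's a minimal sketch with hypothetical staging tables (the names need to be adapted to the actual data model):

-- consistency check (sketch): Sales Orders referencing Customers not available in the target
SELECT SOR.CustomerNumber
, count(*) NumberRecords
FROM stg.SalesOrders SOR
     LEFT JOIN stg.Customers CST
       ON SOR.CustomerNumber = CST.CustomerNumber
WHERE CST.CustomerNumber IS NULL
GROUP BY SOR.CustomerNumber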

Coupling 

Data coupling is the degree to which different processing areas within a DM share the same data, typically a reflection of the dependencies existing between the data. Ideally, the areas should be decoupled as much as possible. 

Extensibility

Extensibility is the degree to which the solution or parts of its logic can be extended to accommodate further requirements. Typically, this involves changes that deviate from the standard functionality. Extensibility positively impacts flexibility.

Flexibility 

Flexibility is the degree to which a solution can handle new requirements or ad-hoc changes to the logic. No matter how good everything was planned there’s always something forgotten or new information is identified. Having the flexibility to change code or data on the fly can make an important difference. 

Integrity 

Integrity is the degree to which a solution prevents changes to the data besides the ones considered by design. Users and processes should not be able to modify the data outside the agreed procedures. This means that the data need to be processed in the agreed sequence. All aspects related to data integrity need to be documented accordingly. 

Interoperability 

Interoperability is the degree to which a solution’s components can exchange data and use the respective data. The various layers of a DM solution must be able to process the data, and this should be possible by design. 

Maintainability

Maintainability is the degree to which a solution can be modified to add minor features, change existing code, correct issues, refactor code, improve performance or address changes in the environment. The data required and the transformation rules are seldom known in advance. The data requirements are finalized during the various iterations, the changes needing to be implemented as the iterations progress. Thus, maintainability is a critical requirement.

Previous Post - Next Post

05 July 2019

💻IT: Automation (Definitions)

"The act of replacing control of a manual process with computer or electronic controls." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

[soft automation:] "automation that is configurable through software without requiring changes to the underlying code of the software itself." (Meredith Zozus, "The Data Book: Collection and Management of Research Data", 2017)

[hard automation:] "automation that requires computer programming to be altered if changes are required." (Meredith Zozus, "The Data Book: Collection and Management of Research Data", 2017)

[Decision Automation:] "This broad term refers to computerized systems that make decisions and have some capability to independently act upon them. Decision automation refers to using technologies, including computer processing, to make decisions and implement programmed decision processes." (Ciara Heavin & Daniel J Power, "Decision Support, Analytics, and Business Intelligence" 3rd Ed., 2017)

"Automation is machine-controlled execution of actions, based on artificial intelligence and machine learning that do not require human intervention. It enables speed to action to help reduce time taken by human operators." (Heru Susanto et al, "Data Security for Connected Governments and Organisations: Managing Automation and Artificial Intelligence", 2021)

"refers to the technology where procedures or processes are performed with minimal human intervention. Machines can be configured based on an explicit set of rules or algorithms." (Accenture)

"performing all or part of a set of tasks with a machine rather than through human effort (NRC 1998)

[Intelligent Automation:] "refers to an automation solution that is enhanced with cognitive capabilities that enable programs and machines to learn, interpret and respond." (Accenture)

12 July 2010

💎SQL Reloaded: CRUD Stored Procedures from Metadata I (Insert & Update)

    In the Saving Data With Stored Procedures post I showed my favorite technique of creating a new record, respectively updating an existing record, by using a stored procedure, while in the Creating a Stored Procedure from Table’s Metadata post I showed how a macro could be created in a semiautomatic manner in order to create the same type of stored procedure using a table’s metadata. The Excel approach, even if it decreases the creation time for a stored procedure to only a few minutes, has the disadvantage that the metadata has to be copied into Excel manually for each table (when manual work is involved in the automatic generation of code we deal with semiautomatic methods). Last weekend, after I had to use the macro on Friday at work to create such a stored procedure, I was thinking about whether it's possible to achieve the same using only T-SQL code. The difficulty of such a task resides primarily in creating the repeating code for inserting into and updating the tables, and for creating the stored procedure parameters, something that could be achieved with a cursor or a common table expression. On a secondary place could be put the handling of special data types like uniqueidentifier, xml, date/time or even bit. The good news is that not all data types are used in data modeling, at least not in the same proportions. In practice, when attempting to automate tasks, treating all the possible cases could equate with quite an effort, which maybe doesn’t pay off, therefore it's important to find the balance between effort and utility (sometimes automating 60-70% of the work is quite an achievement, while other times 90-100% could be achieved). Here’s the view I created for this task.

-- prepares metadata for CRUD stored procedures 
CREATE VIEW dbo.vCRUDMetadata 
AS 
WITH CTE([schema_name], Table_Name, id, id_data_type, Column_id, [parameters], [values], [attributes], [update]) 
AS 
( SELECT A.[schema_name] 
, A.Table_Name 
, A.Column_Name id 
, A.system_type id_data_type 
, A.Column_id  
, Cast(' @' + A.Column_Name + ' ' + system_type  
+ CASE  
     WHEN A.system_type IN ('char', 'nchar' , 'varchar', 'nvarchar') THEN ' (' + CASE A.max_length WHEN -1 THEN 'max' ELSE Cast(A.max_length AS varchar(10)) END + ')'  
      ELSE '' 
END  
+ ' output' as nvarchar(max)) [parameters] 
, Cast(CASE  
WHEN A.system_type = 'bit' THEN ' IsNull(@' + A.Column_Name + ', 0)'  
WHEN A.system_type = 'uniqueidentifier' THEN ' IsNull(@' + A.Column_Name + ', NEWID())' 
ELSE '@' + A.Column_Name  
END as nvarchar(max)) [values] 
, Cast(A.Column_Name as nvarchar(max)) [attributes] 
, Cast('' as nvarchar(max)) [update] 
FROM dbo.vTableColumns A 
WHERE A.column_id = 1 
UNION ALL 
SELECT A.[schema_name] 
, A.Table_Name 
, B.id 
, B.id_data_type 
, A.column_id 
, B.[parameters] + CHAR(13)+ char(10)  
+ ', @' + A.Column_Name + ' ' + A.system_type  
+ CASE  
WHEN A.system_type IN ('char', 'nchar' , 'varchar', 'nvarchar') THEN ' (' + CASE A.max_length WHEN -1 THEN 'max' ELSE Cast(A.max_length AS varchar(10)) END + ')'  
ELSE '' 
END [parameters] 
, B.[values] + CHAR(13)+ char(10)  
+ ', ' + CASE  
WHEN A.system_type = 'bit' THEN ' IsNull(@' + A.Column_Name + ', 0)'  
WHEN A.system_type = 'uniqueidentifier' THEN ' IsNull(@' + A.Column_Name + ', NEWID())' 
ELSE '@' + A.Column_Name  
END [values] 
, B.[attributes] + CHAR(13)+ char(10) + ', ' + A.Column_Name [attributes] 
, B.[update] + CHAR(13)+ char(10)  
+ ', ' + A.Column_Name + ' = ' 
+ CASE  
WHEN A.system_type = 'bit' THEN ' IsNull(@' + A.Column_Name + ', 0)'  
WHEN A.system_type = 'uniqueidentifier' THEN ' IsNull(@' + A.Column_Name + ', NEWID())' 
ELSE '@' + A.Column_Name  
END [update] 
FROM dbo.vTableColumns A 
JOIN CTE B 
ON A.column_id - 1 = B.column_id 
AND A.[schema_name] = B.[schema_name] 
AND A.Table_Name = B.Table_Name 
) SELECT A.[schema_name] 
, A.Table_Name 
, A.column_id NumberAttributes 
, A.id 
, A.id_data_type 
, A.[parameters] 
, A.[values] 
, A.[attributes] 
, A.[update] 
FROM CTE A 
WHERE column_id IN ( SELECT MAX(column_id) 
FROM CTE B 
WHERE A.[schema_name] = B.[schema_name] 
AND A.Table_Name = B.Table_Name) 

    The view needs additional work because it treats only the simplest cases met in practice, preferring to keep it (relatively) simple. Even so, the logic might not be so simple to understand in a quick review - it just makes use of CASEs combined with concatenation, building a string for the parameters, update, insert or select.

    Here’s a call to the view for a table:

-- testing the view 
SELECT * 
FROM dbo.vCRUDMetadata 
WHERE [Schema_Name] = 'Person' 
AND Table_Name = 'Address' 

    Now how do we make it useful? I used the same idea as in Excel, the code for creating the Save stored procedure being implemented in the below UDF:

-- Template for Save stored procedure 
CREATE FUNCTION dbo.TemplateSave( 
@SchemaName nvarchar(50) 
, @TableName nvarchar(50) 
, @id nvarchar(50) 
, @id_data_type nvarchar(50) 
, @parameters nvarchar(max) 
, @values nvarchar(max) 
, @attributes nvarchar(max) 
, @update nvarchar(max) 
) RETURNS nvarchar(max) 
AS BEGIN 
DECLARE @DefaultValue nvarchar(2) 
SET @DefaultValue = CASE 
WHEN @id_data_type IN ('char', 'nchar' , 'varchar', 'nvarchar') THEN '''' 
ELSE '0' 
END 
SET @values = Right(@values, Len(@values) - CharIndex(',', @values)-1) 
SET @attributes = Right(@attributes, Len(@attributes) - CharIndex(',', @attributes)-1) 
SET @update = Right(@update, Len(@update) - CharIndex(',', @update)-1) 
RETURN( 'CREATE PROCEDURE ' + @SchemaName + '.pSave' + @TableName + '(' + CHAR(13)+ char(10) + @parameters + ')' + CHAR(13)+ char(10) +  
     'AS' + CHAR(13)+ char(10) +  
     'BEGIN' + CHAR(13)+ char(10) +  
     'BEGIN TRY' + CHAR(13)+ char(10) +  
     ' IF ISNULL(@' + @ID + ', ' + @DefaultValue + ') = ' + @DefaultValue + CHAR(13)+ char(10) +  
     ' BEGIN' + CHAR(13)+ char(10) +  
     ' -- insert statement' + CHAR(13)+ char(10) +  
     ' INSERT INTO ' + @SchemaName + '.' + @TableName + ' (' + @attributes + ')' + CHAR(13)+ char(10) +  
     ' VALUES (' + @values + ')' + CHAR(13)+ char(10) +  
     ' SET @' + @ID + ' = @@IDENTITY' + CHAR(13)+ char(10) +  
     ' END' + CHAR(13)+ char(10) + 
     ' ELSE' + CHAR(13)+ char(10) + 
     ' BEGIN' + CHAR(13)+ char(10) + 
     ' -- update statement' + CHAR(13)+ char(10) +  
     ' UPDATE ' + @SchemaName + '.' + @TableName + CHAR(13)+ char(10) +  
     ' SET ' + @update + CHAR(13)+ char(10) + 
     ' WHERE ' + @ID + '= @' + @ID + CHAR(13)+ char(10) +  
     ' END' + CHAR(13)+ char(10) + 
    ' SELECT @@Rowcount' + CHAR(13)+ char(10) +  
     'END TRY' + CHAR(13)+ char(10) +  
     'BEGIN CATCH' + CHAR(13)+ char(10) +  
    ' SELECT Cast(ERROR_NUMBER() as varchar(10)) + '':'' + ERROR_MESSAGE()' + CHAR(13)+ char(10) +  
    'END CATCH' + CHAR(13)+ char(10) +  
    'END' ) 
END 
 
    The stored procedure template is slightly modified from the one presented in the previous posts, this time preferring to return the ID in the same parameter used as input, using the select to return the number of records affected. Here’s the test script which creates the stored procedure and its output:

SELECT dbo.TemplateSave( 
[Schema_Name] , Table_Name  
, id  
, id_data_type  
, [parameters]  
, [values]  
, [attributes]  
, [update]) CodeSP 
FROM dbo.vCRUDMetadata 
WHERE [Schema_Name] = 'Person' 
AND Table_Name = 'Address' 

CREATE PROCEDURE Person.pSaveAddress( 
@AddressID int output 
, @AddressLine1 nvarchar (120) 
, @AddressLine2 nvarchar (120) 
, @City nvarchar (60) 
, @StateProvinceID int 
, @PostalCode nvarchar (30) 
, @rowguid uniqueidentifier 
, @ModifiedDate datetime) 
AS 
BEGIN 
BEGIN TRY 
IF ISNULL(@AddressID, 0) = 0 
BEGIN 
-- insert statement 
     INSERT INTO Person.Address ( AddressLine1 
     , AddressLine2 
     , City 
     , StateProvinceID 
     , PostalCode 
     , rowguid 
     , ModifiedDate) 
    VALUES ( @AddressLine1 
     , @AddressLine2 
     , @City 
     , @StateProvinceID 
     , @PostalCode 
     , IsNull(@rowguid, NEWID()) 
     , @ModifiedDate) 
    SET @AddressID = @@IDENTITY 
END 
ELSE 
BEGIN 
     -- update statement 
    UPDATE Person.Address 
    SET AddressLine1 = @AddressLine1 
    , AddressLine2 = @AddressLine2 
    , City = @City 
    , StateProvinceID = @StateProvinceID 
    , PostalCode = @PostalCode 
    , rowguid = IsNull(@rowguid, NEWID()) 
    , ModifiedDate = @ModifiedDate 
     WHERE AddressID= @AddressID 
END 
SELECT @@Rowcount 
END TRY BEGIN CATCH 
     SELECT Cast(ERROR_NUMBER() as varchar(10)) + ':' + ERROR_MESSAGE() 
END CATCH 
END 
 
  The resulting code could be exported to an Excel or text file and then run in Management Studio. And here are two calls to it, one for update, and the other one for insert:

EXEC Person.pSaveAddress 1, 'AddressLine1', 'AddressLine2', 'City', 1 , 'PostalCode', NULL, '20090101' 
EXEC Person.pSaveAddress 0, 'AddressLine100', 'AddressLine2', 'City', 1 , 'PostalCode', '6D1EB969-C5CF-420C-BA5F-1153F4D89373', '20090101' 

    The difference from the Excel version is that it can easily be run for a set of tables, and then the stored procedures can be created one by one. A higher level of automation could be achieved by creating a cursor or an SSIS package, and then creating each stored procedure with EXEC or the sp_executesql stored procedure, as in the sketch below. The template could be modified according to requirements as needed. Eventually the code from the template could be included in a CLR function and thus take advantage of richer string functionality, but it then becomes more difficult to modify.
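
    Here's a sketch of such a cursor-based generation (assuming the view and the template function above exist; error handling and logging are ignored for simplicity):

-- creating the Save stored procedures for a set of tables (sketch)
DECLARE @SQL nvarchar(max)

DECLARE TableCursor CURSOR FOR
SELECT dbo.TemplateSave([Schema_Name], Table_Name, id, id_data_type, [parameters], [values], [attributes], [update])
FROM dbo.vCRUDMetadata
WHERE [Schema_Name] = 'Person'

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @SQL
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_executesql @SQL -- creates the stored procedure generated from the template
    FETCH NEXT FROM TableCursor INTO @SQL
END
CLOSE TableCursor
DEALLOCATE TableCursor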

09 March 2009

🛢DBMS: Trigger (Definitions)

"A special form of stored procedure that goes into effect when a user gives a change command such as insert, delete, or update to a specified table or column. Triggers are often used to enforce referential integrity." (Karen Paulsell et al, "Sybase SQL Server: Performance and Tuning Guide", 1996)

"A special form of stored procedure that goes into effect when data within a table is modified. Triggers are often created to enforce integrity or consistency among logically related data in different tables." (Patrick Dalton, "Microsoft SQL Server Black Book", 1997)

"A special type of stored procedure that is set off by actions taken on a table. Triggers allow for complex relationships between tables and complex business rules to be checked automatically." (Owen Williams, "MCSE TestPrep: SQL Server 6.5 Design and Implementation", 1998)

"A stored procedure that executes automatically when data in a specified table is modified. Triggers are often created to enforce referential integrity or consistency among logically related data in different tables." (Microsoft Corporation, "SQL Server 7.0 System Administration Training Kit", 1999)

"Code stored in the database that executes automatically when certain events occur. Traditionally associated only with table write events such as INSERT, UPDATE, or DELETE, newer versions of Oracle provide the ability to define triggers on views and on other system events such as logon, logoff, and system error." (Bill Pribyl & Steven Feuerstein, "Learning Oracle PL/SQL", 2001)

"A stored procedure that executes when data in a specified table is modified. Triggers are often created to enforce referential integrity or consistency among logically related data in different tables." (Anthony Sequeira & Brian Alderman, "The SQL Server 2000 Book", 2003)

"A trigger is a stored procedure that is fired when data is modified from a table using any of the three modification statements: DELETE, INSERT, or UPDATE. FOR and AFTER are synonymous, and are usually implied when referring to triggers, rather than INSTEAD OF triggers. Triggers are often created to enforce referential integrity or consistency among logically related data in different tables." (Thomas Moore, "EXAM CRAM™ 2: Designing and Implementing Databases with SQL Server 2000 Enterprise Edition", 2005)

"A chunk of code that executes when a specified event occurs, usually before or after an INSERT, UPDATE, or DELETE command." (Gavin Powell, "Beginning Database Design", 2006)

"A database method that is automatically invoked as the result of Data Manipulation Language (DML) activity within a persistence mechanism." (Pramod J Sadalage & Scott W Ambler, "Refactoring Databases: Evolutionary Database Design", 2006)

"A stored procedure that is fired when data is modified from a table using any of the three modification statements DELETE, INSERT, or UPDATE. FOR and AFTER are synonymous and are usually implied when referring to triggers rather than INSTEAD OF triggers. Triggers are often created to enforce referential integrity or consistency among logically related data in different tables." (Thomas Moore, "MCTS 70-431: Implementing and Maintaining Microsoft SQL Server 2005", 2006)

"A stored procedure that executes when certain conditions occurs such as when a record is created, modified, or deleted. Triggers can perform special actions such as creating other records or validating changes." (Rod Stephens, "Beginning Database Design Solutions", 2008)

"A type of stored procedure that fires in response to action on a table. DML triggers are associated with INSERT, UPDATE, and DELETE statements. DDL triggers are associated with CREATE, ALTER, and DROP statements." (Darril Gibson, "MCITP SQL Server 2005 Database Developer All-in-One Exam Guide", 2008)

"Stored in, and managed by, your database server, this software is executed when a certain event occurs. These events can range from information creation or modification to structural changes to your database. When the event occurs, the trigger is executed, causing a pre-determined set of actions to take place. These actions can encompass data validation, alerts, warnings, and other administrative operations. Triggers can invoke other triggers and stored procedures." (Robert D. Schneider and Darril Gibson, "Microsoft SQL Server 2008 All-In-One Desk Reference For Dummies", 2008)

"A stored procedure that executes in response to a Data Manipulation Language (DML) or Data Definition Language (DDL) event." (Jim Joseph, "Microsoft SQL Server 2008 Reporting Services Unleashed", 2009)

"A SQL program module that is executed when a specific data modification activity occurs. Triggers are stored in the database they manipulate." (Jan L Harrington, "SQL Clearly Explained" 3rd Ed., 2010)

"A stored procedure that can be triggered and executed automatically when a database operation such as insert, update, or delete takes place." (Paulraj Ponniah, "Data Warehousing Fundamentals for IT Professionals", 2010)

"A procedural SQL code that is automatically invoked by the relational database management system upon the occurrence of a data manipulation event." (Carlos Coronel et al, "Database Systems: Design, Implementation, and Management" 9th Ed., 2011)

"A software routine guaranteed to execute when an event occurs. Often a trigger will monitor changes to data values. A trigger includes a monitoring procedure, a set or range of values to check data integrity, and one or more procedures invoked in response, which may update other data or fulfill a data subscription." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

"An event that causes a handler to be run." (Jon Orwant et al, "Programming Perl, 4th Ed.", 2012)

"An event-driven specialized procedure that is attached to database tables; typically implemented to support data integrity requirements." (Craig S Mullins, "Database Administration", 2012)

"A database object that is associated with a single base table or view and that defines a rule. The rule consists of a set of SQL statements that runs when an insert, update, or delete database operation occurs on the associated base table or view." (IBM, "Informix Servers 12.1", 2014)

"A database object that is associated with a single base table or view and that defines a rule. The rule consists of a set of SQL statements that runs when an insert, update, or delete database operation occurs on the associated base table or view." (Sybase, "Open Server Server-Library/C Reference Manual", 2019)

 "A PL/SQL or Java procedure that fires when a table or view is modified or when specific user or database actions occur. Procedures are explicitly run, whereas triggers are implicitly run." (Oracle, "Oracle Database Concepts")

"A stored procedure that executes in response to a data manipulation language (DML) or data definition language (DDL) event." (Microsoft Technet,)

29 December 2005

IT: Automation (Just the Quotes)

"Systems engineering embraces every scientific and technical concept known, including economics, management, operations, maintenance, etc. It is the job of integrating an entire problem or problem to arrive at one overall answer, and the breaking down of this answer into defined units which are selected to function compatibly to achieve the specified objectives. [...] Instrument and control engineering is but one aspect of systems engineering - a vitally important and highly publicized aspect, because the ability to create automatic controls within overall systems has made it possible to achieve objectives never before attainable, While automatic controls are vital to systems which are to be controlled, every aspect of a system is essential. Systems engineering is unbiased, it demands only what is logically required. Control engineers have been the leaders in pulling together a systems approach in the various technologies." (Instrumentation Technology, 1957)

"As the decision-making function becomes more highly automated, corporate decision making will perhaps provide fewer outlets for creative drives than it now does." (Herbert A Simon," Management and Corporations 1985", 1960)

"Objectives recorded on the System Specification work sheet, even though preliminary in nature, should be specific. It is never sufficient to state an objective in terms of simply improving an existing system or of implementing a computerized system. The idea that a system or an 'automated' system is a better system has been a popular concept too long. An improved system, per se, is of no benefit to a business client; implementing a better system in order to increase profits or reduce costs is of great benefit." (Robert D Carlsen & James A Lewis, "The Systems Analysis Workbook: A complete guide to project implementation and control", 1973)

"Autonomation [automation with a human touch] changes the meaning of management as well. An operator is not needed while the machine is working normally. Only when the machine stops because of an abnormal situation does it get human attention. As a result, one worker can attend several machines, making it possible to reduce the number of operators and increase production efficiency. [...] Implementing autonomation is up to the managers and supervisors of each production area. The key is to give human intelligence to the machine and, at the same time, to adapt the simple movement of the human operator to the autonomous machines." (Taiichi Ohno, "Toyota Production System: Beyond Large-Scale Production", 1978)

"Autonomation [..] performs a dual role. It eliminates overproduction, an important waste in manufacturing, and prevents the production of defective products. To accomplish this, standard work procedures, corresponding to each player's ability, must be adhered to at all times." (Taiichi Ohno, "Toyota Production System: Beyond Large-Scale Production", 1978)

"When you automate an industry you modernize it; when you automate a life you primitivize it." (Eric Hoffer, "Between the Devil and the Dragon", 1982)

"Automation is certainly one way to improve the leverage of all types of work. Having machines to help them, human beings can create more output." (Andrew S. Grove, "High Output Management", 1983)

"Information engineering has been defined with the reference to automated techniques as follows: An interlocking set of automated techniques in which enterprise models, data models and process models are built up in a comprehensive knowledge-base and are used to create and maintain data-processing systems." (James Martin, "Information Engineering, 1989)

"At the heart of reengineering is the notion of discontinuous thinking - of recognizing and breaking away from the outdated rules and fundamental assumptions that underlie operations. Unless we change these rules, we are merely rearranging the deck chairs on the Titanic. We cannot achieve breakthroughs in performance by cutting fat or automating existing processes. Rather, we must challenge old assumptions and shed the old rules that made the business underperform in the first place." (Michael M Hammer, "Reengineering Work: Don't Automate, Obliterate", Magazine, 1990)

"In short, a reengineering effort strives for dramatic levels of improvement. It must break away from conventional wisdom and the constraints of organizational boundaries and should be broad and cross-functional in scope. It should use information technology not to automate an existing process but to enable a new one." (Michael M Hammer, "Reengineering Work: Don't Automate, Obliterate", Magazine, 1990)

"When a database is computerized, it represents the automation of the knowledge component of a business, which is manifest through the business's quality operation, planning, and management With a successful database, the managers of a business can research the past, organize the present, and plan for the future."  (Michael M Gorman, "Database Management Systems: Understanding and Applying Database Technology", 1991)

"Replacing workers on their present jobs with machines is not the major function of automation. Its greater promise is its ability to do new things, to create new products, new services and new jobs, and to meet the increasing requirements of a growing population." (David Sarnoff)

"The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency." (Bill Gates)

"[...] the more efficient the automated system is, the more essential the human contribution that is needed to run the automation system. Humans are less involved in heavily automated systems, but their involvement becomes more critical." (Josh Kaufman)

"The more reliable the plant, the less opportunity there will be for the operator to practice direct intervention, and the more difficult will be the demands of the remaining tasks requiring operator intervention."(Josh Kaufman)

