
08 June 2025

🤖〽️Prompt Engineering: Copilot Unabridged (Part 15: How AI Can Use Its Intelligence to Damage Society)

Prompt Engineering Series

Prompt: "write a blogpost of 600 words on how AI can use its intelligence to damage society"

Introduction

Artificial Intelligence (AI) has revolutionized industries, enhanced efficiency, and pushed the boundaries of technological innovation. However, like any powerful tool, AI can also be misused - leading to potential harm and disruption across society. Whether through intentional exploitation or unintended consequences, AI has the capability to damage social structures, ethical frameworks, and security systems. Understanding these risks is crucial to ensuring responsible AI development and implementation.

1. AI-Generated Misinformation and Psychological Manipulation

AI-powered algorithms influence social media, news platforms, and online content, shaping public opinion through personalized feeds and recommendations. While this can improve engagement, it also introduces dangerous risks:

  • Deepfake technology allows AI to fabricate realistic videos and audio recordings, leading to false accusations and misinformation.
  • AI-driven bots can amplify propaganda, manipulating elections and social movements.
  • AI algorithms prioritize engagement over accuracy, boosting sensationalist and misleading content.

These AI-driven tactics can erode trust in media, democracy, and critical thinking, causing widespread manipulation of societal beliefs.

2. Mass Surveillance and Privacy Violations

AI plays a major role in government and corporate surveillance, tracking online activity, physical movements, and personal data. While AI-powered security can improve safety, excessive surveillance poses severe privacy risks:

  • AI-powered facial recognition monitors individuals without consent, limiting freedoms.
  • Governments can use AI to track populations, controlling dissent and opposition.
  • AI systems collect massive amounts of personal data, increasing the likelihood of breaches, identity theft, and cyber exploitation.

AI intelligence enables unprecedented monitoring capabilities, leading to a society where privacy becomes obsolete.

3. AI-Driven Automation Causing Economic Displacement

AI enhances productivity, but its growing intelligence also replaces human labor, leading to mass unemployment. Some industries facing job losses due to AI automation include:

  • Manufacturing: AI-powered robotics eliminate human factory workers.
  • Finance: AI automates stock trading, reducing demand for financial analysts.
  • Retail and customer service: AI chatbots replace call center employees and customer support agents.

Without proper economic restructuring, AI-driven displacement could widen income inequality, leading to social unrest and instability.

4. AI in Cybersecurity: A Weapon for Hackers

AI’s intelligence is a double-edged sword in cybersecurity. While AI strengthens cyber defense, it also enables:

  • AI-generated malware that adapts and evolves, evading detection systems.
  • Automated phishing scams that analyze human behavior to craft deceptive emails.
  • AI-powered hacking tools capable of bypassing security measures at unprecedented speeds.

If malicious AI grows too sophisticated, even government and corporate security networks could be compromised, creating financial chaos and identity breaches.

5. Autonomous AI Leading to Unintended Consequences

Advanced AI systems operate independently, optimizing processes based on predefined goals. However, in some cases, AI’s intelligence pursues efficiency at the cost of ethical considerations:

  • AI-driven weapon systems could make autonomous decisions on warfare, escalating conflicts beyond human control.
  • AI optimizing resource allocation may deprioritize human well-being, prioritizing efficiency instead.
  • AI-driven health automation could mismanage treatments, leading to errors in life-saving procedures.

Without human oversight and ethical guidelines, AI’s intelligence may unintentionally cause harm rather than solve problems.

Conclusion: Managing AI’s Intelligence for Society’s Benefit

While AI’s intelligence can damage society, these risks are not inevitable if developers, businesses, and governments take proactive steps to regulate and guide AI’s growth. Ethical AI frameworks, transparency, and responsible integration must be prioritized to ensure AI enhances society rather than undermining it.

AI is not inherently harmful, but uncontrolled development and exploitation could lead to lasting damage. The future depends on how humanity chooses to govern AI’s intelligence - whether as a tool for progress or a force for disruption.

Disclaimer: The whole text was generated by Copilot at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.

Previous Post <<||>> Next Post

20 May 2025

🤖〽️Prompt Engineering: Copilot Unabridged (Part 2: The Importance of AI in Society - A Transformational Force II)

Prompt Engineering Series

Prompt: "write a blogpost of 600 words on the importance of AI in society"

Introduction

Artificial Intelligence (AI) has rapidly evolved from a futuristic dream to an integral part of everyday life. Its presence is felt in industries ranging from healthcare to finance, transportation to entertainment. While some view AI as a tool of convenience, others recognize it as a force capable of reshaping the very fabric of society. As we continue integrating AI into our world, its impact - both promising and challenging - deserves careful consideration.

Revolutionizing Industries

AI’s ability to process vast amounts of data and uncover patterns has revolutionized multiple industries. In healthcare, AI-driven diagnostics assist doctors in identifying diseases like cancer with unparalleled precision. Medical imaging powered by AI can detect abnormalities that might otherwise be overlooked. Moreover, AI applications in drug discovery accelerate research, leading to faster development of treatments.

In finance, AI-powered algorithms analyze market trends, optimize investments, and even detect fraudulent transactions. Companies utilize AI to enhance customer service through chatbots that provide instant responses and personalized recommendations. Meanwhile, AI-driven automation boosts efficiency in manufacturing by streamlining processes and reducing human error.

Transportation is another domain benefiting from AI. Autonomous vehicles, once thought to be purely speculative, are now in active development, promising a future with safer roads and reduced congestion. AI-driven traffic management systems enhance urban mobility, making cities more efficient and eco-friendly.

Enhancing Accessibility and Communication

One of AI’s most profound societal contributions is its role in accessibility. AI-powered speech recognition and text-to-speech tools empower individuals with disabilities, allowing seamless interaction with technology. AI-driven language translation facilitates cross-cultural communication, bridging gaps between people and businesses worldwide.

In education, AI-powered tutoring systems adapt to students’ learning styles, providing personalized guidance and support. AI can analyze student performance, identify areas for improvement, and recommend tailored resources. This transformative approach makes education more inclusive and effective, breaking down traditional barriers.

Addressing Ethical and Societal Concerns

Despite AI’s remarkable benefits, its advancement raises ethical concerns. Bias in AI algorithms is a pressing issue, as data-driven models can inadvertently reflect and perpetuate existing societal inequalities. The responsibility falls on developers to ensure fairness and transparency in AI systems, fostering an equitable digital landscape.

Another concern is the impact of automation on employment. While AI creates new job opportunities in tech-driven industries, it also replaces traditional roles, sparking debates about workforce adaptation. The key lies in preparing society for AI’s integration by promoting education in AI-related fields and encouraging continuous skill development.

Privacy and security also come into focus as AI algorithms collect and analyze vast amounts of personal data. Safeguarding user privacy while leveraging AI’s capabilities requires stringent regulations and ethical AI development.

Embracing a Future Powered by AI

AI’s influence on society is undeniable - it has transformed industries, empowered individuals, and presented new opportunities. However, navigating AI’s complexities requires thoughtful implementation and ethical considerations. As we move forward, collaboration between policymakers, researchers, and the public will be crucial in ensuring AI benefits humanity responsibly and inclusively.

The future of AI is not just about innovation; it is about shaping a world where technology uplifts rather than divides. By fostering awareness, ethical development, and responsible use, we can harness AI as a tool for progress, paving the way for a society that embraces both human intelligence and artificial intelligence in harmony.

Disclaimer: The whole text was generated by Copilot (under Windows 10) at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.

Previous Post <<||>> Next Post

🤖〽️Prompt Engineering: Copilot Unabridged (Part 1: The Importance of AI in Society - A Transformational Force)

Prompt Engineering Series

Prompt: "write a blogpost of 600 words on the importance of AI in society"

Introduction

Artificial Intelligence (AI) has evolved from a futuristic concept to a daily reality that shapes industries, economies, and even personal lives. Once confined to science fiction, AI now drives technological progress and redefines the way we interact with the world. In this blog post, we will explore the significance of AI, how it influences various sectors, and the ethical considerations accompanying its rapid adoption.

AI: The Backbone of Modern Innovation

At its core, AI enables machines to perform tasks that typically require human intelligence - such as learning, reasoning, problem-solving, and decision-making. From automating repetitive tasks to augmenting human capabilities, AI has proven indispensable across multiple domains.

Take healthcare, for example. AI-powered algorithms analyze complex medical data to assist doctors in diagnosing diseases, predicting patient outcomes, and even recommending personalized treatment plans. Innovations like AI-driven robotic surgery and early cancer detection systems save lives and enhance the precision of medical procedures.

In finance, AI plays a crucial role in fraud detection, risk assessment, and automated trading. Financial institutions leverage machine learning models to analyze spending patterns, prevent cyberattacks, and provide tailored investment advice. AI’s ability to process massive amounts of data ensures accuracy and efficiency in financial operations.

Retail and e-commerce have also been transformed. AI-driven recommendation engines predict customer preferences based on purchasing behavior, allowing businesses to offer personalized shopping experiences. Chatbots streamline customer service, while automated inventory management optimizes supply chain efficiency. AI ensures that companies stay competitive in an increasingly digital marketplace.

AI and the Workplace: Automation vs. Augmentation

The integration of AI into the workplace sparks ongoing debates about its impact on jobs. While automation replaces certain manual tasks, it also opens doors for new career opportunities. AI-driven tools enable professionals to focus on creative and strategic work rather than repetitive processes.

For instance, AI-powered analytics help marketers assess consumer sentiment and predict trends with unprecedented accuracy. Similarly, AI assists engineers and researchers in designing smarter infrastructure, accelerating scientific discoveries, and optimizing business operations.

Rather than eliminating jobs, AI redefines them. Organizations must invest in upskilling employees to harness AI effectively, fostering a collaborative environment where human creativity complements machine intelligence.

Ethical Considerations and Responsible AI Development

With great power comes great responsibility. As AI grows more sophisticated, ethical concerns regarding data privacy, bias, and transparency become more pressing.

AI systems rely on large datasets to function, raising concerns about privacy and security. Governments and corporations must establish stringent regulations to protect user information from misuse or exploitation. Additionally, biases in AI models can lead to discriminatory outcomes, necessitating fair and inclusive development practices.

Transparency is another critical factor. Users should understand how AI-driven decisions are made, particularly in sensitive areas like healthcare and criminal justice. Ethical AI implementation requires collaboration between policymakers, technologists, and the public to ensure that advancements benefit society as a whole.

The Future of AI: A Balance Between Progress and Ethics

The importance of AI in society cannot be understated. It enhances productivity, improves decision-making, and solves complex problems across industries. However, its responsible use is paramount to prevent unintended consequences.

As AI continues to evolve, collaboration between academia, businesses, and governments is necessary to establish ethical guidelines and maximize its benefits. The goal should not be to replace human intelligence but to amplify it, fostering innovation while ensuring accountability.

Ultimately, AI is more than just a tool - it is a transformative force shaping the future. Harnessing its potential wisely will determine whether it serves as a steppingstone toward progress or a challenge requiring careful navigation.

Disclaimer: The whole text was generated by Copilot at the first attempt. This is just an experiment to evaluate the feature's ability to answer standard general questions, regardless of whether they are correctly or incorrectly posed. Moreover, the answers may reflect hallucinations and other types of inconsistent or incorrect reasoning.

|>> Next Post

01 January 2025

💎🏭SQL Reloaded: Microsoft Fabric's SQL Databases (Part IV: Automatic Tuning Options) [new feature]

Automatic tuning in SQL databases, respectively in Azure SQL databases, is a fully managed performance service that uses built-in intelligence to continuously monitor the queries executed and automatically improve their performance [1]. At least in SQL databases, the target is to fully automate index maintenance by providing automated tuning, verification of performance gains, rollback and self-correction, respectively a tuning history. The future sounds promising, though the question is what's already available.

The documentation references several objects made available already for this feature. sys.database_automatic_tuning_options returns the tuning options available for the database:

-- SQL databases - automatic tuning options
SELECT ATO.name
, ATO.desired_state
, ATO.desired_state_desc
, ATO.actual_state
, ATO.actual_state_desc
, ATO.reason
, ATO.reason_desc
FROM sys.database_automatic_tuning_options ATO
ORDER BY ATO.name
Output:
name                  desired_state  desired_state_desc  actual_state  actual_state_desc  reason  reason_desc
CREATE_INDEX          2              DEFAULT             1             ON                 3       INHERITED_FROM_SERVER
DROP_INDEX            2              DEFAULT             1             ON                 3       INHERITED_FROM_SERVER
FORCE_LAST_GOOD_PLAN  2              DEFAULT             1             ON                 3       INHERITED_FROM_SERVER
MAINTAIN_INDEX        2              DEFAULT             0             OFF                3       INHERITED_FROM_SERVER

For further information see the automatic tuning options in the documentation [2], respectively [1] for an overview of the feature. 

There are also a sys.database_automatic_tuning_mode and a sys.database_automatic_tuning_configurations, though on the SQL database instance only the first returns records.

-- SQL databases - automatic tuning mode
SELECT ATM.desired_state
, ATM.desired_state_desc
, ATM.actual_state
, ATM.actual_state_desc
FROM sys.database_automatic_tuning_mode ATM
Output:
desired_state  desired_state_desc  actual_state  actual_state_desc
1              INHERIT             3             AUTO
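
For completeness, the second object can be checked in the same manner; on the tested instance the query below returned no records (as a sketch, a SELECT * is used to avoid relying on the object's column list):

-- SQL databases - automatic tuning configurations (no records on the tested instance)
SELECT *
FROM sys.database_automatic_tuning_configurations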

Attempting to modify the above settings at database level via ALTER DATABASE leads to the following error message: 

-- disabling database properties
ALTER DATABASE [AdventureWorks01]
SET AUTOMATIC_TUNING ( FORCE_LAST_GOOD_PLAN = OFF);
Output:
"Msg 16202, Level 16, State 162, Line 1, Keyword or statement option 'SET' is not supported on the 'Microsoft Fabric' platform."
At least for the moment there seem to be no features available for changing these settings. 
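
For comparison, on an Azure SQL database the same option can be changed via the documented ALTER DATABASE syntax [3], e.g.:

-- Azure SQL database - changing an automatic tuning option (fails on Microsoft Fabric, see above)
ALTER DATABASE CURRENT
SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON);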

The Automatic indexes tab from the Performance Dashboard for SQL database shows the history and status of the automatically created indexes [2]. See the Home toolbar in the Query Editor window >> Performance summary >> Automatic indexes. 

What's not clear is how the database engine balances between index coverage and performance. How long does it take until the engine identifies the first missing index scenario and creates the index, respectively between the drop of an unused index and the reoccurrence of the same scenario that led to the index's creation? Moreover, what happens with index fragmentation? The documentation doesn't seem to provide the answers that probably only hands-on experience can provide. Even if AI-based features are used in index maintenance, it's still hard to grasp what lies behind the various features. 

Are DBAs comfortable enough to relinquish control over index maintenance? It will be interesting to see their feedback. Probably more control over what the engine does is needed, as sometimes it's enough to have 2-3 major exceptions for a solution to become unfeasible. Usually the devil lies in the details. 

Migrating from SQL Server or Azure SQL to SQL databases requires some degree of reengineering, probably with more effort and redesign for the first scenario, given the functionality gap. 

Taking a look at the indexes already available in the AdventureWorks database, there seem to be no new indexes created since the database's creation (see the definition of the referenced object):

-- sys metadata - index columns
SELECT IND.db_name
, IND.schema_name
, IND.table_name
, IND.index_name
, IND.index_type
, IND.principal_type
, IND.auto_created
FROM meta.vIndexes IND
WHERE IND.schema_name = 'SalesLT'
  AND IND.auto_created = 1
ORDER BY IND.table_name
, IND.index_name
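
Alternatively, assuming that the auto_created flag from Azure SQL Database's sys.indexes is exposed the same way in Fabric, a similar check can be done directly against the system catalog, without the custom view:

-- sys metadata - automatically created indexes via the system catalog
SELECT SCHEMA_NAME(T.schema_id) [schema_name]
, T.name table_name
, I.name index_name
, I.type_desc index_type
, I.auto_created
FROM sys.indexes I
     JOIN sys.tables T
       ON I.object_id = T.object_id
WHERE SCHEMA_NAME(T.schema_id) = 'SalesLT'
  AND I.auto_created = 1
ORDER BY T.name
, I.name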

Stay tuned! More tests on the way... 

Happy coding! 

Previous Post <<||>> Next Post

References:
[1] Microsoft Learn (2024) Automatic tuning [link]
[2] Microsoft Learn (2024) Performance Dashboard for SQL database in Microsoft Fabric [link]
[3] Microsoft Learn (2024) ALTER DATABASE SET options (Transact-SQL) [link]
[4] Microsoft Learn (2024) Enable automatic tuning in the Azure portal to monitor queries and improve workload performance [link]

01 September 2024

🗄️Data Management: Data Governance (Part I: No Guild of Heroes)

Data Management Series

Data governance appeared as a topic around the 1980s, though it gained popularity in the early 2000s [1]. Twenty years later, organizations still miss the mark, respectively fail to understand and implement it in a consistent manner. As usual, the reasons for failure are multiple, and they vary from misunderstanding what governance is all about to the poor implementation of methodologies and inadequate management or leadership. 

Moreover, methodologies tend to idealize the various aspects, though what organizations need is not idealism but pragmatism. For example, data governance is not about heroes and heroism [2], even if the wording can give the impression that heroic actions are involved - they are not! Actions for the sake of action don’t necessarily lead to change by themselves. Organizations, and big organizations in particular, are in general good at creating meaningless action without results, especially when people preoccupy themselves with the wrong things, respectively miss or ignore the mark. 

People do talk to each other, though they try to solve their own problems and optimize their own areas without necessarily thinking about the bigger picture. The problem is not necessarily communication or a lack of depth in the business issues - people do communicate and know the issues, even without a business impact assessment. The challenge lies usually in convincing the upper management that the effort needs to be consolidated and supported, respectively that the needed resources must be made available. 

Probably one of the issues with data governance is the attempt to create another structure in the organization focused on quality, a structure which has good chances to fail, and unfortunately often does fail. Many issues appear when the structure gains weight and becomes a separate entity instead of being the backbone of the organization. 

As soon as organizations separate data governance from the key users, management and the other important decision-makers in the organization, it takes on a life of its own and can diverge from the initial construct. Then organizations need "alignment" and probably other big words to coordinate the effort. Such constructs can work, but they are suboptimal because the forces will always pull in different directions.

Making each manager and the upper management responsible for governance is probably the way to go, though they’ll need the time for it. In theory, this can be achieved when many of the issues are solved at the lower levels, when automation and further aspects allow them to supervise things rather than hide behind every issue. 

When too much micromanagement is involved, people tend to busy themselves with topics rather than solve the issues they are confronted with. The actual actors need to be empowered to take decisions and optimize their work when needed. Kaizen, the philosophy of continuous improvement, has proved that it works when applied correctly. People will need the knowledge, skills, time and support to do it, though. One of the dangers is, however, that this becomes a full-time responsibility, which tends to create a separate entity again.

The challenge for organizations lies probably in the friction between where they are and what they must do to move forward toward the various objectives. Moving in small rapid steps is probably the way to go, though each person must be aware when something doesn’t work as expected and react. That’s probably the most important aspect. 

So, the more functions are created that diverge from the actual organization, the higher the chances of failure. Unfortunately, failure becomes visible only in the later phases, and thus self-awareness, self-control and other similar “qualities” are needed - small actors that keep the system in check and react whenever needed. Ideally, the employees are the best resources to react whenever something doesn’t work as designed. 

Previous Post <<||>> Next Post 

Resources:
[1] Wikipedia (2023) Data Management [link]
[2] Tiankai Feng (2023) How to Turn Your Data Team Into Governance Heroes [link]


10 December 2023

💫🧮☯ERP: Microsoft Dynamics 365's Invoice Capture (The Good, the Bad and the Ugly)

ERP Systems

At the last meeting of the Microsoft Dynamics Meetup Germany group, about 20 Microsoft experts were invited to present in two minutes their favorite new feature from the Microsoft ecosystem. None of them mentioned Invoice Capture, which I think deserves its place on the list. 

Invoice Capture is a Power Apps-based application deeply integrated with Dynamics 365, which allows the semi-automatic processing of vendor invoices received over various channels (Outlook, SharePoint, OneDrive) or via manual upload. The Power App listens on the configured channels, imports the documents as they arrive and uses optical character recognition to extract the standard textual information needed to create a vendor invoice record with header, lines and further information. In a first phase, the Accounting Clerk classifies, reviews, corrects and transfers the invoice to Dynamics 365, from where the invoice follows the standard process, being enriched, posted to the subledger and further booked to the General Ledger. Of course, several changes were made in Dynamics 365 as well, especially in what concerns the parametrization and the invoices' automatic processing.

The Good: Invoice Capture attempts to provide end-to-end invoice automation, and with further changes to cover at least the most common scenarios it will probably be able to do so. Since it was released as a minimal viable product (MVP), besides bug fixing several features were added - the search of vendors and their automatic synchronization, the entry of the Cost Center and Department Code financial dimensions upfront in the Power App, the support for multiple tax codes and for custom fields, just to mention the most important ones. However, more changes are needed to give customers more flexibility in automating the process and in handling other complex scenarios. 

Through automation and further features like continuous learning from manual input and previous value retention, Invoice Capture decreases the volume of manual work and shortens the financial cycle time, making the overall process more efficient. Moreover, the invoices are available in Dynamics 365 almost as soon as they arrive, allowing a better overview and thus better spend control. Features like invoice approval via workflows and further automation capabilities can offer additional opportunities for improvement. Last but not least, Invoice Capture allows achieving a paperless AP process, helping organizations' efforts on their road to digital transformation. 

The Bad: It's natural in software development to start with an MVP and build upon it; however, the gap between the MVP and what customers need involves certain challenges when evaluating and implementing the feature(s). Some hiccups are inherent, as a piece of software needs time to stabilize and mature; still, with better transparency and communication about the roadmap the respective processes would have been a better experience. On the other side, Microsoft was quite helpful in the process, welcoming the feedback and integrating it into the plan, and in time even provided more transparency. However, there seem to be still many unknowns, especially in what concerns the integration with old and new features from the roadmap (e.g. e-Invoicing, recurring vendor invoices).

The truth is that customers have different needs, and their processes have degrees of complexity that may go way beyond the features provided by the MVP and the subsequent versions. Some customers were happy with the MVP, some had to compromise, while others maybe went for alternatives. 

The Ugly: It's time-consuming to evaluate and implement a new feature, to fill the gaps and find alternatives, especially when the organizational setup is not optimal. However, all these are normal challenges in the life of an ERP consultant.

Despite the current and maybe future challenges, Invoice Capture can become in time an important product on the Microsoft roadmap.

Previous Post <<||>> Next Post

01 February 2021

📦Data Migrations (DM): Quality Assurance (Part II: Quality Acceptance Criteria II)

Data Migrations Series

Auditability

Auditability is the degree to which the solution allows checking the data for accuracy, or for quality in general, respectively the degree to which the DM solution and processes can be audited regarding compliance, security and other types of requirements. All these aspects are important in case an external sign-off from an auditor is mandatory. 

Automation

Automation is the degree to which the activities within a DM can be automated. Ideally all the processes or activities should be automated, though other requirements might be impacted negatively; one needs to find the right balance between the various requirements. 

Cohesion

Cohesion is the degree to which the tasks performed by the solution, respectively during the migration, are related to each other. Given the dependencies existing between the data, their processing and further project-related activities, DMs imply a high degree of cohesion that needs to be addressed by design. 

Complexity 

Complexity is the degree to which a solution is difficult to understand given the various processing layers and the dependencies existing within the data. The complexity of a DM revolves mainly around the data structures and the transformations needed to translate the data between the various data models. 

Compliance 

Compliance is the degree to which a solution conforms to the internal or external regulations that apply. One should differentiate between mandatory requirements, respectively recommendations and other requirements.

Consistency 

Consistency is the degree to which data conform to an equivalent set of data; in this case, the entities considered for the DM need to be consistent with each other. A record referenced in any entity of the migration needs to be considered, respectively made available in the target system(s), either by parametrization or migration. 

During each iteration the data need to remain consistent, as this facilitates troubleshooting. The data are usually reimported between iterations or during the same iteration, typically to reflect the changes that occurred in the source systems, or for other purposes. 

Coupling 

Data coupling is the degree to which different processing areas within a DM share the same data, typically a reflection of the dependencies existing between the data. Ideally, the areas should be decoupled as much as possible. 

Extensibility

Extensibility is the degree to which the solution or parts of its logic can be extended to accommodate further requirements. Typically, this involves changes that deviate from the standard functionality. Extensibility impacts flexibility positively.

Flexibility 

Flexibility is the degree to which a solution can handle new requirements or ad-hoc changes to the logic. No matter how well everything was planned, there’s always something forgotten or new information that gets identified. Having the flexibility to change code or data on the fly can make an important difference. 

Integrity 

Integrity is the degree to which a solution prevents changes to data other than the ones considered by design. Users and processes should not be able to modify the data outside the agreed procedures. This means that the data need to be processed in the agreed sequence. All aspects related to data integrity need to be documented accordingly. 

Interoperability 

Interoperability is the degree to which a solution’s components can exchange and use the respective data. The various layers of a DM solution must be able to process the data, and this should be possible by design. 

Maintainability

Maintainability is the degree to which a solution can be modified to add minor features, change existing code, correct issues, refactor code, improve performance or address changes in the environment. The data required and the transformation rules are seldom known in advance; the data requirements are definitized during the various iterations, and the changes need to be implemented as the iterations progress. Thus, maintainability is a critical requirement.

Previous Post - Next Post

05 July 2019

🧱IT: Automation (Definitions)

"The act of replacing control of a manual process with computer or electronic controls." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

[soft automation:] "automation that is configurable through software without requiring changes to the underlying code of the software itself." (Meredith Zozus, "The Data Book: Collection and Management of Research Data", 2017)

[hard automation:] "automation that requires computer programming to be altered if changes are required." (Meredith Zozus, "The Data Book: Collection and Management of Research Data", 2017)

[Decision Automation:] "This broad term refers to computerized systems that make decisions and have some capability to independently act upon them. Decision automation refers to using technologies, including computer processing, to make decisions and implement programmed decision processes." (Ciara Heavin & Daniel J Power, "Decision Support, Analytics, and Business Intelligence" 3rd Ed., 2017)

"Automation is machine-controlled execution of actions, based on artificial intelligence and machine learning that do not require human intervention. It enables speed to action to help reduce time taken by human operators." (Heru Susanto et al, "Data Security for Connected Governments and Organisations: Managing Automation and Artificial Intelligence", 2021)

"refers to the technology where procedures or processes are performed with minimal human intervention. Machines can be configured based on an explicit set of rules or algorithms." (Accenture)

"performing all or part of a set of tasks with a machine rather than through human effort (NRC 1998)

[Intelligent Automation:] "refers to an automation solution that is enhanced with cognitive capabilities that enable programs and machines to learn, interpret and respond." (Accenture)

12 July 2010

💎SQL Reloaded: CRUD Stored Procedures from Metadata I (Insert & Update)

    In the Saving Data With Stored Procedures post I showed my favorite technique of creating a new record, respectively updating an existing record, by using a stored procedure, while in the Creating a Stored Procedure from Table’s Metadata post I showed how a macro could be created in a semiautomatic manner in order to build the same type of stored procedure from the table’s metadata. The Excel approach, even if it decreases the creation time for a stored procedure to only a few minutes, has the disadvantage that the metadata has to be copied into Excel manually for each table (when manual work is involved in the automatic generation of code, we deal with semiautomatic methods).

    Last weekend, after I had to use the macro on Friday at work to create such a stored procedure, I was wondering whether it is possible to achieve the same using only T-SQL code. The difficulty of such a task resides primarily in creating the repeating code for inserting into and updating the tables, respectively for creating the stored procedure parameters, something that can be achieved with a cursor or a common table expression. On a secondary place comes the handling of special data types like uniqueidentifier, xml, date/time or even bit. The good news is that not all data types are used in data modeling, at least not in the same proportions. In practice, when attempting to automate tasks, treating all the possible cases can equate to quite an effort that maybe doesn’t pay off, therefore it’s important to find the balance between effort and utility (sometimes automating 60-70% of the work is quite an achievement, while other times 90-100% can be achieved). Here’s the view I created for this task.

-- prepares metadata for CRUD stored procedures 
CREATE VIEW dbo.vCRUDMetadata 
AS 
WITH CTE([schema_name], Table_Name, id, id_data_type, Column_id, [parameters], [values], [attributes], [update]) 
AS 
( SELECT A.[schema_name] 
, A.Table_Name 
, A.Column_Name id 
, A.system_type id_data_type 
, A.Column_id  
, Cast(' @' + A.Column_Name + ' ' + system_type  
+ CASE  
     WHEN A.system_type IN ('char', 'nchar' , 'varchar', 'nvarchar') THEN ' (' + CASE A.max_length WHEN -1 THEN 'max' ELSE Cast(A.max_length AS varchar(10)) END + ')'  
      ELSE '' 
END  
+ ' output' as nvarchar(max)) [parameters] 
, Cast(CASE  
WHEN A.system_type = 'bit' THEN ' IsNull(@' + A.Column_Name + ', 0)'  
WHEN A.system_type = 'uniqueidentifier' THEN ' IsNull(@' + A.Column_Name + ', NEWID())' 
ELSE '@' + A.Column_Name  
END as nvarchar(max)) [values] 
, Cast(A.Column_Name as nvarchar(max)) [attributes] 
, Cast('' as nvarchar(max)) [update] 
FROM dbo.vTableColumns A 
WHERE A.column_id = 1 
UNION ALL 
SELECT A.[schema_name] 
, A.Table_Name 
, B.id 
, B.id_data_type 
, A.column_id 
, B.[parameters] + CHAR(13)+ char(10)  
+ ', @' + A.Column_Name + ' ' + A.system_type  
+ CASE  
WHEN A.system_type IN ('char', 'nchar' , 'varchar', 'nvarchar') THEN ' (' + CASE A.max_length WHEN -1 THEN 'max' ELSE Cast(A.max_length AS varchar(10)) END + ')'  
ELSE '' 
END [parameters] 
, B.[values] + CHAR(13)+ char(10)  
+ ', ' + CASE  
WHEN A.system_type = 'bit' THEN ' IsNull(@' + A.Column_Name + ', 0)'  
WHEN A.system_type = 'uniqueidentifier' THEN ' IsNull(@' + A.Column_Name + ', NEWID())' 
ELSE '@' + A.Column_Name  
END [values] 
, B.[attributes] + CHAR(13)+ char(10) + ', ' + A.Column_Name [attributes] 
, B.[update] + CHAR(13)+ char(10)  
+ ', ' + A.Column_Name + ' = ' 
+ CASE  
WHEN A.system_type = 'bit' THEN ' IsNull(@' + A.Column_Name + ', 0)' -- compare the data type, not the column name 
WHEN A.system_type = 'uniqueidentifier' THEN ' IsNull(@' + A.Column_Name + ', NEWID())' 
ELSE '@' + A.Column_Name  
END [update] 
FROM dbo.vTableColumns A 
JOIN CTE B 
ON A.column_id - 1 = B.column_id 
AND A.[schema_name] = B.[schema_name] 
AND A.Table_Name = B.Table_Name 
) SELECT A.[schema_name] 
, A.Table_Name 
, A.column_id NumberAttributes 
, A.id 
, A.id_data_type 
, A.[parameters] 
, A.[values] 
, A.[attributes] 
, A.[update] 
FROM CTE A 
WHERE column_id IN ( SELECT MAX(column_id) 
FROM CTE B 
WHERE A.[schema_name] = B.[schema_name] 
AND A.Table_Name = B.Table_Name) 

    The view needs additional work because it treats only the simplest cases met in practice, preferring to keep it (relatively) simple. Even so, the logic might not be so simple to understand within a single review - it just makes use of CASE expressions combined with concatenation, building one string each for the parameters, update, insert and select parts.

    Here’s a call to the view for a table:

-- testing the view 
SELECT * 
FROM dbo.vCRUDMetadata 
WHERE [Schema_Name] = 'Person' 
AND Table_Name = 'Address' 

    Now how do we make it useful? I used the same idea as in Excel, the code for creating the Save stored procedure being implemented in the below UDF:

-- Template for Save stored procedure 
CREATE FUNCTION dbo.TemplateSave( 
@SchemaName nvarchar(50) 
, @TableName nvarchar(50) 
, @id nvarchar(50) 
, @id_data_type nvarchar(50) 
, @parameters nvarchar(max) 
, @values nvarchar(max) 
, @attributes nvarchar(max) 
, @update nvarchar(max) 
) RETURNS nvarchar(max) 
AS BEGIN 
DECLARE @DefaultValue nvarchar(2) 
SET @DefaultValue = CASE 
WHEN @id_data_type IN ('char', 'nchar' , 'varchar', 'nvarchar') THEN '''''' -- two quote marks, so the generated test reads ISNULL(@id, '') = '' 
ELSE '0' 
END 
SET @values = Right(@values, Len(@values) - CharIndex(',', @values)-1) 
SET @attributes = Right(@attributes, Len(@attributes) - CharIndex(',', @attributes)-1) 
SET @update = Right(@update, Len(@update) - CharIndex(',', @update)-1) 
RETURN( 'CREATE PROCEDURE ' + @SchemaName + '.pSave' + @TableName + '(' + CHAR(13)+ char(10) + @parameters + ')' + CHAR(13)+ char(10) +  
     'AS' + CHAR(13)+ char(10) +  
     'BEGIN' + CHAR(13)+ char(10) +  
     'BEGIN TRY' + CHAR(13)+ char(10) +  
     ' IF ISNULL(@' + @ID + ', ' + @DefaultValue + ') = ' + @DefaultValue + CHAR(13)+ char(10) +  
     ' BEGIN' + CHAR(13)+ char(10) +  
     ' -- insert statement' + CHAR(13)+ char(10) +  
     ' INSERT INTO ' + @SchemaName + '.' + @TableName + ' (' + @attributes + ')' + CHAR(13)+ char(10) +  
     ' VALUES (' + @values + ')' + CHAR(13)+ char(10) +  
     ' SET @' + @ID + ' = @@IDENTITY' + CHAR(13)+ char(10) +  
     ' END' + CHAR(13)+ char(10) + 
     ' ELSE' + CHAR(13)+ char(10) + 
     ' BEGIN' + CHAR(13)+ char(10) + 
     ' -- update statement' + CHAR(13)+ char(10) +  
     ' UPDATE ' + @SchemaName + '.' + @TableName + CHAR(13)+ char(10) +  
     ' SET ' + @update + CHAR(13)+ char(10) + 
     ' WHERE ' + @ID + '= @' + @ID + CHAR(13)+ char(10) +  
     ' END' + CHAR(13)+ char(10) + 
    ' SELECT @@Rowcount' + CHAR(13)+ char(10) +  
     'END TRY' + CHAR(13)+ char(10) +  
     'BEGIN CATCH' + CHAR(13)+ char(10) +  
    ' SELECT Cast(ERROR_NUMBER() as varchar(10)) + '':'' + ERROR_MESSAGE()' + CHAR(13)+ char(10) +  
    'END CATCH' + CHAR(13)+ char(10) +  
    'END' ) 
END 
 
    The stored procedure template is slightly modified from the one presented in the previous posts, this time preferring to return the ID in the same parameter used as input, and using a SELECT to return the number of records affected. Here’s the test script which creates the stored procedure, together with its output:

SELECT dbo.TemplateSave( 
[Schema_Name] , Table_Name  
, id  
, id_data_type  
, [parameters]  
, [values]  
, [attributes]  
, [update]) CodeSP 
FROM dbo.vCRUDMetadata 
WHERE [Schema_Name] = 'Person' 
AND Table_Name = 'Address' 

CREATE PROCEDURE Person.pSaveAddress( 
@AddressID int output 
, @AddressLine1 nvarchar (120) 
, @AddressLine2 nvarchar (120) 
, @City nvarchar (60) 
, @StateProvinceID int 
, @PostalCode nvarchar (30) 
, @rowguid uniqueidentifier 
, @ModifiedDate datetime) 
AS 
BEGIN 
BEGIN TRY 
IF ISNULL(@AddressID, 0) = 0 
BEGIN 
-- insert statement 
     INSERT INTO Person.Address ( AddressLine1 
     , AddressLine2 
     , City 
     , StateProvinceID 
     , PostalCode 
     , rowguid 
     , ModifiedDate) 
    VALUES ( @AddressLine1 
     , @AddressLine2 
     , @City 
     , @StateProvinceID 
     , @PostalCode 
     , IsNull(@rowguid, NEWID()) 
     , @ModifiedDate) 
    SET @AddressID = @@IDENTITY 
END 
ELSE 
BEGIN 
     -- update statement 
    UPDATE Person.Address 
    SET AddressLine1 = @AddressLine1 
    , AddressLine2 = @AddressLine2 
    , City = @City 
    , StateProvinceID = @StateProvinceID 
    , PostalCode = @PostalCode 
    , rowguid = IsNull(@rowguid, NEWID()) 
    , ModifiedDate = @ModifiedDate 
     WHERE AddressID= @AddressID 
END 
SELECT @@Rowcount 
END TRY 
BEGIN CATCH 
     SELECT Cast(ERROR_NUMBER() as varchar(10)) + ':' + ERROR_MESSAGE() 
END CATCH 
END 
 
  The resulting code can be exported to Excel or to a text file and then run in Management Studio. And here are two calls to it, one for update and the other one for insert:

EXEC Person.pSaveAddress 1, 'AddressLine1', 'AddressLine2', 'City', 1 , 'PostalCode', NULL, '20090101' 
EXEC Person.pSaveAddress 0, 'AddressLine100', 'AddressLine2', 'City', 1 , 'PostalCode', '6D1EB969-C5CF-420C-BA5F-1153F4D89373', '20090101' 

    The difference from the Excel version is that it can easily be run for a set of tables, and then the stored procedures can be created one by one. A higher level of automation can be achieved by creating a cursor or an SSIS package, with each stored procedure then created via EXEC or the sp_executesql stored procedure. The template can be modified as requirements dictate. Eventually, the code from the template could be included in a CLR function, thus taking advantage of richer string functionality, though it then becomes more difficult to modify.
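
    As a rough sketch of that higher level of automation (assuming the generated code compiles for all the tables considered and that each table has a single-column identity key, as the template expects), a cursor over the view could feed each generated statement to sp_executesql:

-- sketch: creating the Save stored procedures for all the tables of a schema 
DECLARE @sql nvarchar(max) 

DECLARE SPCursor CURSOR FAST_FORWARD FOR 
SELECT dbo.TemplateSave([Schema_Name], Table_Name, id, id_data_type 
, [parameters], [values], [attributes], [update]) 
FROM dbo.vCRUDMetadata 
WHERE [Schema_Name] = 'Person' 

OPEN SPCursor 
FETCH NEXT FROM SPCursor INTO @sql 

WHILE @@FETCH_STATUS = 0 
BEGIN 
     EXEC sp_executesql @sql -- each CREATE PROCEDURE runs in its own batch 
     FETCH NEXT FROM SPCursor INTO @sql 
END 

CLOSE SPCursor 
DEALLOCATE SPCursor 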
