13 January 2021

💎💫SQL Reloaded: D365FO Security Queries

The security architecture of Microsoft Dynamics 365 for Finance and Operations (D365FO) is based on a role-based model in which access is granted not individually to users but through security roles. A set of roles is assigned to a user, each role having access to a set of privileges. In between, duties can be assigned to one or more roles, respectively duties can contain several privileges. The model comes with a default set of security roles, which can be further extended to reflect the organization's needs.

Navigating through the model via D365FO's UI isn't as straightforward as it should be. It's probably much easier to export to Excel the associations between roles and privileges, respectively between roles and duties, or to search directly for a certain value within the same dataset in the database. The following queries can be run in a non-production environment:

-- security role's assigned privileges
SELECT SRO.AOTName [Role AOT Name]
, SRO.Name [Role Name]
, SRO.Description [Role Description]
, SPI.Identifier [Privilege Identifier]
, SPI.Name [Privilege Name]
, SPI.Description [Privilege Description]
FROM dbo.SecurityRolePrivilegeExplodedGraph SRP
     JOIN dbo.SecurityRole SRO
       ON SRP.SecurityRole = SRO.RecId
     JOIN dbo.SecurityPrivilege SPI
       ON SRP.SecurityPrivilege = SPI.RecId
WHERE SPI.Name IN ('Maintain accounts receivable aging period definitions'
, 'Maintain accounts payable aging period definitions')
ORDER BY SRO.AOTName


-- security role's assigned duties
SELECT SRO.AOTName [Role AOT Name]
, SRO.Name [Role Name]
, SRO.Description [Role Description]
, SDU.Identifier [Duty Identifier]
, SDU.Name [Duty Name]
, SDU.Description [Duty Description]
FROM dbo.SecurityRoleDutyExplodedGraph SRD
     JOIN dbo.SecurityRole SRO
       ON SRD.SecurityRole = SRO.RecId
     JOIN dbo.SecurityDuty SDU
       ON SRD.SecurityDuty = SDU.RecId
WHERE SRO.Name LIKE 'Accounting Manager%'
ORDER BY SDU.Identifier

Below you can find a short description of the security tables considered above:

Table - Description
SecurityDuty - contains the list of duties defined by the security AOT duty nodes
SecurityPrivilege - contains the list of privileges defined by the security AOT privilege nodes
SecurityRole - contains the list of roles defined by the security AOT role nodes
SecurityRoleDutyExplodedGraph - contains the list of role to duty mappings as defined by the AOT security roles
SecurityRoleExplodedGraph - contains all role relationships, direct or indirect, as defined by the AOT sub-role nodes of the security role nodes
SecurityRolePrivilegeExplodedGraph - contains the list of role to privilege mappings as defined by the AOT security roles
SecurityUserRole - contains the user to role mappings
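
In the same manner one can list the roles assigned to the users via the SecurityUserRole table. A minimal sketch, assuming the table stores the user id in the User_ column (check the column names in the environment at hand):

-- users' assigned roles
SELECT SUR.User_ [User Id]
, SRO.AOTName [Role AOT Name]
, SRO.Name [Role Name]
FROM dbo.SecurityUserRole SUR
     JOIN dbo.SecurityRole SRO
       ON SUR.SecurityRole = SRO.RecId
ORDER BY SUR.User_
, SRO.AOTName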


Resources:
[1] Microsoft Dynamics 365 (2020) Security architecture [source]
[2] Microsoft Dynamics 365 (2020) Role-based security [source]

💎💫SQL Reloaded: Useful Queries for D365 FO (and not only)

Microsoft Dynamics 365 for Finance and Operations (D365 FO or simply D365) currently has more than 12000 tables, therefore attempting to understand the data model behind it might seem at first sight a foolhardy effort. Fortunately, only a small percentage of all the tables contain meaningful master or transactional data, a fact that makes the task easier.

Moreover, the tables have meaningful prefixes (e.g. 'Invent' for Inventory, 'Purch' for Purchase Orders, 'Sales' for Sales Orders, etc.) and postfixes (e.g. 'Table' for header data, 'Line' for position data, 'Trans' for transaction data), while their names are in many cases self-explanatory if one knows the corresponding business terms.

For those coming from earlier versions of the product (e.g. Microsoft Dynamics AX 2009), with a few exceptions in the inventory transactions, product master data, addresses and financial dimensions, the data structures remained almost the same, if one considers the main functional areas of the ERP system. However, as new modules and/or functionality were added, the number of new tables increased considerably between the versions.

On the other side, in D365 the data are exposed through data entities, abstractions from the physical implementation of the database tables [1], implemented as views typically having the 'Entity' postfix. In many scenarios the data entities will more likely prove to be the best way of accessing the data, as they include an important part of the logic one would need to reengineer anyway; however, there are also exceptions in which it's needed to work with the base tables. In the end one needs to know the main data entities as well as the important tables related to them, and thus to check which views reference a given table, respectively which tables are referenced by a view. This can be easily achieved via the sys.sql_expression_dependencies system catalog view.

The following queries can be run in non-production D365 environments:

-- retrieving the views where an object was used 
SELECT SCHEMA_NAME(obj.schema_id) + '.' + obj.name AS ReferencingObject
, obj.type_desc ReferencingObjectType
, SCHEMA_NAME(obr.schema_id) + '.' + sed.referenced_entity_name AS ReferencedObject
, obr.type_desc ReferencedObjectType
FROM sys.all_objects obj
     JOIN sys.sql_expression_dependencies sed 
       ON obj.object_id = sed.referencing_id 
     JOIN sys.all_objects obr
       ON sed.referenced_id = obr.object_id
WHERE sed.referenced_entity_name = 'PurchTable'
  AND obj.type = 'V'
  AND obj.name LIKE 'Purch%'
ORDER BY ReferencingObject

Output (12 records):
ReferencingObject ReferencingObjectType ReferencedObject ReferencedObjectType
dbo.PURCHASEORDERCUBE VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHASEORDERRESPONSEHEADERSUMMARY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHLINEBACKORDER VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHLINEBACKORDERWORKSPACE VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHLINEEXPANDED VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHORDERAPPROVEDVIEW VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPREPAYOPENBASE VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERCHARGEENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERLINECDSENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHTABLENOTARCHIVEDVERSIONS VIEW dbo.PURCHTABLE USER_TABLE

The inverse search: 

-- retrieving the objects referenced in a view
SELECT SCHEMA_NAME(obj.schema_id) + '.' + obj.name AS ReferencingObject
, obj.type_desc ReferencingObjectType
, SCHEMA_NAME(obr.schema_id) + '.' + sed.referenced_entity_name AS ReferencedObject
, obr.type_desc ReferencedObjectType
FROM sys.all_objects obj
     JOIN sys.sql_expression_dependencies sed 
       ON obj.object_id = sed.referencing_id 
     JOIN sys.all_objects obr
       ON sed.referenced_id = obr.object_id
WHERE obj.name = 'PURCHPURCHASEORDERHEADERV2ENTITY'
ORDER BY ReferencedObject

Output:
ReferencingObject ReferencingObjectType ReferencedObject ReferencedObjectType
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.ACCOUNTINGDISTRIBUTIONTEMPLATE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.CONFIRMINGPO USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.DIMENSIONSETENTITY VIEW
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.GetValidFromInContextInfo SQL_SCALAR_FUNCTION
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.GetValidToInContextInfo SQL_SCALAR_FUNCTION
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.HCMWORKER USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.LOGISTICSPOSTALADDRESS USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.LOGISTICSPOSTALADDRESSBASEENTITY VIEW
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.PURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.PURCHTABLE_W USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.REASONTABLEREF USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.TMSPURCHTABLE USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.TRANSPORTATIONDOCUMENT USER_TABLE
dbo.PURCHPURCHASEORDERHEADERV2ENTITY VIEW dbo.VENDINVOICEDECLARATION_IS USER_TABLE

Notes:
1. There will be cases in which it's needed to check a view's definition (see the first sketch after these notes). To understand the logic, it will be needed to format the query in a more readable form. This can take time, given that the joins and the further constraints are not always straightforward. Therefore, it might be a good idea to export a formatted copy of the view for later use.
2. Unfortunately, the data model for D365 is not publicly available. One can however use the data model for AX 2012 (see [2] [3]) as a basis and fill in the gaps when needed.
3. The important changes between the various data model versions are partially documented in several documents available via Microsoft Dynamics CustomerSource.
4. For non-production environments the data from views, tables or other database objects can be exported directly from the database. Unfortunately, Microsoft blocked direct access to production environments. The data from tables can be accessed and exported via the table browser, while the data from entities can be exported to Excel, typically from the display forms or via the Data Management Framework.
5. The queries can be used to explore any SQL Server database. The logic can be easily encapsulated in a view to simplify the calls (see the second sketch below).
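
As per note 1, a view's definition can be retrieved via the documented OBJECT_DEFINITION metadata function (or via the sp_helptext stored procedure):

-- retrieving a view's definition
SELECT OBJECT_DEFINITION(OBJECT_ID('dbo.PURCHPURCHASEORDERHEADERV2ENTITY'))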
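
As per note 5, a minimal sketch for encapsulating the logic in a view (the dbo.vObjectDependencies name is only an example):

-- encapsulating the dependencies logic in a view
CREATE VIEW dbo.vObjectDependencies
AS
SELECT SCHEMA_NAME(obj.schema_id) + '.' + obj.name AS ReferencingObject
, obj.type_desc ReferencingObjectType
, SCHEMA_NAME(obr.schema_id) + '.' + sed.referenced_entity_name AS ReferencedObject
, obr.type_desc ReferencedObjectType
FROM sys.all_objects obj
     JOIN sys.sql_expression_dependencies sed
       ON obj.object_id = sed.referencing_id
     JOIN sys.all_objects obr
       ON sed.referenced_id = obr.object_id
GO

-- calling the view
SELECT *
FROM dbo.vObjectDependencies
WHERE ReferencedObject LIKE '%PURCHTABLE'
ORDER BY ReferencingObject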


References:
[1] Microsoft Dynamics 365 (2020) Data entities overview [source]
[2] Microsoft Docs (2014) Application tables [source]
[3] Microsoft Docs (2014) Base Enums AX 2012 [source]

💠🛠️SQL Server: Undocumented (Part IV: DBCC SQLPERF)

Besides the documented LOGSPACE parameter (see previous post), the DBCC SQLPERF utility has several undocumented parameters which provide statistics about schedulers, threads, spinlocks, IO, network, read-aheads, respectively waits.

Scheduler Statistics

By providing 'umsstats' as parameter, the utility returns the visible UMS (User Mode Scheduler) schedulers on the system:

-- visible UMS schedulers
DBCC SQLPERF(umsstats)

Output:
Statistic                 Value
Node Id                   0
Avg Sched Load            5
Sched Switches            6903
Sched Pass                6721358
IO Comp Passes            11334
Scheduler ID              0
    online                1
    num tasks             6
    num runnable          0
    num workers           9
    active workers        6
    work queued           0
    cntxt switches        7125444
    cntxt switches(idle)  9898903
    preemptive switches   2304
Scheduler ID              1
    online                1
    num tasks             6
    num runnable          0
    num workers           9
    active workers        5
    work queued           0
    cntxt switches        3370432
    cntxt switches(idle)  4427991
    preemptive switches   22729

The fields have the following meaning:
Statistic - Description
Node Id -
Avg Sched Load -
Sched Switches - the number of switches between schedulers
Sched Pass -
IO Comp Passes -
Scheduler ID - the scheduler's zero-based ID number
num tasks - the number of tasks associated with the scheduler
num runnable - the number of workers on the runnable list
num workers - the total number of workers associated with the scheduler
idle workers - the number of idle workers
work queued - the number of items waiting to be processed in the work queue
cntxt switches - the number of switches between workers for the scheduler
cntxt switches(idle) - the number of times the idle loop was switched into

The functionality is useful when one suspects that there are issues related to the visible schedulers.

Detailed information about the schedulers can also be found via the sys.dm_os_schedulers DMV.
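
A minimal sketch based on the documented DMV, restricted to the visible online schedulers:

-- visible online schedulers via DMV
SELECT scheduler_id
, [status]
, current_tasks_count
, runnable_tasks_count
, current_workers_count
, active_workers_count
, work_queue_count
FROM sys.dm_os_schedulers
WHERE [status] = 'VISIBLE ONLINE'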

Threads Statistics

By providing 'threads' as parameter, the utility returns the threads created in the system:

--  thread statistics
DBCC SQLPERF(threads)

Output (just the first records):
Spid  Thread ID  Status      LoginName  IO  CPU  MemUsage
1     11228      background  sa         0   0    0
2     12572      background  sa         0   0    0
3     12576      background  sa         0   0    0
4     11884      background  sa         0   0    0
5     12964      background  sa         0   0    0
6     12960      background  sa         0   0    4
7     12968      background  sa         0   0    0

Detailed information about the threads can also be found via the sys.dm_os_threads DMV.
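
A minimal sketch based on the DMV, reduced to a few columns of interest:

-- threads via DMV
SELECT os_thread_id
, started_by_sqlservr
, creation_time
, kernel_time
, usermode_time
FROM sys.dm_os_threads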

IO Statistics

By providing 'iostats' as parameter, the utility returns a count of the outstanding reads, respectively writes:

--  IO statistics
DBCC SQLPERF(iostats)

Output:
Statistic           Value
Reads Outstanding   0
Writes Outstanding  0
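
Related information about the pending IO requests can be obtained via the documented sys.dm_io_pending_io_requests DMV, e.g.:

-- pending IO requests via DMV
SELECT io_type
, io_pending
, io_pending_ms_ticks
FROM sys.dm_io_pending_io_requests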

Network Statistics

By providing 'netstats' as parameter, the utility returns network-related statistics:

--  network statistics
DBCC SQLPERF(netstats)

Output:
Statistic                 Value
Network Reads             6976
Network Writes            9036
Network Bytes Read        5318957
Network Bytes Written     2,222512E+07
Command Queue Length      0
Max Command Queue Length  0
Worker Threads            0
Max Worker Threads        0
Network Threads           0
Max Network Threads       0

Read Ahead Statistics

By providing 'rastats' as parameter, the utility returns read-ahead statistics:

--  read ahead statistics
DBCC SQLPERF(rastats)

Output:
Statistic                 Value
RA Pages Found in Cache   0
RA Pages Placed in Cache  0
RA Physical IO            0
Used Slots                0

Spinlock Statistics

By providing 'spinlockstats' as parameter, the utility returns the spinlock statistics, where a spinlock is a lightweight synchronization object used to serialize access to data structures, typically held for a short period of time:

--  spinlock statistics
DBCC SQLPERF(spinlockstats)

Output:
Spinlock Name           Collisions  Spins   Spins/Collision  Sleep Time (ms)  Backoffs
LOCK_RW_TEST            0           0       0                0                0
LOCK_RW_SECURITY_CACHE  711         210500  296,0619         0                54
LOCK_RW_CMED_HASH_SET   5           1750    350              0                1
LOCK_RW_ABTX_HASH_SET   0           0       0                0                0
LOCK_RW_RBIO_REQ        0           0       0                0                0

Detailed information about the spinlock statistics can also be found via the sys.dm_os_spinlock_stats DMV.
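
A minimal sketch based on the documented DMV, listing the spinlocks with the most spins:

-- top spinlocks via DMV
SELECT TOP 10 name
, collisions
, spins
, spins_per_collision
, sleep_time
, backoffs
FROM sys.dm_os_spinlock_stats
ORDER BY spins DESC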

Wait Statistics

By providing 'waitstats' as parameter, the utility returns the available wait statistics:

-- wait statistics 
DBCC SQLPERF(waitstats)

Output (only a few records):
Wait Type       Requests  Wait Time  Signal Wait Time
LCK_M_SCH_M     1         8597       3
LCK_M_S         13        16895      0
PAGEIOLATCH_SH  17893     85682      5
PAGEIOLATCH_UP  118       101        0
PAGEIOLATCH_EX  736       10490      16

Detailed information about the wait statistics can also be found via the sys.dm_os_wait_stats DMV.
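
A minimal sketch based on the documented DMV, listing the top waits by wait time:

-- top waits via DMV
SELECT TOP 10 wait_type
, waiting_tasks_count
, wait_time_ms
, signal_wait_time_ms
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC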

Notes:
As Microsoft warns, the undocumented features shouldn't be used in production environments, as they might be deprecated in future versions. The documented DMVs should be used instead, when available.
All the objects mentioned above require VIEW SERVER STATE permissions.
The DBCC SQLPERF utility allows resetting the latch, spinlock, respectively the wait statistics by providing the following parameters (see the SQL Docs for more information):

-- resetting the wait statistics
DBCC SQLPERF('sys.dm_os_wait_stats', CLEAR)

-- resetting the spinlock statistics
DBCC SQLPERF('sys.dm_os_spinlock_stats', CLEAR)

-- resetting the latch statistics
DBCC SQLPERF('sys.dm_os_latch_stats', CLEAR)

DBCC PERFMON provides in a single call the IO, network, read-ahead, spinlock, respectively the wait statistics, in distinct resultsets:

-- performance statistics
DBCC PERFMON

🔏MS Office: Excel for SQL Developers V (Formatting Output to HTML)

Some years back I found a tool to format the SQL and VB.Net code in posts (see hilite.me), a tool which made blogging much easier, as I didn't have to format the code manually myself. However, showing queries' output in blog posts was limited mainly to making screenshots, which is unproductive and wastes space from my quota. Therefore, today I took the time to create a small Excel macro which allows formatting an MS Excel range as an HTML table. The macro is pretty basic, looping through the range's cells:

'formats a range as an html table
Public Function GetTable(rng As Range) As String
  Dim retval As String
  Dim i As Long, j As Long
  
  'the table's style can be adjusted as needed
  retval = "<table style=""width: 90%;color:black;border-color:black;font-size:10px;"" border=""0"" cellpadding=""1"">"
  
  For i = 1 To rng.Rows.Count
    'the first row is formatted as header
    retval = retval & "<tr style=""background-color:" & IIf(i = 1, "#b0c4de", "white") & ";font-weight:" & IIf(i = 1, "bold", "normal") & """>"
    
    For j = 1 To rng.Columns.Count
       'numeric values are right-aligned, text values left-aligned
       retval = retval & "<td align=""" & IIf(IsNumeric(rng.Cells(i, j)), "right", "left") & """>" & rng.Cells(i, j) & "</td>"
    Next
    
    retval = retval & "</tr>" & vbCrLf
  Next
  
  retval = retval & "</table>"
  
  GetTable = retval
End Function

Just copy the GetTable macro into a new module in Excel and provide the range with data as parameter.

Unfortunately, copying the macro's output to a text file introduces two double quotes where just one was supposed to be.

This requires, as an intermediary step, replacing the two double quotes with one in Notepad (e.g. via the Replace functionality), respectively removing manually the first and last double quotes.
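
Alternatively, one can bypass the clipboard altogether by writing the function's output directly to a text file. A minimal sketch, assuming a writable path (adjust the file name as needed):

'writes the html table for the selected range to a text file
Public Sub SaveTable()
  Dim fileNo As Integer
  
  fileNo = FreeFile
  Open "C:\Temp\table.html" For Output As #fileNo
  'Print writes the raw text, without additional escaping
  Print #fileNo, GetTable(Selection)
  Close #fileNo
End Sub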

Notes:
1. Feel free to use and improve the macro. 
2. Further formatting can be added afterwards, as seen fit.

Happy coding!

💠🛠️SQL Server: Administration (Monitoring the Database Logs)

One of the aspects to monitor on a SQL Server instance is the size of the logs available for each database, respectively the degree to which the logs are used. Starting with SQL Server 2005 this can be achieved by using the 'Log File(s) Used Size (KB)' and 'Log File(s) Size (KB)' counters via the sys.dm_os_performance_counters DMV, as follows:

-- log files - size (kb)
SELECT lfu.instance_name database_name
, lfu.cntr_value size_kb
, Cast(lfu.cntr_value/1024.00 as decimal (18,2)) size_MB
FROM sys.dm_os_performance_counters lfu 
WHERE lfu.counter_name LIKE  'Log File(s) Size (KB)%' 
  AND lfu.object_name LIKE 'SQLServer:Databases%'
  AND lfu.instance_name IN ('tempdb', 'master', 'model', 'msdb')
ORDER BY lfu.instance_name

-- log files - used size (kb)
SELECT lfs.instance_name database_name
, lfs.cntr_value used_size_kb
, Cast(lfs.cntr_value/1024.00 as decimal (18,2)) used_size_MB
FROM sys.dm_os_performance_counters lfs
WHERE lfs.counter_name LIKE  'Log File(s) Used Size (KB)%' 
  AND lfs.object_name LIKE 'SQLServer:Databases%'
  AND lfs.instance_name IN ('tempdb', 'master', 'model', 'msdb')
ORDER BY lfs.instance_name

The two queries can be combined into one as follows:

-- database log space allocation (SQL Server 2005+)
SELECT db.name database_name
, db.log_reuse_wait_desc 
, Cast(lfs.cntr_value/1024.00 as decimal(28,2)) size_MB
, Cast(lfu.cntr_value/1024.00 as decimal(28,2)) AS used_MB
, Cast(100.00*lfu.cntr_value/lfs.cntr_value as decimal(10,2)) used_percent 
, CASE WHEN CAST(lfu.cntr_value AS float) / CAST(lfs.cntr_value AS float) > .5 THEN 
   CASE 
    WHEN db.name = 'tempdb' AND log_reuse_wait_desc NOT IN ('CHECKPOINT', 'NOTHING') THEN 'WARNING'  
    WHEN db.name <> 'tempdb' THEN 'WARNING' 
    ELSE 'OK' 
    END 
  ELSE 'OK' END log_status 
FROM sys.databases db 
     JOIN sys.dm_os_performance_counters lfs 
       ON db.name = lfs.instance_name 
      AND lfs.counter_name LIKE 'Log File(s) Size (KB)%' 
     JOIN sys.dm_os_performance_counters lfu 
       ON db.name = lfu.instance_name 
      AND lfu.counter_name LIKE  'Log File(s) Used Size (KB)%' 
WHERE db.name IN ('tempdb', 'master', 'model', 'msdb')
ORDER BY db.name 

Output:
database_name  log_reuse_wait_desc  size_MB  used_MB  used_percent  log_status
master         NOTHING              1.99     1.11     55.78         WARNING
model          NOTHING              7.99     1.74     21.80         OK
msdb           NOTHING              28.80    1.96     6.81          OK
tempdb         NOTHING              999.99   0.69     0.07          OK

Starting with SQL Server 2012 the same information can be obtained via the sys.dm_db_log_space_usage DMV, however the view returns information only for the current database:

-- getting the log space only for a database (SQL Server 2012+)
SELECT db_name(database_id) database_name 
, Cast(total_log_size_in_bytes/1024.00/1024.00 as decimal(28,2)) size_MB
, Cast(used_log_space_in_bytes/1024.00/1024.00 as decimal(28,2)) used_MB
, Cast(used_log_space_in_percent as decimal(28,2)) used_percent
FROM sys.dm_db_log_space_usage

Output:
database_name  size_MB  used_MB  used_percent
master         1.99     1.19     59.61
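
Starting with SQL Server 2016 SP2 one can also use the sys.dm_db_log_stats dynamic management function, which accepts a database_id as parameter. A minimal sketch, reduced to a few columns of interest:

-- log stats per database (SQL Server 2016 SP2+)
SELECT db.name database_name
, ls.total_log_size_mb
, ls.active_log_size_mb
, ls.log_truncation_holdup_reason
FROM sys.databases db
     CROSS APPLY sys.dm_db_log_stats(db.database_id) ls
WHERE db.name IN ('tempdb', 'master', 'model', 'msdb')
ORDER BY db.name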

With less flexibility one can obtain the size in MB and the used percentage by using the DBCC utility as follows:

-- retrieving the log usage for all databases
DBCC SQLPERF(LOGSPACE); 

Output:
Database Name         Log Size (MB)  Log Space Used (%)  Status
master                1,992188       33,13726            0
tempdb                999,9922       0,06352589          0
model                 7,992188       21,11437            0
msdb                  28,80469       6,617846            0
AdventureWorks2014    33,99219       14,76672            0
AdventureWorksDW2014  17,99219       22,6878             0

Notes:
1. All the mentioned objects require VIEW SERVER STATE permissions.
2. The solution based on the performance counters returns slightly different values than the other solutions, though the differences are negligible.

Resources: 
[1] SQL Docs (2017) sys.dm_os_performance_counters [source]
[2] SQL Docs (2017) DBCC SQLPERF [source]
[3] SQL Docs (2017) sys.dm_db_log_space_usage [source]

09 January 2021

🧮ERP: Planning (Part I: It’s all about Planning - An Introduction)

ERP Implementation

Ideally, the total volume of work can be uniformly distributed over the whole project’s duration, though in practice the curve representing the effort has the form of a wave, or an aggregation of waves, that tends to reach the peak shortly before or during the Go-Live(s). More importantly, high fluctuations occur in the various areas of the project over the whole project’s duration, as there are dependencies between the various functional areas, as one needs to wait for decisions to be made, people are not available, etc. Typically, the time wasted on waiting, researching or other non-value-added activities has the potential of leading to such peaks. Therefore, the knowledge must be available, and decisions must be taken when required, which can be challenging but not impossible to achieve.

To bridge the time wasted on waiting, the team members need to work on other topics. If on the customer’s side the resources can maybe handle other activities, on the vendor’s side the costs can be high and proportional to the volume of waiting. Therefore, the vendor’s resources must be involved in at least two projects or do work in other areas in advance, which is not always possible. However, when the vendor’s resources are involved in two or more projects, unless the planning is perfect or each resource can handle the work as it comes, further waiting times are added. The customer is then forced either to book the resources exclusively, or to wait and carry the costs associated with it.

On the other side, ERP implementations tend to become exploration projects, especially when the team has only partial knowledge about the system, or the requirements have a certain degree of specialization that deviates from the standard processes. The more unknowns an ERP implementation has, the more difficult it is to plan. To be able to plan, one must know the activities ahead and how long they take, and of course, one must adhere to the delivery dates, because each delay can have a cascading effect that can impact the project’s schedule considerably.

Probably the best approach to planning is to group the activities into packages and plan the packages, it being each subteam’s duty to handle the planning for its package, with only the open issues, risks or opportunities managed at the upper level. This shifts the burden from the Project Manager’s shoulders into the project. Moreover, even if in theory the plan can consider each activity, it will become obsolete as soon as it’s updated, given the considerable volume of work required to maintain it. Periodically, one can still revise the whole plan to identify opportunities and risks. What the team can do is to plan for a certain time interval (e.g. 4-6 weeks) and build from there. This allows focusing on the most important activities.

To further shift the burden, activities like Data Migration, Data Cleaning or the integrations with third-party systems should be treated, when possible, as subprojects. Despite having interdependencies with the main project (e.g. parameters, master data, decisions) and sharing the same resources, they have their own schedules, whose deadlines need to be aligned with the main project’s milestones.

Even if the team and the business put all their effort into respecting the plan, and as long as the plan is realistic, the initial plan can seldom be respected; it’s anyway just a sketch of the road ahead that can change as the project progresses, and this aspect needs to be understood by the business, otherwise it will lead to false expectations. On the other side, the team must try to respect the deadlines and communicate in time the inability to do so. It’s an interplay in which communication is more important than ever.



🧮ERP: Planning (Part I: It’s all about Partnership - An Introduction)

ERP Implementation
ERP Implementations Series

Unless the organization (customer) implementing an ERP system has a strong IT team and the knowledge required for the implementation is already available in-house, the resources need to be acquired from the market, and probably the right thing to do is to identify a certified implementer (partner) which can fill the knowledge and skillset gaps, respectively which can help split the risks associated with such an implementation.

In theory, the customer provides knowledge about its processes, while the partner comes with expertise about the system to be implemented and further technologies, industry best practices, project methodologies, etc. Further on, the mix is leveraged to harness the knowledge and reach the project’s objectives.

In practice, however, finding an implementer which can act as partner might be more challenging than expected. This is because the implementer needs to understand the customer’s business and where it’s heading, bridge the gap between the functional requirements and the system’s functionality, advise on areas of improvement, prepare the customer for the project and lead the customer through the changes, respectively establish a basis for the future. Some of the implications are seldom made explicit, even if they are implied by what is needed by the project.

Technology is seldom the issue in an ERP implementation, the challenges residing in handling the change and the logistics required. There are so many aspects to be considered and handled, and this can be challenging for any implementer, no matter how long it has been on the market or how experienced the resources are. Somebody needs to lead the change, and the customer seldom has the knowledge to handle it. In some cases, the implementer must make the customer aware of the implications, while in others it needs to take the initiative and lead the change, though the customer needs to play along, which can be challenging as well.

Many aspects need to be handled at the management level, from a strategic point of view, on the customer’s side. It starts with assuring that the most important aspects of the business were considered, that the goals and objectives are clear, that the proper environment is created, and ends with the timely decision-making, with assuring that the resources are available when needed, that the needed organization structures and roles are in place, that the required knowledge is available before, during and after the implementation, and that the potential brought by the ERP system is harnessed for the years to come.

A partnership allows, in theory, splitting the implementation risks, as ERP implementations have a high rate of failure. Quite often the outcomes of such projects don’t meet the expectations, the systems being in extremis unusable or a bottleneck for the organization. Ideally one should work with the partner(s) and attempt solving the issues, eventually split the incurred cost overruns, find a middle way. Most of the time it’s recommended to find a solution together rather than coming to a litigation.

Given the complex dependencies existing between the various parts of the project, the causes that lead to poor implementations are difficult to prove, as there are almost always grey areas. Moreover, litigations can require considerable time and resources to settle. These can be just extreme situations, and as long as one has a good partner, there’s no need to think that far. On the other side, even if undesirable, one must be prepared for such outcomes as well, even if the countermeasures may involve an additional effort. Therefore, one must address such issues in contracts by establishing the areas of accountability/responsibility for each party, documenting adequately the requirements and further (important) communication, making sure that the deliverables have the expected quality, etc.


05 January 2021

🧮ERP: Planning (Part II: It’s all about Scope - Nonfunctional Requirements & MVP)

ERP Implementation

Nonfunctional Requirements

In contrast to functional requirements (FRs), nonfunctional requirements (NFRs) have no direct impact on the system’s behavior, affecting instead end-users’ experience with the system, and covering thus topics like performance, usability, reliability, compatibility, security, monitoring, maintainability, testability, respectively other constraints and quality attributes. Even if these requirements are in general addressed by design, the changes made to the system have the potential of impacting users’ experience negatively.

Moreover, the NFRs are usually difficult to quantify, and probably that’s why they are seldom made explicit in a formal document, or are considered eventually only at a high level. However, one can still find a basis for comparison against compliance requirements, general guidelines, standards, best practices or the legacy system(s) (e.g. the performance should not be worse than in the legacy system, the volume of effort for carrying out the various activities should not increase). Even if they can’t be adequately described, it’s recommended to list the NFRs in general terms in a formal document (e.g. the implementation contract). Failing to do so can open or widen one’s risk exposure, especially when the system lacks important support in the respective areas. In addition, these requirements need to be considered during testing and sign-off as well.

Minimum Viable Product (MVP)

Besides the consideration of gaps in respect to the FRs, it’s sometimes important to consider whether the whole functionality is mandatory, especially when considering the various activities that need to be carried out (parametrization, Data Migration).

For example, one can target to implement a minimum viable product (MVP), a version of the product which has just enough features to cover the mandatory or the most important FRs. The MVP is based on the idea that implementing about 80% of the needed functionality has in theory the potential of providing a usable product earlier with a minimum of effort (quick wins), of assuring that the project’s goals and objectives are met, respectively of assuring a basis for further development. In case of cost overruns, the MVP assures that the business has a workable product and the opportunity of deciding whether it’s worth investing more into the project now or later.

The MVP also allows getting early users’ feedback and integrating it into further enhancements and developments. Often the users understand the capabilities of a system, respectively of an implementation, only when they are able to use the system. As this is a learning process, the learning period can take up to a few months until adequate feedback is available. Therefore, postponing the implementation’s continuation by a few months can have in theory a positive impact, however it can also come with drawbacks (e.g. the resources are not available anymore).

A sketch of the MVP usually results from the requirements’ prioritization, however the requirements then need to be regarded holistically, as there can be different levels of dependencies existing between them. In addition, different costs can be incurred if the requirements are handled later, and other constraints may apply as well. Considering an MVP approach can be a double-edged sword. In the worst-case scenario, the business will get only the MVP, with its good and bad characteristics. The business will then be forced to fill the gaps by working outside the system, which can lead to further effort and, in extremis, to poor acceptance of the system. In general, users expect having their processes fully implemented in the system, an expectation which is not always economically grounded.

After establishing an MVP, one can consider the further requirements (including improvement suggestions) on a cost-benefit basis and implement them accordingly as part of a continuous improvement initiative, even if more time might be required for implementing them.


🧮ERP: Planning (Part II: It’s all about Scope I - Functional Requirements)

ERP Implementation

Introduction

ERP (Enterprise Resource Planning) implementations tend to be expensive projects, the actual costs often overrunning the expectations by an important factor. The causes for this are multiple, the most important ones ranging from the completeness and complexity of the requirements and the impact they have on the organization to the availability of internal and external skilled resources, the project methodology, the project implementation, the organization’s maturity in running projects, etc.

The most important decision in an ERP implementation is deciding what one needs, respectively what will be considered for the implementation, aspects reflected in a set of functional and nonfunctional requirements.

Functional Requirements 

The functional requirements (FRs) reflect the expected behavior of the system in respect to the inputs and outputs, i.e. what the system must do. Typically, they encompass end-users’ requirements in the area of processes, interfaces and data processing, though they are not limited to them.

The FRs are important because they reflect the future behavior of the system as perceived by the business, serving further as a basis for identifying the project’s scope, the gaps between end-users’ requirements and the system’s functionality, respectively for estimating the project’s duration and areas of focus. Further, they are used as a basis for validating the system’s behavior and getting the sign-off for the system. Therefore, the FRs need to have an adequate level of detail, be complete, clear, comprehensible and implementable, otherwise any gaps in the requirements can impact the project in adverse ways. To achieve this state, they need to go through several iterations in which the requirements are reevaluated, enhanced, checked for duplication, relevance or any other important aspect. In the process it makes sense to categorize the requirements and provide further metadata needed for their appraisal (e.g. process, procedure, owner, status, priority).

Once brought close to a final form, the FRs are checked against the functionality available in the targeted system, or systems, when more systems are considered for evaluation. Ideally, all the requirements could be implemented with the proper parametrization of the systems, though it’s seldom the case, as each business has certain specifics. The gaps need to be understood, their impact evaluated, and it needs to be decided whether the gaps must be implemented. In general, it’s recommended to remain close to the standard functionality, as each further gap requires further changes to the system, changes that in time can generate further quality-related and maintenance costs.

It can become a tedious effort, as in the process an impact and cost-benefit analysis needs to be performed for each gap. Therefore, the gaps’ estimation needs to occur earlier or intermixed with their justification. Once the list of the FRs is finalized and frozen, it will be used for estimating the final costs of the project, identifying the work packages, respectively planning the further work. Once the FRs are frozen, any new requirements or changes to requirements (including taking out a requirement) need to go through the Change Management process, with all the consequences deriving from it: additional effort, costs, delays, etc. This can trigger again an impact and cost-benefit analysis.

The FRs are documented in a specification document (aka functional requirements specification), which is supposed to track all the FRs through their lifetime. When evaluating the FRs against the system’s functionality, it’s recommended to provide general information on how they will be implemented, respectively which system function(s) will be used for that purpose. Besides the fact that it provides transparency, the information can be used as a basic ground for further discussions.

Seldom will all the FRs be defined upfront or be complete. Moreover, some requirements will become obsolete during the project’s execution, or gaps will be downgraded to standard functionality and vice versa. Therefore, it’s important to expect the unexpected.


