
10 December 2023

💫🧮☯ERP: Microsoft Dynamics 365's Invoice Capture (The Good, the Bad and the Ugly)

Enterprise Resource Planning
ERP Systems

At the last meeting of the Microsoft Dynamics Meetup Germany group, about 20 Microsoft experts were invited to present in two minutes their favorite new feature from the Microsoft ecosystem. None of them, though, mentioned Invoice Capture, which I think deserves its place on the list.

Invoice Capture is a Power Apps-based application deeply integrated with Dynamics 365 that allows the semi-automatic processing of Vendor invoices received over various channels (Outlook, SharePoint, OneDrive) or via manual upload. The Power App listens on the configured channels, imports the documents as they arrive and uses optical character recognition to extract the standard textual information needed to create a Vendor invoice record with header, lines and further information. In a first phase the accounting clerk classifies, reviews, corrects and transfers the invoice to Dynamics 365, from where it follows the standard process of being enriched, posted to the subledger and further booked to the General Ledger. Of course, several changes were also made in Dynamics 365, especially concerning the parametrization and the automatic processing of invoices.

The Good: Invoice Capture attempts to provide end-to-end invoice automation and, with further changes, will probably be able to cover at least the most common scenarios. Since its release as a minimum viable product (MVP), besides bug fixing several features were added - the search of Vendors and their automatic synchronization, the entry of the Cost Center and Department Code financial dimensions upfront in the Power App, the support for multiple tax codes and for custom fields, just to mention the most important ones. However, more changes are needed to give customers more flexibility in automating the process and in handling other complex scenarios. 

Through automation and further features like continuous learning from manual input and the retention of previous values, Invoice Capture decreases the volume of manual work and shortens the financial cycle time, making the overall process more efficient. Moreover, the invoices are available in Dynamics 365 almost as soon as they arrive, allowing a better overview and thus better spend control. Features like invoice approval via workflows and additional automation features offer further opportunities for improvement. Last but not least, Invoice Capture enables a paperless Accounts Payable (AP), helping organizations on their road to digital transformation. 

The Bad: It's natural in software development to start with an MVP and build upon it; however, the gap between the MVP and what customers need poses certain challenges when evaluating and implementing the feature(s). Some hiccups are inherent, as a piece of software needs time to stabilize and mature, though with better transparency and communication about the roadmap the evaluation and implementation would have been a better experience. On the other hand, Microsoft was quite helpful in the process, welcoming feedback and integrating it into the plan, and in time even provided more transparency. However, there still seem to be many unknowns, especially concerning the integration with old and new features from the roadmap (e.g. e-Invoicing, recurring Vendor invoices).

The truth is that customers have different needs, and their processes have degrees of complexity that may go way beyond the features provided by the MVP and subsequent versions. Some customers were happy with the MVP, some had to compromise, while others may have gone for alternatives. 

The Ugly: It's time-consuming to evaluate and implement a new feature, to fill the gaps and find alternatives, especially when the organizational setup is not optimal. However, all these are normal challenges in the life of an ERP consultant.

Despite the current and maybe future challenges, Invoice Capture can become in time an important product on the Microsoft roadmap.


27 December 2020

🧊☯Data Warehousing: Data Vault 2.0 (The Good, the Bad and the Ugly)

Data Warehousing
Data Warehousing Series

One of the interesting concepts that seems to be gaining adherents in Data Warehousing is the Data Vault – a methodology, architecture and implementation approach for Data Warehouses (DWH) developed by Dan Linstedt between 1990 and 2000, which evolved into an open standard with the 2.0 version.

According to its creator, the Data Vault is a detail-oriented, historical tracking and uniquely linked set of normalized tables that support one or more business functional areas [2]. To hold data at the lowest grain of detail from the source system(s) and track the changes that occur in the data, it splits the fact and dimension tables into hubs (business keys), links (the relationships between business keys), satellites (descriptions of the business keys) and reference (dropdown values) tables [3], while adopting a hybrid approach between the 3rd normal form and star schemas. In addition, it provides a two- or three-layered data integration architecture, together with a series of standards, methods and best practices intended to facilitate its use.
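To make the decomposition more tangible, below is a minimal sketch of how a hypothetical Customer/Order area could be modeled as hub, link and satellite tables. The table and column names are purely illustrative, and SQLite is used via Python only for convenience, not because the standard prescribes any particular platform.

```python
import sqlite3

# Illustrative Data Vault-style structures for a hypothetical Customer/Order area:
# hubs store business keys, satellites store descriptive attributes over time,
# links store the relationships between business keys.
ddl = """
CREATE TABLE hub_customer (
    customer_hk   TEXT PRIMARY KEY,          -- hash key derived from the business key
    customer_bk   TEXT NOT NULL,             -- business key from the source system
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL
);

CREATE TABLE sat_customer (
    customer_hk   TEXT NOT NULL REFERENCES hub_customer(customer_hk),
    load_date     TEXT NOT NULL,             -- each change produces a new row
    name          TEXT,
    city          TEXT,
    record_source TEXT NOT NULL,
    PRIMARY KEY (customer_hk, load_date)
);

CREATE TABLE hub_order (
    order_hk      TEXT PRIMARY KEY,
    order_bk      TEXT NOT NULL,
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL
);

CREATE TABLE link_customer_order (
    customer_order_hk TEXT PRIMARY KEY,
    customer_hk       TEXT NOT NULL REFERENCES hub_customer(customer_hk),
    order_hk          TEXT NOT NULL REFERENCES hub_order(order_hk),
    load_date         TEXT NOT NULL,
    record_source     TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
```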

It integrates several other methodologies that help bridge the gap between the technical, logistical and execution parts of the DWH life-cycle – the PMI methodology is used for the various levels of planning and execution, while the Scrum methodology is used for coordinating the day-to-day project tasks. Six Sigma is used together with Total Quality Management for the design and continuous improvement of the DWH and data-related processes. In addition, it follows the CMMI maturity model to provide a clear baseline for benchmarking an organization’s DWH capabilities in the development, acquisition and service areas.

The Good: The decomposition of the source data models into hub, link and satellite tables provides traceability and auditability at the raw data level, thus allowing the compliance requirements of Sarbanes-Oxley, HIPAA and Basel II to be addressed by design.

The standards, methods, principles and best practices considered are leveraged from Software Engineering [1], establishing common ground and a standardized approach to DWH design, implementation and testing. This also narrows down the learning and implementation paths, while allowing an incremental approach to the various phases.

Data Vault 2.0 offers support for real-time, near-real-time and unstructured data, while newer technologies like MapReduce and NoSQL can be integrated within its architecture, though the same can be said about other approaches as long as there’s compatibility between the technologies considered. In fact, except for the decomposition of business entities, many of the notions used are common to DWH design.

The Bad: Further decomposing the fact and dimension tables can impact the performance of the queries run against them, as more joins are required to gather the data from the various tables. The further normalization of the tables can also lead to higher data storage needs, though this can be negligible compared with the volume of additional objects that need to be created in the DWH. For an ERP system with a few hundred meaningful tables the complexity can become overwhelming.
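To illustrate the point about joins, the query below (continuing the illustrative tables and connection from the earlier sketch) reassembles a simple "customer with orders" view; what a star schema would typically answer with a single fact-to-dimension join here requires traversing hub, satellite and link tables.

```python
# Reassembling a simple "customer with orders" view from the illustrative
# hub/link/satellite tables created in the previous sketch (conn from above).
query = """
SELECT h.customer_bk,
       s.name,
       s.city,
       ho.order_bk
FROM hub_customer        AS h
JOIN sat_customer        AS s  ON s.customer_hk = h.customer_hk
JOIN link_customer_order AS l  ON l.customer_hk = h.customer_hk
JOIN hub_order           AS ho ON ho.order_hk   = l.order_hk
WHERE s.load_date = (SELECT MAX(load_date)        -- latest satellite version only
                     FROM sat_customer
                     WHERE customer_hk = h.customer_hk);
"""

rows = conn.execute(query).fetchall()
```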

Unless one uses a COTS tool that automates some part of the design and creation process, building everything from scratch can be time-consuming, thus increasing the time-to-market for solutions. However, COTS tools can introduce restrictions of their own, which can negatively impact the overall experience with the methodology.

The incorporation of non-technical methodologies can have a positive impact, though unless one has experience with the respective methodologies, the disadvantages can easily overshadow the (theoretical) advantages.

The Ugly: The dangers of using the Data Vault are, as usual, compounded by a poor understanding of the methodology, a poor level of skills, or attempts to implement the methodology without allowing some flexibility when required. Unless one knows what one is doing, bringing more complexity into a field that is already complex can easily have a negative impact on a project's outcome.


References:
[1] Dan Linstedt & Michael Olschimke (2015) Building a Scalable Data Warehouse with Data Vault 2.0
[2] Dan Linstedt (?) Data Vault Basics [source]
[3] Dan Linstedt (2018) Data Vault: Data Modeling Specification v 2.0.2 [source]

13 June 2020

🧭☯Business Intelligence: Self-Service BI (The Good, the Bad and the Ugly)

Business Intelligence

Self-Service BI (SSBI) is a form of Business Intelligence (BI) in which the users are enabled and empowered to explore and analyze the data, and to build reports and visualizations on their own, with minimal IT support. 

The Good: Modern SSBI tools like Power BI, Tableau or Qlik Sense provide easy-to-use and rich functionality for data preparation, exploration, discovery, integration, modelling, visualization and analysis. Moreover, they integrate the advances made in graphics, data storage and processing (e.g. in-memory processing, parallel processing), which allows addressing most data requirements. With just a few drag-and-drops users can display details, aggregate data, and identify trends and correlations in the data. Slice-and-dice or drill-through features allow navigating the data across dimensions and different levels of detail. In addition, the tools can leverage the existing data models available in data warehouses, data marts and other types of data repositories, including the rich set of open data available on the web.

With the right infrastructure, knowledge and skills, users can better understand and harness the business data, using it to address business questions and make faster and smarter decisions rooted in data. SSBI offers the potential of increasing the value data have for the organization, while improving the time to value for data products (data models, reports, visualizations). 

The Bad: In the 90s, products like MS Excel or Access allowed users to build personal solutions to address gaps in processes and reporting. In some cases the personal solutions gained in importance, being used by more and more users to the degree that they became essential for the business. Thus, these islands of data and knowledge started to become a nightmare for the IT department, as they were supposed to be maintained and backed up. In addition, issues like data security, inefficient data processing, duplication of data and effort, and different versions of the truth urged the business to consolidate such solutions into standardized ones. 

Without an adequate strategy and a certain control over the outcomes of SSBI initiatives, organizations risk reaching the same deplorable state, with SSBI initiatives having the potential to cause more damage than the issues they solve. Insufficient data quality and integration, unrealistic expectations, communication problems between business and IT, as well as insufficient training and support have the potential of making SSBI’s adoption more difficult.

The investment in adequate SSBI tool(s) might be small compared with the further changes that need to be made to the technical and logistical BI infrastructure. In addition, even if the role of IT is minimized, it doesn’t mean that IT needs to be left out of the picture. IT is still the owner of the IT infrastructure, and it still needs to oversee the self-service processes and the flow of data, information and knowledge within the organization. From infrastructure to skillset, there are aspects of SSBI that need to be addressed accordingly. The BI professional can’t be replaced entirely, though the scope of their work may shift to address new types of challenges.

Not understanding that SSBI initiatives are iterative and explorative in nature and require time to bring value can put unnecessary pressure on those taking part in them. Abandoning SSBI initiatives without attempting to address the issues and steer them in the right direction hinders an organization's and its employees’ potential to grow, with all the implications deriving from it.

The Ugly: Despite the benefits SSBI can bring, its adoption within organizations remains low. Whether it’s the business’ lack of confidence in its own capabilities or the inherent technical and logistical challenges, SSBI follows the BI trend of being a promise that seldom reaches its potential.

11 June 2020

🧭🪄☯Business Intelligence: SQL Server Reporting Services (The Good, the Bad and the Ugly)

Business Intelligence

SQL Server Reporting Services (SSRS) is the oldest solution in the modern Microsoft BI stack. Released as an add-on to SQL Server 2000, it covers most of an organization's reporting requirements, whether we talk about tables, matrices or crosstab displays, raw data, aggregations, KPIs, or visualizations like charts, gauges, sparklines, tree maps or sunbursts.

The Good: Once you have a SQL query against any standard data source (SQL Server, Oracle, SharePoint, OData, XML, etc.), it can be used in just a few minutes to create a report with the help of a wizard. Sure, adding the needed formatting, parameters, custom code, drilldown and drill-through functionality might take some effort, though in less than an hour you can have a running report. The use of templates and custom branding allows providing a common experience across the enterprise. 

The whole service is available once you have a SQL Server license, which makes SSRS a cost-effective tool. The shallow learning curve and the integration with SharePoint facilitate the development and consumption of reports.

With its pixel-accurate display of data, SSRS is ideal for printing business documents. This was probably one of the reasons why, starting with Microsoft Dynamics AX 2009, SSRS also became the main reporting platform for the subsequent versions. One can use an AX 2009 class as the source for a report, or use the base tables directly, which can increase a report’s performance at the cost of reengineering the logic from AX 2009. With a few exceptions in the finance area, the reporting logic is easy to build.  

With SQL Server 2016 it got an HTML5 rendering engine, while SSRS 2017 added support for responsive web design. The integration of the SSRS and Power BI environments has the chance to further extend the value provided by this powerful combination, though it also depends on the direction in which Microsoft will develop this idea.   

The Bad: One of the important downsides of SSRS is that it doesn’t provide custom authentication out of the box. Even if some examples exist on the Web, it’s hard to understand Microsoft’s stubbornness in not providing this by design. 

Because SSRS still uses an older MS Office driver, it allows exporting only 65,536 records to Excel, which makes data consumption more complicated. In addition, the pixel-perfect display isn’t that perfect: the introduction of empty columns when exporting to Excel adds some unnecessary burden.

Overall, the progress made by SSRS between the various releases is small when compared with the changes undergone by SQL Server. Even if the visualization capabilities cover most requests, it loses ground when compared with Power BI and similar visualization tools. 

The Ugly: SSRS, as the typical BI developer knows it, is different from the architecture frameworks provided when working with Business Central, respectively Dynamics 365 and CRM. Even if there may be legitimate reasons, Microsoft failed to unite the three architectures into one flexible solution. Almost all the examples available on the Web target CRM, and frankly it’s hard to understand that. It feels like Microsoft wants to sabotage its own product?! What’s also hard to understand is that, besides SSRS and Power BI, Microsoft has several other reporting tools for Dynamics 365. Building reports for Business Central or Dynamics 365 requires specific skills, while the development time has increased considerably, so SSRS has lost some of the appeal it previously had, allowing other tools to join the landscape (e.g. electronic documents).

SSRS can’t be smoothly integrated with Office 365 Online, remaining mainly a solution for on-premises architectures. This can become a bottleneck when customers move to the cloud, eventually requiring the BI strategy to be rethought as well. 

24 May 2020

🧊🎡☯Data Warehousing: SQL Server Integration Services (The Good, the Bad and the Ugly)

Data Warehousing

Microsoft SQL Server Integration Services (SSIS) is a platform for building (enterprise-level) data integration and data transformation solutions by using a rich set of built-in tasks and transformations, graphical tools for building packages, and a catalog for storing the packages. Formerly called Data Transformation Services (DTS), it was introduced with SQL Server 7.0 and rebranded as SSIS with SQL Server 2005.

The Good: Since its introduction it was adopted by DBAs and (database) programmers because it allowed importing and exporting data on the fly from and to SQL Server, flat files and other relational data sources - in fact any resource exposing an ODBC or OLE DB driver. The extract/load functionality was extended by a basic set of transformations, making DTS an ideal ETL tool for data warehousing and integrations. Data from multiple sources and targets could be processed in parallel or sequentially, with the ETL logic encapsulated in one or more packages that could be run manually or scheduled flexibly via the SQL Server Agent.

With SQL Server 2005 and later versions the SSIS framework was extended to support further data sources, including XML, CAML-based SharePoint lists, OData, Hadoop and Azure Blob storage. It also allows storing packages on the file system or within the built-in catalog.

One could thus develop rich ETL functionality without writing a single line of code. In theory the packages could also be run and modified by non-IT users, which can be a plus in certain scenarios. On the other hand, one could build custom packages programmatically from the beginning and thus extend the available data processing logic as seen fit, using existing code and whole libraries embedded into the packages or called via DLLs.

The Bad: Despite the rich functionality, a data pipeline usually has lower performance and is more difficult to troubleshoot compared with the built-in RDBMS functionality for data processing. Most, if not all, transformations can be handled more efficiently via SQL-based queries as long as the data are available on the same SQL Server instance. In addition, SQL provides better code reuse, maintainability, opportunities for refactoring and scalability, and the solutions are easier to deploy. Therefore, a common practice comes down to using SSIS only for import/export, with the further logic encapsulated into stored procedures and other database objects. This isn’t necessarily bad (on the contrary), though specific expertise is then needed to modify the code.
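A hedged sketch of that division of labor, shown outside of SSIS itself: the import step only lands the raw data in a staging table, while the set-based transformation logic lives in a stored procedure. The connection string, staging table and procedure names below are hypothetical.

```python
import pyodbc

# Hypothetical connection string, staging table and stored procedure names.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=dwh;Trusted_Connection=yes;"
)

def load_and_transform(rows):
    """Land raw rows in a staging table, then let the database do the heavy lifting."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.fast_executemany = True
        # Step 1: plain import into staging (the part SSIS is kept for).
        cursor.executemany(
            "INSERT INTO stg.SalesOrder (OrderId, CustomerId, Amount) VALUES (?, ?, ?)",
            rows,
        )
        # Step 2: set-based transformation encapsulated in a stored procedure.
        cursor.execute("EXEC dbo.usp_TransformSalesOrders")
        conn.commit()

load_and_transform([(1, "C001", 100.0), (2, "C002", 250.5)])
```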

The Ugly: SSIS is in general suitable for data warehousing and integration solutions whose logic is ideally stable and well-defined. Therefore, SSIS is less suitable for ERP data migrations or similar tasks which, at least at the beginning, have an exploratory nature and an overwhelming complexity, with multiple iterations needed before the requirements are fully identified and understood. In extremis each iteration can involve a redesign, which can prove to be time-consuming. One could in theory attempt to understand all the data first, though this could mean starting the development late in the process, while the data for testing are required much earlier. One can still use SSIS for specific tasks, though implementing a whole solution could imply certain challenges that could otherwise have been avoided.

SSIS is also not suitable for complex real-time data integrations which require the processing of a considerable amount of data, where specific architectures like SOA, RESTful services or other solutions could be more efficient. When not adequately implemented, a data integration can lead to more problems than it solves. The best example is the increase in execution time with the volume of data, which can easily lead to time-outs and data locking.

01 February 2020

#️⃣☯Software Engineering: Concept Documents (The Good, the Bad and the Ugly)

Software Engineering

A concept document (or simply a concept) is a document that describes at a high level the set of necessary steps and their implications in order to achieve a desired result, typically the object of a project. In other words, it describes how something can be done or achieved, or how a problem can be solved.

The Good: The main aim of the document is to cover all the important aspects and to assure that the idea is worthy of consideration, that the steps considered provide a good basis for further work, and to provide a good understanding for the various parties involved. Therefore, concepts are used as a basis for the sign-off and for the implementation of software and hardware solutions.

A concept provides information about the context, design, architecture, security, usage, purpose and/or objectives of the future solution, together with the set of assumptions, constraints and implications. A concept is not necessarily a recipe, because it attempts to provide a solution for a given problem or situation. Even if it bears many similarities in content and structure, a concept is also not a strategy, because a strategy offers an interpretation of the problem, nor a business case, because the latter focuses mainly on the financial aspects.

A concept thus proves to be a good basis for implementing the described solution, often being an important enabler. On the other hand, a written concept is not always necessary, even if the conceptualization must exist in the implementers’ heads.

The Bad: For these reasons, projects often require the elaboration of a concept before further work can be attempted. To write such a document one needs to understand the problem/situation and be capable of sketching a solution in which the various steps or components fit together like the pieces of a puzzle. The problem is that the more complex the problem to be solved, the fuzzier the view and understanding of the various pieces becomes, and the more challenging it becomes to fit the pieces together. In certain situations, it becomes almost impossible for a single person to understand and handle all the pieces. Solving the puzzle then becomes a collective effort in which the complexity is broken into manageable parts, at the expense of other aspects.

Writing a concept is a time-consuming task. The more accuracy and detail is needed, the longer it takes to write and review the document, time that’s usually stolen from other project phases, especially when the phases are considered sequential. It takes about 20% of the total effort needed for a ‘perfect’ concept to write one that covers 80% of the facts, and 80% of the effort to cover the remaining 20%, as the latter involves multiple iterations. In extremis, aiming for perfection will make one start the implementation late or not start at all. It’s a hard-to-justify pedantry with an important impact on the project's timeline and quality, in the hope of a quality increase that is sometimes even illusory.

The Ugly: The concept-based approach is taken to the extreme in ERP implementations, where a concept needs to be written for each process or business area, often under fancy names – solution design document, technical design document, business process document, etc. Regardless of what it is called, the purpose is to describe how the solution is to be implemented. The problem is that the conceptualization phase tends to take much longer than planned, given the dependencies between the various business areas in terms of functionality and activities. The complexity can become overwhelming, with an important impact on the project’s budget, time and quality.

31 January 2020

💫🧮☯ERP: Microsoft Dynamics 365 (The Good, the Bad and the Ugly)

ERP Implementation

The Good: The shift made by Microsoft in porting their Dynamics AX ERP solution to a web-based application (aka D365) hosted in the Microsoft cloud gave them a boost on the ERP market. The integration with the Office and BI stack, as well as Microsoft’s One Version strategy of pushing new features almost on a monthly basis and of keeping customers at a maximum of 2 releases from the current version, makes D365 a solution to consider for small to big organizations spanning business sectors and geographies.

The Bad: Currently the monthly release cycle seems to be a challenge for customers and service providers alike. Even if the changes to existing functionality are minor and the functionality is thoroughly tested before releases, customers still need to test the releases in several systems, especially to assure that their customizations and integrations still work. This can prove to be quite a challenge, in which automatic or semiautomatic tools can help when adequately used. Even then, a considerable effort is required from the parties involved.
The burden is bigger for the service providers that build their own solutions for D365, as they need to assure in advance that their applications still work after each release. From the customers’ perspective, the more such applications they use, the higher the risk of delays in adopting a release or, in extremis, of having to look for alternative solutions. In theory, with good planning and by following best practices the risks are small, though that’s just the theory speaking.
If in the past 2-3 instances were enough to support the ERP during and after the implementation, the requirements for the cloud-based solution have more than doubled, with an organization ending up renting 5-7 D365 instances for the same purpose. Moreover, even if the split between the main blocks (Finance, Supply Chain, Retail and Talent), plus the various Customer Engagement packages, provides some flexibility when they are combined, it leads to a considerable price increase. Further costs are related to the gaps in the available functionality. Microsoft will most likely attempt to close some of these gaps, but until then customers are forced to opt for existing solutions or have the functionality built. Microsoft claims that their cloud-based ERP solution provides lower ownership costs, however, looking at the prices, it’s questionable whether D365 is affordable for small and average organizations. To put it bluntly – think how many socks (aka products) one needs to sell just to cover the implementation, licensing and infrastructure costs!
One important decision taken by Microsoft was not to allow direct access to the D365 production database, a decision that limits an organization’s choices and flexibility in addressing reporting requirements. Of course, the existing BI infrastructure can still be leveraged with a few workarounds, though some flexibility is lost and further challenges are involved.
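One such workaround is to pull the data through the application layer instead of the database, e.g. via the OData endpoints exposed for data entities. The sketch below is an assumption-laden illustration: the environment URL, entity and field names are placeholders, and the OAuth access token is assumed to have been acquired beforehand.

```python
import requests

# Placeholder environment URL, data entity and pre-acquired OAuth token (assumptions).
BASE_URL = "https://myorg.operations.dynamics.com"
ENTITY = "CustomersV3"
ACCESS_TOKEN = "<token acquired beforehand, e.g. via Azure AD client credentials>"

# Query a data entity via OData instead of reading the database directly.
response = requests.get(
    f"{BASE_URL}/data/{ENTITY}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"$top": 100, "$select": "CustomerAccount,OrganizationName"},
    timeout=60,
)
response.raise_for_status()
for customer in response.json().get("value", []):
    print(customer["CustomerAccount"], customer.get("OrganizationName"))
```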
The Ugly: ERP implementations based on D365 are no exception to the general trend – given their complexity, they are predisposed to fail to achieve the set objectives, and this despite Microsoft’s attempts at creating methodologies, tools and strong communities to support the service providers and customers in such projects. The reasons for failure reside with customers and service providers alike, the chains of implications forming a complex network of causalities with multiple levels of reinforcement. When the negative reinforcements break the balance, a tipping point can be reached where things start to go wrong – escalations, finger-pointing, team restructurings, litigation, etc. In extremis, even if the project reaches the finish line, the costs can easily overrun the initial estimation by 50-150%, and that’s a lot to bear.

30 January 2020

💼☯Project Management: Methodologies (The Good, the Bad and the Ugly)

Mismanagement

The Good: Nowadays there are several Project Management (PM) methodologies to choose from to address a project’s specifics and, when adapted and applied accordingly, a methodology can enable projects to be run and brought under control.

The Bad: Even if the theoretical basis of PM methodologies has been proved and perfected over the years, projects continue to fail at a disturbing rate. Of course, the reasons behind their failure are multiple, though often they are rooted in how PM methodologies are taught, understood and implemented.

Just as a theoretical course in cooking won’t make one a good cook, a theoretical course in PM won’t make one a good Project Manager or a team member knowledgeable in applying the learned methodology. Surprisingly, the expectation is exactly that – the team member got a training and is good to go. Moreover, people believe that managing a software project is like coordinating the building of a small treehouse. To some degree there are many similarities, though the challenges typically lie in the details, and these details often escape a standard course.

To bridge the gap between theory and practice, the learner needs time to grow into the role, to learn the dos and don’ts, and, most importantly, to learn how to use the tools at hand efficiently. The methodology is itself a tool making use of further tools in its processes – project plans, work breakdown structures, checklists, charters, reports, records, etc. These can be learned only through practice, hopefully with some help (aka mentoring) from a person experienced in the respective methodology, be it the Project Manager, a trainer or another team member. Just as one can’t be thrown into the water and expected to swim across the Channel, you can’t do that with a newbie.

There’s a natural fallacy in thinking that we've understood more than we have. We observe the limits of our understanding when we are confronted with the complexities involved in handling PM activities. A second fallacy is not believing other people’s warnings against using a tool or performing an activity in a certain way. A newbie’s mind sometimes has the predisposition of a child to touch a hot stove even when warned against it. It’s part of the learning process, though some persist in such behavior without learning much. What’s even more dangerous is a newbie pretending to be an expert, and this almost always ends badly.

The Ugly appears when the bad is taken to the extreme, when methodologies are misused for the wrong purposes to the degree that they destroy anything in their way. Of course, a pool can be dug with a spoon, but does it make sense to do that? Just because a tool can be used for something doesn’t mean it should be, as long as there are better tools for the same purpose. It seems a pretty logical thing, though the contrary happens more often than we’d like. It starts with the preconception that one should use the tool one knows best, ignoring in the process the fit-for-purpose condition. What’s even more deplorable is breaking down a project to fit a methodology while ignoring the technical and logistical aspects.

Any tool can lead to damage when used excessively, in the wrong places, at the wrong point in time or by the wrong person. Like the instruments in an orchestra, when an instrument plays the wrong note, it clashes with the rest. When more instruments play wrongly, the piece becomes unrecognizable. It’s the role of the conductor to make the players hit the right notes at the right time.
