
01 September 2024

Data Management: Data Governance (Part I: No Guild of Heroes)

Data Management Series

Data governance appeared as a topic around the 1980s, though it gained popularity only in the early 2000s [1]. Twenty years later, organizations still miss the mark, respectively fail to understand and implement it in a consistent manner. As usual, the reasons for failure are multiple, ranging from misunderstanding what governance is all about to poor implementation of methodologies and inadequate management or leadership.

Moreover, methodologies tend to idealize the various aspects, whereas what organizations need is pragmatism. For example, data governance is not about heroes and heroism [2], even if the topic can give the impression that heroic actions are involved, which is not the case! Actions for the sake of action don’t necessarily lead to change by themselves. Organizations, and big organizations in particular, are in general good at creating meaningless action without results, especially when people busy themselves with the wrong topics, miss or ignore the mark.

People do talk to each other, though they tend to solve their own problems and optimize their own areas without necessarily thinking about the bigger picture. The problem is not necessarily communication or a lack of depth in the business issues; people do communicate and know the issues, even without a business impact assessment. The challenge usually lies in convincing the upper management that the effort needs to be consolidated and supported, respectively that the needed resources must be made available.

Probably one of the issues with data governance is the attempt to create another structure in the organization focused on quality, a structure that has every chance to fail, and unfortunately often does fail. Many issues appear when the structure gains weight and becomes a separate entity instead of being the backbone of the organization.

As soon as organizations separate data governance from the key users, management and the other important decision-makers in the organization, it takes on a life of its own that has every chance to diverge from the initial construct. Then organizations need "alignment" and probably other big words to coordinate the effort. Such constructs can work, but they are suboptimal because the forces will always pull in different directions.

Making each manager and the upper management responsible for governance is probably the way to go, though they’ll need the time for it. In theory, this can be achieved when many of the issues are solved at the lower levels, and when automation and other improvements allow them to supervise things rather than hide behind every issue.

When too much micromanagement is involved, people tend to busy themselves with topics rather than solve the issues they are confronted with. The actual actors need to be empowered to take decisions and optimize their work when needed. Kaizen, the philosophy of continuous improvement, has proved that it works when applied correctly. They’ll need the knowledge, skills, time and support to do it, though. One of the dangers, however, is that this becomes a full-time responsibility, which tends to create a separate entity again.

The challenge for organizations probably lies in the friction between where they are and what they must do to move toward the various objectives. Moving in small, rapid steps is probably the way to go, though each person must be aware when something doesn’t work as expected and react. That’s probably the most important aspect.

So, the more functions are created that diverge from the actual organization, the higher the chances of failure. Unfortunately, failure becomes visible only in the later phases, and thus self-awareness, self-control and other similar “qualities” are needed, like small actors that keep the system in check and react whenever needed. Ideally, the employees are the best resources to react whenever something doesn’t work as designed.


Resources:
[1] Wikipedia (2023) Data Management [link]
[2] Tiankai Feng (2023) How to Turn Your Data Team Into Governance Heroes [link]


18 March 2024

Strategic Management: Strategy (Notes)

Disclaimer: This is work in progress intended to consolidate information from various sources. 
Last updated: 18-Mar-2024

Strategy

  • {definition} "the determination of the long-term goals and objectives of an enterprise, and the adoption of courses of action and the allocation of resources necessary for carrying out these goals" [4]
  • {goal} bring all tools and insights together to create an integrative narrative about what the  organization should do moving forward [1]
  • a good strategy emerges out of the values, opportunities and capabilities of the organization [1]
    • {characteristic} robust
    • {characteristic} flexible
    • {characteristic} needs to embrace the uncertainty and complexity of the world
    • {characteristic} fact-based and informed by research and analytics
    • {characteristic} testable
  • {concept} strategy analysis 
    • {definition} the assessment of an organization's current competitive position and the identification of future valuable competitive positions and how the firm plans to achieve them [1]
      • done from a general perspective
        • in terms of different functional elements within the organization [1]
        • in terms of being integrated across different concepts and tools and frameworks [1]
      • a good strategic analysis integrates the various tools and frameworks in the strategist's toolkit [1]
    • approachable in terms of 
      • dynamics
      • complexity
      • competition
    • {step} identify the mission and values of the organization
      • critical for understanding what the firm values and how this may influence which opportunities it looks for and what actions it might be willing to take
    • {step} analyze the competitive environment
      • looking at what opportunities the environment provides and how competitors are likely to react
    • {step} analyze competitive positions
      • think about what the organization's own capabilities are and how they might relate to the opportunities that are available
    • {step} analyze and recommend strategic actions 
      • actions for future improvement
        • {question} how do we create more value?
        • {question} how can we improve our current competitive position?
        • {question} how can we, in essence, create more value in our competitive environment?
      • alternatives
        • scaling the business
        • entering new markets
        • innovating
        • acquiring a competitor/another player within a market segment of interest
      • recommendations
        • {question} what do we recommend doing going forward?
        • {question} what are the underlying assumptions of these recommendations?
        • {question} do they meet our tests that we might have for providing value?
        • move from analysis to action
          • actions come from asking a series of questions about what opportunities, what actions can we take moving forward
    • {step} strategy formulation
    • {step} strategy implementation
  • {tool} competitor analysis
    • {question} what market is the firm in, and who are the players in these markets? 
  • {tool} environmental analysis
    • {benefit} provides a picture of the broader competitive environment
    • {question} what are the major trends impacting this industry?
    • {question} are there changes in the sociopolitical environment that are going to have important implications for this industry?
    • {question} is this an attractive market, and what are the barriers to competition?
  • {tool} five forces analysis
    • {benefit} provides an overview of the market structure/industry structure
    • {benefit} helps understand the nature of the competitive game that we are playing as we then devise future strategies [1]
      • provides a dynamic perspective in our understanding of a competitive market
    • {question} how's the competitive structure in a market likely to evolve?
  • {tool} competitive life cycle analysis
  • {tool} SWOT (strengths, weaknesses, opportunities, threats) analysis
  • {tool} stakeholder analysis
    • {benefit} valuable for understanding the mission and values, as well as the expectations others have of the firm
  • {tool} capabilities analysis
    • {question} what are the firm's unique resources and capabilities?
    • {question} how sustainable is any advantage that these assets provide?
  • {tool} portfolio planning matrix
    • {benefit} helps understand how the organization might leverage these assets across markets so as to improve its position in any given market
    • {question} how should we position ourselves in the market relative to our rivals?
  • {tool} capability analysis
    • {benefit} understand what the firm does well and see what opportunities they might ultimately want to attack and go after in terms of these valuable competitive positions
      • via Strategy Maps and Portfolio Planning matrices
  • {tool} hypothesis testing
    • {question} how are competitors likely to react to these actions?
    • {question} does it make sense in the future worlds we envision?
    • [game theory] payoff matrices can be useful to understand what actions might be taken by various competitors within an industry
  • {tool} scenario planning
    • {benefit} helps envision future scenarios and then work backward to understand what actions might need to be taken if those scenarios play out
    • {question} does it provide strategic flexibility?
  • {tool} real options analysis 
    • highlights the desire for strategic flexibility, or at least the value that strategic flexibility provides
  • {tool} acquisition analysis
    • {benefit} helps understand the value of certain actions versus others
    • {benefit} useful for understanding the opportunity costs of other strategic investments one might make
    • focused on mergers and acquisitions
  • {tool} If-Then thinking
    • sequential in nature
      • different from causal logic
        • commonly used in network diagrams, flow charts, Gantt charts, and computer programming
  • {tool} Balanced Scorecard
    • {definition} a framework to look at the strategy used for value creation from four different perspectives [5]
      • {perspective} financial 
        • {scope} the strategy for growth, profitability, and risk viewed from the perspective of the shareholder [5]
        • {question} what are the financial objectives for growth and productivity? [5]
        • {question} what are the major sources of growth? [5]
        • {question} If we succeed, how will we look to our shareholders? [5]
      • {perspective} customer
        • {scope} the strategy for creating value and differentiation from the perspective of the customer [5]
        • {question} who are the target customers that will generate revenue growth and a more profitable mix of products and services? [5]
        • {question} what are their objectives, and how do we measure success with them? [5]
      • {perspective} internal business processes
        • {scope} the strategic priorities for various business processes, which create customer and shareholder satisfaction [5] 
      • {perspective} learning and growth 
        • {scope} defines the skills, technologies, and corporate culture needed to support the strategy. 
          • enable a company to align its human resources and IT with its strategy
      • {benefit} enables the strategic hypotheses to be described as a set of cause-and-effect relationships that are explicit and testable [5]
        • requires identifying the activities that are the drivers (or lead indicators) of the desired outcomes (lag indicators) [5]; see the sketch after these notes
        • everyone in the organization must clearly understand the underlying hypotheses, to align resources with the hypotheses, to test the hypotheses continually, and to adapt as required in real time [5]
    • {tool} strategy map
      • {definition} a visual representation of a company’s critical objectives and the crucial relationships that drive organizational performance [2]
        • shows the cause-and-effect links by which specific improvements create desired outcomes [2]
      • {benefit} shows how an organization will convert its initiatives and resources (including intangible assets such as corporate culture and employee knowledge) into tangible outcomes [2]
    • {component} mission
      • {question} why we exist?
    • {component} core values
      • {question} what we believe in?
      • ⇐ mission and the core values  remain fairly stable over time [5]
    • {component} vision
      • {question} what we want to be?
      • paints a picture of the future that clarifies the direction of the organization [5]
        • helps individuals to understand why and how they should support the organization [5]
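
A minimal Python sketch of the cause-and-effect idea behind the strategy map and the lead/lag indicators: the four perspectives come from [5], while the objectives, measures and links below are invented purely for illustration.

from dataclasses import dataclass

# Illustrative strategy-map sketch: objectives grouped by the four Balanced
# Scorecard perspectives [5], connected by cause-and-effect links.
# Objective names, measures and links are invented for the example.

@dataclass(frozen=True)
class Objective:
    perspective: str   # financial | customer | internal | learning & growth
    name: str
    measure: str       # the indicator used to track the objective

objectives = {
    "train_staff":    Objective("learning & growth", "Train sales staff", "training hours"),
    "faster_process": Objective("internal", "Shorten order processing", "cycle time"),
    "retain_custs":   Objective("customer", "Increase customer retention", "retention rate"),
    "grow_revenue":   Objective("financial", "Grow revenue", "revenue growth %"),
}

# Directed cause-and-effect links: driver (lead indicator) -> outcome (lag indicator)
links = [
    ("train_staff", "faster_process"),
    ("faster_process", "retain_custs"),
    ("retain_custs", "grow_revenue"),
]

def drivers_of(outcome: str) -> list[str]:
    """Return all direct and indirect drivers (lead indicators) of an outcome."""
    result = []
    for src, dst in links:
        if dst == outcome:
            result.extend(drivers_of(src))
            result.append(src)
    return result

if __name__ == "__main__":
    for key in drivers_of("grow_revenue"):
        o = objectives[key]
        print(f"{o.perspective:>17}: {o.name} (measured by {o.measure})")

Walking the drivers of the financial objective makes the underlying hypotheses explicit, which is exactly what [5] asks for when it requires the cause-and-effect relationships to be testable.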

    References:
    [1] University of Virginia (2022) "Strategic Planning and Execution" (MOOC, Coursera)
    [2] Robert S Kaplan & David P Norton (2000) "Having Trouble with Your Strategy? Then Map It" (link)
    [3] Harold Kerzner (2001) "Strategic Planning for Project Management Using a Project Management Maturity Model"
    [4] Alfred D Chandler Jr. (1962) "Strategy and Structure"
    [5] Robert S Kaplan & David P Norton (2000) "The Strategy-Focused Organization: How Balanced Scorecard Companies Thrive in the New Business Environment"

    14 December 2023

    ERP Systems: Microsoft Dynamics 365's Invoice Capture (Some Thoughts to Start with)

    Enterprise Resource Planning

    Introduction

    It's almost the end of the year and time to review what went well and not so well during the year. On the "successful projects" list I can put the Invoice Capture implementation. In a previous post I wrote a short review of what the feature is about.

    I had the chance to configure Invoice Capture for Cost invoices (invoices without Purchase orders) while it was still in public preview, and we went live soon after the feature became generally available. The implementation had its challenges, though in the end it was a positive experience; I learned a lot from my colleagues, from Microsoft, and from other consultants and business users who embarked on the same journey.

    Where to Start?

    Usually, it's a good idea to start with the documentation and the standard training material, which provide a good overview of what Invoice Capture is about, the steps needed for configuration, the processes involved, permissions, etc.

    You should also check the "Invoice Capture for Dynamics 365" group on Yammer (aka Viva), because besides the latest version of the Implementation Guide, the Release notes and the associated training videos are published there, and other users provide (a lot of) feedback and ask questions. Some information is available first in the group and only much later in the documentation. If you're facing an error or a challenge, more likely than not there's already a conversation in there with the answer you're looking for. Otherwise, you can start a thread and the others will try to help. At least until now, Microsoft has been quite active in helping.

    Via FastTrack, Microsoft provided several sessions in Dec-2022 (preview) and Sep-2023 (GA) that can be used to get a good overview of the feature and its implementation. Frankly, I would start with the last session and then explore the other resources. In the process I found several other resources useful, mainly YouTube content - see Dan's Corner (link) and DAFTD365 - and LinkedIn - see Hendrik M Larsen's posts.

    You might also want to check the Release planner for Finance to see what features are in the pipeline, respectively the Ideas for Dynamics to get an idea of what kind of improvements others wish for.

    In parallel, one can start sketching the "AS IS" and "TO BE" processes, and eventually put together a business case for using Invoice Capture to digitize the processing of Vendor invoices. This isn't a simple Change request, therefore it makes sense to start a project, even though its scope is relatively small.

    Bridging the Gap

    One can look at the "TO BE" process based on the functionality provided, respectively planned, by Microsoft for Invoice Capture, or look at the broader picture and sketch how an ideal digitized process should look. If the gap between the two pictures is big, then it might be a good idea to look at alternatives, which should anyway be done as part of the business case. There might be third-party tools out there (e.g. ExFlow) which provide similar functionality; however, in the long term it makes sense to go with Microsoft, even if the full extent of the functionality might not be available yet.

    A review of other tools might be good - to understand how the ISVs approached the integration, what kind of features they provide, respectively whether the ideal digitized process makes sense. However, this will imply more effort.

    The current version of Invoice Capture provides a good basis to build upon. One can use Power Apps or Power Automate to address some of the gaps; some gaps can be discussed with Microsoft, stressing their importance, while other gaps are maybe not that important and can be dismissed. One way or another, one must be ready to compromise as long as this doesn't have an important impact on the business.

    The Project

    The scope of the project might be relatively small, though one should follow the best practices of Project Management and make sure that all important stakeholders are involved, that the right resources are available when needed, that the requirements are managed adequately, that the changes are adequately tested, that the users are trained, that the process is documented, etc.

    It's important to understand that the simple configuration of Invoice Capture will not be the end of the effort. As Microsoft releases further features directly and indirectly related to Invoice Capture, additional effort might be needed after the implementation goes live to address the gaps, the opportunities, as well as the risks. Moreover, Invoice Capture involves a learning curve; addressing the lessons learned might require further changes in the system's setup as well as in the data's management. Therefore, further effort must be planned accordingly.

    Whether we talk about a full implementation or the implementation of a single feature, the overall success tends to depend more on how the implementation is approached than on the technology involved.

    Closing Thoughts

    Some of the points made here can be applied to similar feature implementations. Overall, it's important to gather enough information to start the project and, in time, to reach the level of depth it requires. Don't expect things to be perfect; start small and evolve, prioritize, cover the gaps, optimize!


    10 December 2023

    ERP Systems: Microsoft Dynamics 365's Invoice Capture (The Good, the Bad and the Ugly)

    Enterprise Resource Planning
    ERP Systems

    At the last meeting of the Microsoft Dynamics Meetup Germany group, about 20 Microsoft experts were invited to present in two minutes their favorite new feature from the Microsoft ecosystem. None of them, though, mentioned Invoice Capture, which I think deserves its place on the list.

    Invoice Capture is a Power Apps-based application deeply integrated with Dynamics 365 that allows the semi-automatic processing of Vendor invoices received over various channels (Outlook, SharePoint, OneDrive) or via manual upload. The Power App listens on the configured channels, imports the documents as they arrive and uses optical character recognition capabilities to extract the standard textual information needed to create a Vendor invoice record with header, lines and further information. In a first phase the Accounting Clerk classifies, reviews, corrects and transfers the Invoice to Dynamics 365, from where the Invoice follows the standard process, being enriched, posted to the Subledger and further booked to the General Ledger. Of course, several changes were also made in Dynamics 365, especially in what concerns the parametrization and the Invoices' automatic processing.
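
    To make the hand-offs in this flow explicit, here is a minimal, purely illustrative Python sketch; none of the names or functions below correspond to the actual Invoice Capture or Dynamics 365 APIs, they only mirror the steps described above (channel intake, OCR extraction, clerk review, transfer).

from dataclasses import dataclass, field

# Hypothetical sketch of the Invoice Capture flow described above.
# None of these names correspond to real Power Apps or Dynamics 365 APIs.

@dataclass
class InvoiceLine:
    description: str
    amount: float

@dataclass
class CapturedInvoice:
    vendor: str
    invoice_number: str
    lines: list[InvoiceLine] = field(default_factory=list)
    status: str = "captured"          # captured -> reviewed -> transferred

def extract_with_ocr(document: bytes) -> CapturedInvoice:
    """Placeholder for the OCR step that reads header and line information."""
    # A real implementation would call an OCR/ML service here.
    return CapturedInvoice(vendor="Contoso Ltd.", invoice_number="INV-001",
                           lines=[InvoiceLine("Consulting services", 1200.00)])

def clerk_review(invoice: CapturedInvoice) -> CapturedInvoice:
    """The clerk classifies, reviews and corrects the extracted data."""
    invoice.status = "reviewed"
    return invoice

def transfer_to_dynamics(invoice: CapturedInvoice) -> None:
    """Hand the reviewed invoice over for the standard process
    (enrichment, posting to the Subledger, booking to the General Ledger)."""
    invoice.status = "transferred"
    print(f"Transferred {invoice.invoice_number} from {invoice.vendor}")

if __name__ == "__main__":
    document = b"%PDF- ..."           # e.g. an email attachment from Outlook
    transfer_to_dynamics(clerk_review(extract_with_ocr(document)))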

    The Good: Invoice Capture attempts to provide end-to-end invoice automation, and with further changes it will probably be able to cover at least the most common scenarios. Since it was released as a minimum viable product (MVP), besides bug fixing several features were added - the search of Vendors and their automatic synchronization, the entry of the Cost Center and Department Code financial dimensions upfront in the Power App, the support for multiple tax codes and for custom fields, just to mention the most important ones. However, more changes are needed to give customers more flexibility in automating the process and in handling other complex scenarios.

    Through automation and further features like the continuous learning from manual input and the retention of previous values, Invoice Capture decreases the volume of manual work and shortens the financial cycle time, making the overall process more efficient. Moreover, the invoices are available almost as soon as they arrive in Dynamics 365, allowing a better overview and thus better spend control. Features like Invoice approval via workflows and other automation features can offer further opportunities for improvement. Last but not least, Invoice Capture allows achieving a paperless Accounts Payable (AP) process, helping organizations' efforts on their road to digital transformation.

    The Bad: It's natural in software development to start with an MVP and build upon it; however, the gap between the MVP and what customers need involves certain challenges when evaluating and implementing the feature(s). Some hiccups are inherent, as a piece of software needs time to stabilize and mature; however, with better transparency and communication about the roadmap the respective process would have been a better experience. On the other side, Microsoft was quite helpful in the process, welcoming the feedback and integrating it into the plan, and in time even provided more transparency. However, there still seem to be many unknowns, especially in what concerns the integration with old and new features from the roadmap (e.g. e-Invoicing, recurring Vendor invoices).

    The truth is that customers have different needs, and their processes have degrees of complexity that may go way beyond the features provided by the MVP and subsequent versions. Some customers were happy with the MVP, some had to compromise, while others maybe went for alternatives.

    The Ugly: It's time-consuming to evaluate and implement a new feature, to fill the gaps and find alternatives, especially when the organizational setup is not optimal. However, all these are normal challenges in the life of an ERP consultant.

    Despite the current and maybe future challenges, Invoice Capture can become in time an important product on the Microsoft roadmap.


    20 March 2021

    Business Intelligence: New Technologies, Old Challenges I (Introduction)

    Business Intelligence

    Each important technology has the potential of creating divides between the specialists in a given field. This aspect is more visible in data-driven fields like BI/Analytics or Data Warehousing. Data professionals (engineers, scientists, analysts, developers) skilled only in the new wave of technologies tend to disregard the former technologies and the role they played, and still play, in the data landscape. The argumentation for such behavior is rooted in the belief that a new technology is better and can solve any problem better than previous technologies did. It’s a kind of mirage professionals and customers can easily fall under.

    Being bigger, faster, having new functionality, doesn’t make a tool the best choice by default. The choice must be rooted in the problem to be solved and the set of requirements it comes with. Just because a vibratory rammer is a new technology, is faster and has more power in applying pressure, this doesn’t mean that it will replace a hammer. Where a certain type of power is needed the vibratory rammer might be the best tool, while for situations in which a minimum of power and probably more precision is needed, like driving in a nail, then an adequately sized hammer will prove to be a better choice.

    A technology is to be used in certain (business/technological) contexts, and even if contexts often overlap, the further details (aka requirements) should lead to the proper choice of tools. It’s part of a professional’s duties to be able to differentiate between contexts, requirements and the capabilities of the tools appropriate for each context. Herein resides partially a professional’s mastery of their field of work and of providing adequate solutions for customers’ needs. Especially in IT, it’s not enough to master the new tools; one must also have an understanding of the preceding tools, their usage contexts, capabilities and challenges.

    From a historical perspective, each tool appeared to fill a demand, and even if it maybe didn’t manage to fill it adequately, the experience obtained can prove valuable in one way or another. Otherwise, one risks reinventing the wheel or, more dangerously, repeating the failures of the past. Each new technology seems to provide a déjà-vu from this perspective.

    Moreover, a new technology provides new opportunities and may require changing our way of thinking with respect to how the technology is used and the processes or techniques associated with it. Knowledge of past technologies helps identify such opportunities more easily. How a tool is used is also a matter of skills, while its appropriate use and adoption imply an inherent learning curve. Having previous experience with similar tools tends to reduce the learning curve considerably, though hands-on learning is still necessary, and appropriate learning materials or tutoring are, upon case, needed for a smoother transition.

    In what concerns the implementation of mature technologies, the challenges were seldom the technologies themselves but rather of a non-technical nature, ranging from a poor understanding of the tools, their role and the implications they have for an organization, to an organization’s maturity in leading projects. Even the most advanced technology can fail in the hands of non-experts. Experience can’t be judged based only on the years spent in the field or the number of projects one worked on, but on the understanding acquired about implementation and usage challenges. These latter aspects seem to be widely ignored, even if they can make the difference between success and failure in a technology’s implementation.

    Ultimately, each technology is appropriate in certain contexts and a new technology doesn’t necessarily make another obsolete, at least not until the old contexts become obsolete.


    08 November 2007

    Software Engineering: Implementation (Just the Quotes)

    "The term architecture is used here to describe the attributes of a system as seen by the programmer, i.e., the conceptual structure and functional behavior, as distinct from the organization of the data flow and controls, the logical design, and the physical implementation." (Gene Amdahl et al, "Architecture of the IBM System/360", IBM Journal of Research and Development, Vol 8 (2), 1964)

    "In computer design three levels can be distinguished: architecture, implementation and realisation; for the first of them, the following working definition is given: The architecture of a system can be defined as the functional appearance of the system to the user, its phenomenology. […] The inner structure of a system is not considered by the architecture: we do not need to know what makes the clock tick, to know what time it is. This inner structure, considered from a logical point of view, will be called the implementation, and its physical embodiment the realisation." (Gerrit A Blaauw, "Computer Architecture", 1972)

    "Of course the technological base on which one builds is always advancing. As soon as one freezes a design, it becomes obsolete in terms of its concepts. But implementation of real products demands phasing and quantizing. The obsolescence of an implementation must be measured against other existing implementations, not against unrealized concepts. The challenge and the mission are to find real solutions to real problems on actual schedules with available resources." (Fred P Brooks, "The Mythical Man-Month: Essays", 1975)

    "The separation of architectural effort from implementation is a very powerful way of getting conceptual integrity on very large projects." (Fred P Brooks, "The Mythical Man-Month: Essays", 1975)

    "Systems with unknown behavioral properties require the implementation of iterations which are intrinsic to the design process but which are normally hidden from view. Certainly when a solution to a well-understood problem is synthesized, weak designs are mentally rejected by a competent designer in a matter of moments. On larger or more complicated efforts, alternative designs must be explicitly and iteratively implemented. The designers perhaps out of vanity, often are at pains to hide the many versions which were abandoned and if absolute failure occurs, of course one hears nothing. Thus the topic of design iteration is rarely discussed. Perhaps we should not be surprised to see this phenomenon with software, for it is a rare author indeed who publicizes the amount of editing or the number of drafts he took to produce a manuscript." (Fernando J Corbató, "A Managerial View of the Multics System Development", 1977)

    "The design of a digital system starts with the specification of the architecture of the system and continues with its implementation and its subsequent realisation... the purpose of architecture is to provide a function. Once that function is established, the purpose of implementation is to give a proper cost-performance and the purpose of realisation is to build and maintain the appropriate logical organisation." (Gerrit A Blaauw, "Specification of Digital Systems", Proc. Seminar in Digital Systems Design, 1978)

    "With increasing size and complexity of the implementations of information systems, it is necessary to use some logical construct (or architecture) for defining and controlling the interfaces and the integration of all of the components of the system." (John Zachman, "A Framework for Information Systems Architecture", 1987)

    "Project failures are not always the result of poor methodology; the problem may be poor implementation. Unrealistic objectives or poorly defined executive expectations are two common causes of poor implementation. Good methodologies do not guarantee success, but they do imply that the project will be managed correctly." (Harold Kerzner, "Strategic Planning for Project Management using a Project Management Maturity Model", 2001)

    "Most dashboards fail to communicate efficiently and effectively, not because of inadequate technology (at least not primarily), but because of poorly designed implementations. No matter how great the technology, a dashboard's success as a medium of communication is a product of design, a result of a display that speaks clearly and immediately. Dashboards can tap into the tremendous power of visual perception to communicate, but only if those who implement them understand visual perception and apply that understanding through design principles and practices that are aligned with the way people see and think." (Stephen Few, "Information Dashboard Design", 2006)

    "Design is the bridging activity between gathering and implementation of software requirements that satisfies the required needs. […] The fundamental goal of design is to reduce the number of dependencies between modules, thus reducing the complexity of the system. This is also known as coupling; lesser the coupling the better is the design. On the other hand, higher the binding between elements within a module (known as cohesion) the better is the design." (Vasudeva Varma, "Software Architecture: A Case Based Approach", 2009)

    "As a general rule, implementations do not just spontaneously combust. Failures tend to stem from the aggregation of many issues. Although some issues may have been known since the early stages of the project (for example, the sales cycle or system design), implementation teams discover the majority of problems during the middle of the implementation, typically during some form of testing." (Phil Simon, "Why New Systems Fail: An Insider’s Guide to Successful IT Projects", 2010)

    "Implementation issues are not confined to the data and system realms. On the contrary, many of the problems encountered during a typical implementation stem from people, the roles they are required to play, political issues, and comfort zones." (Phil Simon, "Why New Systems Fail: An Insider’s Guide to Successful IT Projects", 2010)

    "Implementing new systems is not like baking a cake. Organizations cannot follow a recipe with the following ingredients: three consultants, six weeks of testing, two training classes, and a healthy dose of project management. Nor do projects bake for six months until complete, after which time everyone enjoys a delicious piece of cake. For all sorts of reasons, a well-conceived and well-run project may fail, whereas a horribly managed project may come in under budget, ahead of schedule, and do everything that the vendor promised at the onset." (Phil Simon, "Why New Systems Fail: An Insider’s Guide to Successful IT Projects", 2010)

    "Implementing new systems provides organizations with unique opportunities not only to improve their technologies, but to redefine and improve key business processes. Ultimately, for organizations to consider these new systems successes, the post-legacy environment must ensure that business processes, client end users, and systems work together." (Phil Simon, "Why New Systems Fail: An Insider’s Guide to Successful IT Projects", 2010)

    "Pre-implementation, post-implementation, and ongoing data audits are invaluable tools for organizations. Used judiciously by knowledgeable and impartial resources, audits can detect, avoid, and minimize issues that can derail an implementation or cause a live system to fail. Rather than view them as superfluous expenses, organizations would be wise to conduct them at key points throughout the system’s life cycle." (Phil Simon, "Why New Systems Fail: An Insider’s Guide to Successful IT Projects", 2010)

    "Agile approaches to software development consider design and implementation to be the central activities in the software process. They incorporate other activities, such as requirements elicitation and testing, into design and implementation. By contrast, a plan-driven approach to software engineering identifies separate stages in the software process with outputs associated with each stage." (Ian Sommerville, "Software Engineering" 9th Ed., 2011)

    "Good architecture provides good interfaces that separate the shear layers of its implementation: a necessity for evolution and maintenance. Class-oriented programming puts both data evolution and method evolution in the same shear layer: the class. Data tend to remain fairly stable over time, while methods change regularly to support new services and system operations. The tension in these rates of change stresses the design." (James O Coplien & Trygve Reenskaug, "The DCI Paradigm: Taking Object Orientation into the Architecture World", 2014)

    05 June 2007

    Software Engineering: Implementation (Definitions)

    "Carrying out of planned activity." (Timothy J  Kloppenborg et al, "Project Leadership", 2003)

    "(1) The process of translating a design into hardware components, software components, or both. Includes detailed design, coding (for software), fabrication and inspection (for hardware), and unit (component) test. For software, detailed design and coding are usually combined. (2) The result of the process in (1). Also called construction." (Richard D Stutzke, "Estimating Software-Intensive Systems: Projects, Products, and Processes", 2005)

    "The process of creating software from a design of that software. A physical database is an implementation of a database model." (Gavin Powell, "Beginning Database Design", 2006)

    "Deploying a software solution in a company is accomplished through an iterative sequence of activities. These activities are bundled into an implementation project." (Janice M Roehl-Anderson, "IT Best Practices for Financial Managers", 2010)

    "All organizational activities involved in the introduction, management, and acceptance of technology to support one or more organizational processes." (Linda Volonino & Efraim Turban, "Information Technology for Management" 8th Ed., 2011)

    "Installing and converting to use of a software application." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

    "Execution or fulfillment of a plan or design; putting into action." (Joan C Dessinger, "Fundamentals of Performance Improvement" 3rd Ed., 2012)

    "How a piece of code actually goes about doing its job. Users of the code should not count on implementation details staying the same unless they are part of the published interface." (Jon Orwant et al, "Programming Perl" 4th Ed., 2012)

    "The activity of making the essential requirements work in the real world." (James Robertson et al, "Complete Systems Analysis: The Workbook, the Textbook, the Answers", 2013)

    "When used by programmers, this term usually means writing the code. When used by managers, this often means deployment." (Rod Stephens, "Beginning Software Engineering", 2015)

    19 February 2007

    Software Engineering: White-box Testing (Definitions)

    "An implementation-based test, in contrast to a specification-based test" (Johannes Link & Peter Fröhlich, "Unit Testing in Java", 2003)

    "This test is derived knowing the inner structure of the software and based on the program code, design, interface descriptions, and so on. White-box tests are also called 'structure-based tests'." (Lars Dittmann et al, "Automotive SPICE in Practice", 2008)

    [white box test design technique:] "Any technique used to derive and/or select test cases based on an analysis of the internal structure of the test object (see also structural test)." (Tilo Linz et al, "Software Testing Foundations" 4th Ed., 2014)

    "A software testing methodology that examines the code of an application. This contrasts with black box testing, which focuses only on inputs and outputs of an application." (Mike Harwood, "Internet Security: How to Defend Against Attackers on the Web" 2nd Ed., 2015)

    "A test designed by someone who knows how the code works internally. That person can guess where problems may lie and create tests specifically to look for those problems." (Rod Stephens, "Beginning Software Engineering", 2015)

    "The testing method where test cases are generated in order to test a program at a source code level." (Pedro Delgado-Pérez et al, "Mutation Testing", 2015)

    "A testing technique to test the internal structure, design and coding of a software solution." (Pooja Kaplesh & Severin K Y Pang, "Software Testing, Software Engineering for Agile Application Development", 2020)

    "A test methodology that assumes explicit and substantial knowledge of the internal structure and implementation detail of the assessment object." (NIST SP 800-137)

    [white-box test design technique:] "Procedure to derive and select test cases based on an analysis of the internal structure of a component or system." (ISTQB)

    "Testing based on an analysis of the internal structure of the component or system." (ISTQB)

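    As a small illustration of the definitions above, here is an illustrative Python sketch (a hypothetical function and its tests): the test cases are derived from the internal structure of the code, one per branch plus the boundary implied by the strict comparison, which a purely specification-based (black-box) test would not guarantee.

import unittest

def apply_discount(amount: float, is_premium: bool) -> float:
    """Toy function under test: premium customers get 10% off orders above 100."""
    if is_premium and amount > 100:      # branch A
        return round(amount * 0.9, 2)
    return amount                        # branch B

class WhiteBoxDiscountTests(unittest.TestCase):
    # Test cases chosen by looking at the code's branches and boundaries,
    # not only at the specification (structure-based testing).

    def test_branch_a_premium_above_threshold(self):
        self.assertEqual(apply_discount(200.0, True), 180.0)

    def test_branch_b_premium_at_threshold(self):
        # Boundary chosen because the code uses a strict '>' comparison.
        self.assertEqual(apply_discount(100.0, True), 100.0)

    def test_branch_b_non_premium(self):
        self.assertEqual(apply_discount(200.0, False), 200.0)

if __name__ == "__main__":
    unittest.main()
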
    30 November 2006

    David Parmenter - Collected Quotes

    "All good KPIs that I have come across, that have made a difference, had the CEO’s constant attention, with daily calls to the relevant staff. [...] A KPI should tell you about what action needs to take place. [...] A KPI is deep enough in the organization that it can be tied down to an individual. [...] A good KPI will affect most of the core CSFs and more than one BSC perspective. [...] A good KPI has a flow on effect." (David Parmenter, "Pareto’s 80/20 Rule for Corporate Accountants", 2007)

    "If the KPIs you currently have are not creating change, throw them out because there is a good chance that they may be wrong. They are probably measures that were thrown together without the in-depth research and investigation KPIs truly deserve." (David Parmenter, "Pareto’s 80/20 Rule for Corporate Accountants", 2007)

    "Many management reports are not a management tool; they are merely memorandums of information. As a management tool, management reports should encourage timely action in the right direction, by reporting on those activities the Board, management, and staff need to focus on. The old adage “what gets measured gets done” still holds true." (David Parmenter, "Pareto’s 80/20 Rule for Corporate Accountants", 2007)

    "Reporting to the Board is a classic 'catch-22' situation. Boards complain about getting too much information too late, and management complains that up to 20% of their time is tied up in the Board reporting process. Boards obviously need to ascertain whether management is steering the ship correctly and the state of the crew and customers before they can relax and 'strategize' about future initiatives. The process of assessing the current status of the organization from the most recent Board report is where the principal problem lies. Board reporting needs to occur more efficiently and effectively for both the Board and management." (David Parmenter, "Pareto’s 80/20 Rule for Corporate Accountants", 2007)

    "Financial measures are a quantification of an activity that has taken place; we have simply placed a value on the activity. Thus, behind every financial measure is an activity. I call financial measures result indicators, a summary measure. It is the activity that you will want more or less of. It is the activity that drives the dollars, pounds, or yen. Thus financial measures cannot possibly be KPIs." (David Parmenter, "Key Performance Indicators: Developing, implementing, and using winning KPIs" 3rd Ed., 2015)

    "'Getting it right the first time' is a rare achievement, and ascertaining the organization’s winning KPIs and associated reports is no exception. The performance measure framework and associated reporting is just like a piece of sculpture: you can be criticized on taste and content, but you can’t be wrong. The senior management team and KPI project team need to ensure that the project has a just-do-it culture, not one in which every step and measure is debated as part of an intellectual exercise." (David Parmenter, "Key Performance Indicators: Developing, implementing, and using winning KPIs" 3rd Ed., 2015)

    "In order to get measures to drive performance, a reporting framework needs to be developed at all levels within the organization." (David Parmenter, "Key Performance Indicators: Developing, implementing, and using winning KPIs" 3rd Ed., 2015)

    "Key performance indicators (KPIs) are those indicators that focus on the aspects of organizational performance that are the most critical for the current and future success of the organization." (David Parmenter, "Key Performance Indicators: Developing, implementing, and using winning KPIs" 3rd Ed., 2015)

    "Key Performance Indicators (KPIs) in many organizations are a broken tool. The KPIs are often a random collection prepared with little expertise, signifying nothing. [...] KPIs should be measures that link daily activities to the organization’s critical success factors (CSFs), thus supporting an alignment of effort within the organization in the intended direction." (David Parmenter, "Key Performance Indicators: Developing, implementing, and using winning KPIs" 3rd Ed., 2015)

    "Most organizational measures are very much past indicators measuring events of the last month or quarter. These indicators cannot be and never were KPIs." (David Parmenter, "Key Performance Indicators: Developing, implementing, and using winning KPIs" 3rd Ed., 2015)

    "The traditional balanced-scorecard (BSC) approach uses performance measures to monitor the implementation of the strategic initiatives, and measures are typically cascaded down from a top-level organizational measure such as return on capital employed. This cascading of measures from one another will often lead to chaos, with hundreds of measures being monitored by staff in some form of BSC reporting application." (David Parmenter, "Key Performance Indicators: Developing, implementing, and using winning KPIs" 3rd Ed., 2015)

    "We need indicators of overall performance that need only be reviewed on a monthly or bimonthly basis. These measures need to tell the story about whether the organization is being steered in the right direction at the right speed, whether the customers and staff are happy, and whether we are acting in a responsible way by being environmentally friendly. These measures are called key result indicators (KRIs)." (David Parmenter, "Key Performance Indicators: Developing, implementing, and using winning KPIs" 3rd Ed., 2015)

    "Every day spent producing reports is a day less spent on analysis and projects." (David Parmenter)

