
08 March 2025

#️⃣Software Engineering: Programming (Part XVI: The Software Quality Perspective and AI)

Software Engineering Series

Organizations tend to complain about the poor quality of software developed in-house, by consultancy companies or by third parties, without doing much about it. Unfortunately, this fits the bigger picture reflected by the quality standards organizations adopt - people talk and complain about them, though they aren't that eager to include them in their strategies, and even when they are considered, they are seldom enforced adequately!

Moreover, even if quality standards are adopted, and a lot of effort may be spent in this direction (as everybody has strong opinions and there are many exceptions), as projects progress, all the good intentions come to an end, the rules fading along the way because they are too strict or too general, aren't adequately prioritized or communicated, or there's no time to implement (all of) them. This applies in general to programming and to the domains that revolve around data - Business Intelligence, Data Analytics or Data Science.

The volume of good quality code and deliverables is not only a reflection of an organization’s maturity in dealing with best practices but also of its maturity in handling technical debt, Project Management, software and data quality challenges. All these aspects are strongly related to each other and therefore require a systemic approach rather than focusing on the issues locally. The systemic approach allows organizations to bridge the gaps between business areas, teams, projects and any other areas of focus.

There are many questionable studies on the effect of methodologies on software quality and data issues, proclaiming that one methodology is better than another at addressing the manifold aspects of software quality. Besides methodologies, some studies attempt to correlate quality with organizations' size, management or programmers' experience, the size of the software, or whatever other characteristic might seem to affect quality.

Bad code is written independently of a company's size, a programmer's experience, or the maturity of management and of the organization. Bad code doesn't necessarily happen all at once; it can accumulate with circumstances - repeated team, requirement and code changes. There are decisions and actions that sooner or later affect the overall outcome negatively.

Rewriting the code from scratch might look like an approachable measure, though it's seldom the cost-effective solution. Allocating resources for refactoring is usually a better approach, though it tends to increase the cost of projects considerably, and organizations might be tempted to accept the risks instead, whatever they might be. Independently of the approach used, sooner or later the complexity of projects, requirements or code tends to kick back.

There are many voices arguing that AI will help address the problems of software development, quality assurance and probably other areas. It's questionable how much AI will help address the gaps, discrepancies and other mistakes in requirements, and how it will produce quality code when it has basic "understanding" issues. Even if the current issues revolving around AI are fixed step by step, it will take time and multiple iterations until meaningful progress is made.

At least for now, AI tools like Copilot or ChatGPT can be used for learning a programming language or framework through predefined or ad-hoc prompts. They can probably also be used to identify deviations from best practices or other norms in scope. This doesn't mean that AI will replace code reviews, testing and the other practices used in assuring software quality anytime soon, but it can serve as an additional check for what the other methods might have missed.
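For illustration, below is a minimal sketch of such an additional check, using the OpenAI Python SDK to ask a chat model to flag deviations from best practices in a code snippet. The model name and the reviewed snippet are assumptions made for the example, not a recommendation:

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical snippet to review; any code could be passed in instead.
snippet = '''
def get_user(id, users):
    for u in users:
        if u["id"] == id:
            return u
'''

# Ask the model to act as a reviewer; the output is only a hint for humans.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; substitute any chat model
    messages=[
        {"role": "system",
         "content": "You are a code reviewer. List deviations from common "
                    "Python best practices, one per line."},
        {"role": "user", "content": snippet},
    ],
)
print(response.choices[0].message.content)

Such output complements, rather than replaces, the established review and testing practices.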

AI may also hold hidden gems that, once discovered, polished and sized up, may have a qualitative impact on software development and on software itself. Only time will tell what's possible and achievable.

06 April 2024

🧭Business Intelligence: Why Data Projects Fail to Deliver Real-Life Impact (Part II: There's Value in Failure)

Business Intelligence Series

"Results are nothing; the energies which produce them
and which again spring from them are everything."
(Wilhelm von Humboldt, "On Language", 1836)

When data is not available but is needed on a continuous basis, the usual solution is to redesign the processes and make sure the data becomes available at the needed quality level. Redesign involves additional costs for the business; therefore, it might be tempting to cancel or postpone data projects, at least until they become feasible, though they seldom become feasible by themselves.

Just because there's a set of data doesn't mean that there is important knowledge to be extracted from it, or that the investment is feasible. There is, however, value in building experience in the internal resources, in identifying the challenges and the opportunities, and in identifying what needs to be changed to harness the data. Unfortunately, organizations expect that somebody else will do the work for them instead of making the jump themselves, and this approach will more likely fail. It's like expecting to get enlightened after a few theoretical sessions with a guru instead of walking the path oneself.

This is reflected also in organizations' readiness to make the efforts required for climbing the maturity scale. If organizations can't approach such topics systematically, address the assumptions, opportunities and risks adequately, and manage the various aspects involved, it's hard to believe that their data journey will be a positive one.

A data journey shouldn't be about politics, even if some minds need to be changed in the process, at management level as well as at lower levels. If the leadership doesn't recognize the importance of becoming an enabler for such initiatives, then the organization probably deserves to keep the status quo. The drive for change should come from the leadership, whether we talk about data culture, data strategy, decision-making, or any other critical aspect.

An organization will always need to find the balance between time, scope, cost, and quality, and this applies to operations, tactics, and strategies as well as to projects. There are hard limits and a lot of uncertainty associated with data projects and the tasks involved, limits reflected in cost and time estimations (which, frankly, are just experts' rough guesses that can change for the worse in the light of new information). Therefore, especially in data projects, one needs to be able to compromise, to change scope and timelines as seen fit, and, why not, to cancel projects if the objectives are no longer feasible or if compromises can't be reached.

An organization must be able to take risks and invest in failure, otherwise the opportunities for growth don't materialize. Being able to split a roadmap into small iterative steps, which besides breaking down the complexity also allow evaluating the progress and the knowledge gained, and incorporating the feedback into the next steps, can prove to be what organizations lack in coping with high uncertainty. Instead, organizations seem fascinated by the big bang, thinking that technology can automatically fill the organizational gaps.

Doing the same thing repeatedly and expecting different results is called insanity. Unfortunately, this is what organizations and service providers do when it comes to Project Management in general and data projects in particular. Building something without a foundation, without making sure that the employees have the skillset, maturity and culture to manage the data-related tasks, challenges and opportunities, is pure insanity!

Bottom line, harnessing data requires a certain maturity, and it starts with recognizing and pursuing opportunities, setting goals, following roadmaps, learning to fail and getting value from failure, and keeping failure under control. Growth or instant enlightenment without a fair amount of sweat is possible, though it's an exception granted to few!

Previous Post <<||>> Next Post

17 February 2024

🧭Business Intelligence: A Software Engineer's Perspective (Part II: Major Knowledge Gaps)

Business Intelligence Series

Solving a problem requires a certain degree of knowledge in the areas affected by the problem, a degree that grows exponentially with the problem's complexity. This requirement applies to scientific fields with a low allowance for errors, as well as to business scenarios where the allowance for errors is in theory more relaxed. Building a report or any other data artifact is closely connected with problem solving, as data artifacts are supposed to model the whole or parts of what is needed for solving the problem(s) in scope.

In general, creating data artifacts requires: (1) domain knowledge - knowledge of the concepts, processes, systems, data, data structures and data flows as available in the organization; (2) technical knowledge - knowledge about the tools, techniques, processes and methodologies used to produce the artifacts; (3) data literacy - critical thinking, the ability to understand and explore the implications of data, and the ability to communicate data in context; (4) activity management - managing the activities involved.

At a minimum, creating a report may require only narrower subsets of the areas mentioned above, depending on the complexity of the problem and the tasks involved. Ideally, a single person should be knowledgeable enough to handle all this alone, though that's seldom the case. Commonly, two or more parties are involved; let's consider the two-party scenario: on one side is the customer, who has (in theory) a deep understanding of the domain, and on the other side the data professional, who has (in theory) a deep understanding of the technical aspects. Ideally, both parties should be data literate and have some basic knowledge of the other party's domain.

To attack a business problem that requires one or more data artifacts, both parties need a common understanding of the problem to be solved and of the requirements, constraints, assumptions, expectations, risks, and other important aspects associated with it. It's critical for the data professional to acquire the domain knowledge required by the problem, otherwise the solution has a high chance of deviating from the expectations. The general issue is that there are multiple interactions, and they are iterative. Firstly, the interactions for building the needed common ground. Secondly, the interaction between the problem and reality. Thirdly, the interaction between the problem and the parties' mental models and understanding of the problem.

The outcome of these interactions is that the problem and its requirements go through several iterations in which knowledge from the previous iterations is incorporated successively. With each important piece of knowledge gained, it's important to revise and refine the question(s) and the problem itself. If each iteration also involves programming and further technical activities, the resulting effort and costs can explode, while the timeline expands accordingly.

There are several heuristics that could be devised to address these challenges: (1) build all the required knowledge in one person, either on the business or the technical side; (2) make sure that the parties have the knowledge required for approaching the problems in scope; (3) make sure that the gaps between reality and the parties' mental models are minimal; (4) make sure that the requirements are complete and understood before starting the development; (5) adhere to methodologies that accommodate the necessary iterations and the endeavor's particularities; (6) make sure that there's a halt condition for regularly reviewing the progress and, if needed, halting the work; (7) build an organizational culture that supports all this.

The list is open, and the heuristics aren't mutually exclusive, so in theory any combination of them can be considered. Ideally, an organization should reflect all these heuristics in one form or another. The higher the coverage, the more mature the organization. The question is how organizations with a suboptimal setup can change the status quo.

Previous Post <<||>> Next Post

15 February 2016

♜Strategic Management: Maturity (Definitions)

"The extent to which an organization has explicitly and consistently deployed processes that are documented, managed, measured, controlled, and continually improved. Organizational maturity may be measured via appraisals." (Sandy Shrum et al, "CMMI®: Guidelines for Process Integration and Product Improvement", 2003)

[process maturity:] "The extent to which an organization’s processes are defined, managed, measured, controlled, and continually improved. Process maturity implies continued improvement in the organization’s capability for performing its business activities, and indicates consistency in performing its processes throughout the organization." (Sally A Miller et al, "People CMM: A Framework for Human Capital Management 2nd Ed.", 2009)

[Organizational Project Management Maturity Model:] "A framework that defines knowledge, assessment, and improvement processes, based on Best Practices and Capabilities, to help organizations measure and mature their portfolio, program, and project management practices." (Project Management Institute, "Organizational Project Management Maturity Model (OPM3) 3rd Ed", 2013)

[Project Management Maturity:] "Project management processes measured by the ability of an organization to successfully initiate, plan, execute, and monitor and control individual projects. Project management maturity is limited to individual project execution and doesn't address key processes, Capabilities, or Best Practices at the organizational, portfolio, or program level. The focus of project management maturity is 'doing projects right'." (Project Management Institute, "Organizational Project Management Maturity Model (OPM3) 3rd Ed", 2013)

[Organizational Project Management Maturity:] "The level of an organization’s ability to deliver the desired strategic outcomes in a predictable, controllable, and reliable manner." (For Dummies, "PMP Certification All-in-One For Dummies" 2nd Ed., 2013)

"Within OPM3, maturity comprises not only the state of performance within portfolio, program, and project management, but also the organization's evolution toward that state as illustrated by SMCI." (Project Management Institute, "Organizational Project Management Maturity Model (OPM3) 3rd Ed., 2013)

"A measurement of the ability of an organization to undertake continuous improvement in a particular discipline." (Yassine Maleh et al, 'Strategic IT Governance and Performance Frameworks in Large Organizations", 2019)

"In relation to organizations or activities, the level of sophistication or development of a specific program or activity." (Sally-Anne Pitt, "Internal Audit Quality", 2014)

"(1) The capability of an organization with respect to the effectiveness and efficiency of its processes and work practices. (2) The capability of the software product to avoid failure as a result of defects in the software. [ISO 9126] See also reliability." (SQA)

"Measure of the reliability, efficiency and effectiveness of a process, function, etc." (ITIL)

29 March 2013

🔦Process Management: (Capability) Maturity Model [CMM] (Definitions)

[capability maturity model:] "A model that contains the essential elements of effective processes for one or more disciplines and describes an evolutionary improvement path from ad hoc, immature processes to disciplined, mature processes with improved quality and effectiveness." (Sandy Shrum et al, "CMMI®: Guidelines for Process Integration and Product Improvement", 2003)

[capability maturity model (CMM):] "A formal document describing the requirements for a 'good' process, using some structure or taxonomy. Process maturity models define how you “ought to” produce a product, and typically require that the process be defined, documented, taught, practiced, measured, improved, and enforced." (Richard D Stutzke, "Estimating Software-Intensive Systems: Projects, Products, and Processes", 2005)

"A model to categorize the maturity of an organization by different levels. Most famous are the Capability Maturity Model (CMM) and its successor, the Capability Maturity Model Integration (CMMI). Following this approach, many organizations have developed SOA maturity models." (Nicolai M Josuttis, "SOA in Practice", 2007)

"A Capability Maturity Model (CMM) is an evolutionary roadmap for implementing the vital practices from one or more domains of organizational process. It contains the essential elements of effective processes for one or more disciplines. It describes an evolutionary improvement path from an ad hoc, immature process to a disciplined, mature process with improved quality and effectiveness." (Sally A Miller et al, "People CMM: A Framework for Human Capital Management 2nd Ed.", 2009)

"A structured collection of characteristics of effective processes at progressive levels of quality and effectiveness. A maturity model provides a common language and a shared vision for process improvement, a standard for benchmarking, and a framework for prioritizing actions. A maturity model assumes a natural evolutionary path for organizational process improvement." (DAMA International, "The DAMA Dictionary of Data Management", 2011)

"A framework that describes, for a specific area of interest, a number of levels of sophistication at which activities in this area can be carried out." (Jim Davis & Aiman Zeid, "Business Transformation: A Roadmap for Maximizing Organizational Insights", 2014)

"First introduced by the Carnegie Mellon Software Engineering Institute in 1991 to improve the process of software development. However, their broader applicability was recognized, and the model was expanded in 2000 to apply to enterprise-wide process improvement." (Sally-Anne Pitt, "Internal Audit Quality", 2014)

[Capability Maturity Model Integration (CMMI):] "A process improvement approach that provides organizations with the essential elements of effective processes, which will improve their performance." (Adam Gordon, "Official (ISC)2 Guide to the CISSP CBK" 4th Ed.", 2015)

[capability maturity model integration (CMMI):] "A process model that captures the organization’s maturity and fosters continuous improvement." (Shon Harris & Fernando Maymi, "CISSP All-in-One Exam Guide" 8th Ed., 2018)

"A set of structured levels that describe how well an organization can reliably and sustainably produce required outcomes." (Yassine Maleh et al, 'Strategic IT Governance and Performance Frameworks in Large Organizations", 2019)

[Capability Maturity Model (CMM):] "A five level staged framework that describes the key elements of an effective software process. The Capability Maturity Model covers best practices for planning, engineering and managing software development and maintenance ." (IQBBA)

[Capability Maturity Model Integration (CMMI):] "A framework that describes the key elements of an effective product development and maintenance process. The Capability Maturity Model Integration covers best-practices for planning, engineering and managing product development and maintenance. (CMMI)

"A structured collection of elements that describe certain aspects of maturity in an organization, and aid in the definition and understanding of an organization's processes. A maturity model often provides a common language, shared vision and framework for prioritizing improvement actions." (SQA)

"A Maturity Model is a framework that is used as a benchmark for comparison when looking at an organisation's processes." (Experian) [source]

"A means of identifying and/or measuring the maturity of something of interest, such as a Service, Capability, Function, Skill, or Competency." (IF4IT)

20 December 2009

🧮ERP: Implementations (Part I: The Right ERP Software)

ERP Implementations Series

In ERP implementations, there are still many organizations that ignore their important needs and go with big names, following trends or first impressions, much like buying a car and going with the big brands, ignoring the fact that ERP systems come in different sizes and one size usually doesn't fit all! Poor choices add to the fact that an ERP implementation is much like a Pandora's box - no matter how confident and knowledgeable we like to feel about it, the end result will probably surprise us!

Ideally, an organization should start with a process of self-discovery, if that hasn't already been achieved! It should address the business strategy, which considers the business needs in terms of processes, procedures, roles, etc. Once such aspects are understood, an organization must decide which is the right infrastructure to support all this, and this may amount to more than choosing an ERP software! Focusing only on the ERP software and building around it can work as well. Independently of the approach, one should expect surprises as the strategy's execution proceeds!

Organizations might choose to talk with sales representatives or partners, and maybe have several presentations with Q&A and awareness content. The representatives' skills, convincing tone or business relations can be a decisive factor in choosing the solution, even though such meetings barely scratch the surface! Many vendors provide similar functionality, though, as usual, the devil lies in the details, and the gaps are usually discovered after the fact!

The initial meetings usually involve a mix of experienced and inexperienced people, and there are a lot of questions worth answering! What's an ERP system about? How well can the participants articulate the organization's needs and identify the details that make the biggest impact on the business? How much has the sales representative understood the customer's business and the overall context? After all, the representative is trying to sell a product! How willing is the representative to dive into the requirements, analyze them and identify feasible solutions? How deep do such meetings need to go? A few sales pitches are usually not enough! It might take weeks, a whole team of resources and multiple iterations until an accurate perspective is achieved, and even then, surprises will appear later!

Sometime after Go-Live, the organization will most probably understand what an ERP system is about, what the challenges were, what is missing from the bigger picture, what should have been done, how well the expectations were met, etc. Unfortunately, for some organizations the contact with reality will be harsh, probably harsher than expected. Some organizations will learn from the whole process and use the knowledge further, though there will also be many exceptions!

An organization must have a certain maturity when implementing an ERP solution, and the lack of maturity must be addressed by vendors and/or by the organizations themselves in order to increase the chances of success! The "learning by doing" adage can be applied to ERP implementations too, though the costs of experimentation and discovery are quite high in such projects!

The bottom line - vendors want to sell a product and profit at the customers' expense, often without being interested in whether the product fits the purpose, as long as more issues lead to more revenue. On the other hand, the customer wants an affordable, flexible solution that allows the business to reach a higher level of performance. The optimal solution lies somewhere in between, yet many of the solutions customers get are suboptimal. Probably something must change in how the various parties work together, and this is one of the most important challenges to be addressed!


Rewritten: Apr-2025

From where it started: "Top Ten ERP Software Predictions for 2010"

