
21 October 2023

Data Warehousing: Dynamics 365, the Data Lakehouse and the Medallion Architecture

Data Warehousing
Data Warehousing Series

An IT architecture is built and functions under a set of constraints that derive from the architecture’s components. Usually, if we want flexibility or to change something in one area, this might have an impact on another area. This rule applies to the usage of the medallion architecture as well!

In Data Warehousing the medallion architecture considers a multilayered approach to building a single source of truth, each layer denoting the quality of the data stored in the lakehouse [1]. For the moment three layers are defined - bronze for raw data, silver for validated data, and gold for enriched data. The concept seems sound considering that a Data Lake contains all types of raw data of different quality that need to be validated and prepared for reporting or other purposes.

On the other side there are systems like Dynamics 365 that synchronize the data in near-real-time to the Data Lake through various mechanisms at table and/or data entity level (think of data entities as views built on top of other tables or views). The databases behind them are relational and in theory the data should be of the quality needed by the business.

The greatest benefit of the serverless SQL pool is that it can be used to build near-real-time data analytics solutions on top of the files existing in the Data Lake, and the mechanism is quite simple. On top of such files external tables are built in the serverless SQL pool, tables that reflect the data model from the source systems. The external tables can be called like any other tables from the various database objects (views, stored procedures and table-valued functions). Thus, an enterprise data model with dimensions, fact-like and mart-like entities can be built on top of the synchronized files from the Data Lake. The Data Lakehouse (= Data Warehouse + Data Lake) thus created can be used for (enterprise) reporting and other purposes.
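
A minimal sketch of such an object - here a view over OPENROWSET rather than an external table, which would additionally require an external data source and file format - assuming the data were synchronized as CSV files and using a made-up storage path (in practice the format - CSV, Parquet or CDM - and the paths depend on the synchronization mechanism used):

-- view in serverless SQL pool over a table synchronized to the Data Lake (illustrative path)
CREATE VIEW dbo.CustTable
AS
SELECT *
FROM OPENROWSET(
    BULK 'https://mydatalake.dfs.core.windows.net/dynamics365/Tables/CustTable/*.csv',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0',
    HEADER_ROW = TRUE
) AS DAT

The dimensions, fact-like and mart-like entities mentioned above are then just further views or table-valued functions built on top of such objects.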

As long as there are no special requirements for data processing (e.g. flattening hierarchies, complex data processing, high performance, data cleaning), this approach allows reporting the data from the data sources in near-real-time (10-30 minutes), which can prove to be useful for operational and tactical reporting. Tapping into this model via standard Power BI and paginated reports is quite easy.

Now, if one uses the medallion approach and relies on pipelines to process the data, unless one is able to process the data in near-real-time or something comparable to it, a considerable delay will be introduced, a delay that can span from a couple of hours to one day. It's also true that having the data prepared as needed by the reports can increase performance considerably as compared to processing the logic at runtime. There are advantages and disadvantages to both approaches.

Probably, the most important scenario that needs to be handled is that of integrating data from different sources. If unique mappings between values exist, if unique references to the records from the other system are available in one system, respectively when a unique logic can be identified, the data integration can be handled in the serverless SQL pool.
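
As a minimal sketch (all object and column names below are made up for illustration), such an integration can take the form of a view that joins the two sources over a mapping table:

-- integrating customers from two systems via a mapping table (illustrative names)
CREATE VIEW dbo.vCustomers
AS
SELECT COALESCE(MAP.GlobalCustomerId, D365.AccountNum) CustomerId
, D365.CustomerName
, CRM.Segment
FROM dbo.D365Customers D365
     LEFT JOIN dbo.CustomerMappings MAP
       ON D365.AccountNum = MAP.D365AccountNum
     LEFT JOIN dbo.CrmCustomers CRM
       ON MAP.CrmCustomerId = CRM.CustomerId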

Unfortunately, when compared to on-premises or Azure SQL functionality, the serverless SQL pool has important constraints - it's not possible to use scalar UDFs, regular tables, recursive CTEs, etc. So, one needs to work around these limitations and in some cases use the Spark pool or pipelines. Thus, at least for exceptions and maybe for strategic reporting a medallion architecture can make sense and be used in parallel. However, imposing it on all the data can reduce flexibility!

Bottom line: consider the architecture against your requirements!

Previous Post <<||>> Next Post

[1] What is the medallion lakehouse architecture?
https://learn.microsoft.com/en-us/azure/databricks/lakehouse/medallion

18 October 2022

SQL Reloaded: Successive Price Increases/Discounts via Windowing Functions and CTEs

I was trying today to solve a problem that apparently requires recursive common table expressions, though they are not (yet) available in Azure Synapse serverless SQL pool. The problem can be summarized in the below table definition, in which, given a set of Products with an initial Sales price, one needs to apply Price Increases successively for each Cycle. The cumulated increase is simulated in the last column for each line.

Unfortunately, there is no SQL Server windowing function that allows multiplying the values of a column incrementally (similar to how a running total works). However, there’s a mathematical trick that can be used to transform a product into a sum of elements by applying the Exp (exponential) and Log (logarithm) functions (see Solution 1), and which frankly is more elegant than applying CTEs (see Solution 2).
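
In other words, the trick relies on the identity a*b*c = Exp(Log(a) + Log(b) + Log(c)), which allows replacing the missing 'running product' with a running total over logarithms, as done in Solution 1 below.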

-- create table with test data
SELECT *
INTO dbo.ItemPrices
FROM (VALUES ('ID001', 1000, 1, 1.02, '1.02')
, ('ID001', 1000, 2, 1.03, '1.02*1.03')
, ('ID001', 1000, 3, 1.03, '1.02*1.03*1.03')
, ('ID001', 1000, 4, 1.04, '1.02*1.03*1.03*1.04')
, ('ID002', 100, 1, 1.02, '1.02')
, ('ID002', 100, 2, 1.03, '1.02*1.03')
, ('ID002', 100, 3, 1.04, '1.02*1.03*1.04')
, ('ID002', 100, 4, 1.05, '1.02*1.03*1.04*1.05')
) DAT (ItemId, SalesPrice, Cycle, PriceIncrease, CumulatedIncrease)

-- reviewing the data
SELECT *
FROM dbo.ItemPrices

-- Solution 1: new sales prices with log & exp
SELECT ItemId
, SalesPrice
, Cycle
, PriceIncrease
, EXP(SUM(Log(PriceIncrease)) OVER(PARTITION BY ItemId ORDER BY Cycle)) CumulatedIncrease
, SalesPrice * EXP(SUM(Log(PriceIncrease)) OVER(PARTITION BY ItemId ORDER BY Cycle)) NewSalesPrice
FROM dbo.ItemPrices

-- Solution 2: new sales prices with recursive CTE
;WITH CTE 
AS (
-- initial record
SELECT ITP.ItemId
, ITP.SalesPrice
, ITP.Cycle
, ITP.PriceIncrease
, cast(ITP.PriceIncrease as decimal(38,6)) CumulatedIncrease
FROM dbo.ItemPrices ITP
WHERE ITP.Cycle = 1
UNION ALL
-- recursive part
SELECT ITP.ItemId
, ITP.SalesPrice
, ITP.Cycle
, ITP.PriceIncrease
, Cast(ITP.PriceIncrease * ITO.CumulatedIncrease as decimal(38,6))  CumulatedIncrease
FROM dbo.ItemPrices ITP
    JOIN CTE ITO
	  ON ITP.ItemId = ITO.ItemId
	 AND ITP.Cycle-1 = ITO.Cycle
)
-- final result
SELECT ItemId
, SalesPrice
, Cycle
, PriceIncrease
, CumulatedIncrease
, SalesPrice * CumulatedIncrease NewSalesPrice
FROM CTE
ORDER BY ItemId
, Cycle


-- validating the cumulated price increases (only last ones)
SELECT 1.02*1.03*1.03*1.04 
, 1.02*1.03*1.04*1.05

-- cleaning up
DROP TABLE IF EXISTS dbo.ItemPrices

Notes:
1. The logarithm in SQL Server’s implementation works only with positive numbers!
2. For simplification I transformed percentages (e.g. 1%) into values that are easier to multiply with (e.g. 1.01). The solution can be easily modified to consider discounts.
3. When CTEs are not available, one is forced to return to the methods used in SQL Server 2000 (I've been there) and thus use temporary tables or table variables with loops (a minimal sketch follows after these notes). Moreover, the logic can be encapsulated in multi-statement table-valued functions (see example), unfortunately, another feature not (yet) supported by serverless SQL pools.
4. Unfortunately, STRING_AGG, which concatenates values across rows, can be used only as an aggregate (e.g. with a GROUP BY clause) and not as a window function. Anyway, its result is useless without the availability of an Eval function in SQL (see example), however the Expr function available in data flows could be used as a workaround.
5. Even if I thought about the use of logarithms for transforming the product into a sum, I initially ignored the idea, thinking that the solution would be too complex to implement. So, the credit goes to another blog post. Kudos!
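
As a minimal sketch, the loop-based workaround mentioned in note 3 could look as follows on a regular SQL Server or Azure SQL database (it assumes the dbo.ItemPrices table created above is still available):

-- loop-based alternative with a table variable (for illustration only)
DECLARE @Results TABLE (
  ItemId varchar(10)
, SalesPrice int
, Cycle int
, PriceIncrease decimal(10,6)
, CumulatedIncrease decimal(38,6)
)

DECLARE @Cycle int = 1
DECLARE @MaxCycle int = (SELECT Max(Cycle) FROM dbo.ItemPrices)

-- first cycle
INSERT INTO @Results
SELECT ItemId, SalesPrice, Cycle, PriceIncrease, Cast(PriceIncrease as decimal(38,6))
FROM dbo.ItemPrices
WHERE Cycle = 1

-- remaining cycles, each multiplying the previous cumulated increase
WHILE @Cycle < @MaxCycle
BEGIN
  SET @Cycle += 1

  INSERT INTO @Results
  SELECT ITP.ItemId, ITP.SalesPrice, ITP.Cycle, ITP.PriceIncrease
  , Cast(ITP.PriceIncrease * RES.CumulatedIncrease as decimal(38,6))
  FROM dbo.ItemPrices ITP
       JOIN @Results RES
         ON ITP.ItemId = RES.ItemId
        AND RES.Cycle = ITP.Cycle - 1
  WHERE ITP.Cycle = @Cycle
END

SELECT ItemId, SalesPrice, Cycle, PriceIncrease, CumulatedIncrease
, SalesPrice * CumulatedIncrease NewSalesPrice
FROM @Results
ORDER BY ItemId, Cycle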

Happy coding!

07 March 2021

Project Management: Agile Manifesto Reloaded I (Introduction)

 

Project Management

There are so many books written on agile methodologies, each attempting to depict the realities of software development projects. There are many truths considered in them, though they seem to blend into a complex texture in which the writer usually takes the position of a preacher, contrasting the sins of the traditional methodologies with the agile principles. In extremis everything done in the past seems to be wrong, while the agile methods seem to be a panacea, which is seldom the case.

It has already been 20 years since the agile manifesto was published, and the methodologies adhering to its principles don’t seem to provide the expected success, suffering from the same chronic symptoms as their predecessors - they are poorly understood and implemented, tend to function after the hammer principle, respectively the software development projects still deliver poor results. Moreover, there are more and more professionals who raise their voice against agile practices.

Frankly, the principles behind the agile manifesto make sense. A project should by definition satisfy stakeholders’ requirements, ideally through regular deliveries that incorporate the needed functionality, while gradually seeking to get early feedback from customers, respectively involving the customer throughout the project’s duration, working together to deliver a feasible product. Moreover, self-organizing teams, face-to-face meetings, constant pace and technical excellence should allow minimizing the waste, respectively maximizing the efficiency in the project. Further aspects like simplicity, good design and architecture should establish a basis for success.

Re-reading the agile manifesto, even if each read pulls from experience more and more pros and cons, the manifesto continues to look like a Christmas wish-list. Even if the represented ideas make sense and satisfy a specific need, they are difficult to achieve in a project’s context and setup. Each wish introduces a constraint that brings with it its own limitations. Unfortunately, each policy introduced by a methodology follows the same pattern, no matter the methodology considered. Moreover, the wishes cover only a small subset of a project’s texture, are general and leave a lot of space for interpretation and implementation, though the same can be said about any principles that don’t provide a coherent worldview or a conceptual model.

The software development industry needs a coherent worldview that reflects its assumptions, models, characteristics, laws and challenges. Software Engineering (SE) attempts to provide such a worldview, though unfortunately it is too complex for many, and there seems to be a big divide when compared with the worldviews introduced by the various Project Management (PM) methodologies. Studying one or two PM methodologies, learning a few programming languages and even the hands-on experience on a few projects won’t fill the gaps in knowledge associated with the SE worldview.

Organizations don’t seem to see the need for professionals to have a formal education in SE. On the other side, employees are expected to have by default some of the skillset required, which is not the case. Besides understanding and implementing a technology there is a set of knowledge areas in which the IT professional must have at least a high-level knowledge if he/she is expected to think critically about the respective areas. Unfortunately, the lack of such knowledge sometimes leads to situations which can negatively impact projects.

Almost each important word from the agile manifesto pulls with it a set of concepts from an SE worldview – customer satisfaction, software delivery, working software, requirements management, change management, cooperation, teamwork, trust, motivation, communication, metrics, stakeholders’ management, good design, good architecture, lessons learned, performance management, etc. The manifesto needs to be regarded through SE’s eyeglasses if one expects value from it.

Previous Post <<||>> Next Post

19 May 2020

Project Management: Some Thoughts on Planning I

Mismanagement

One of the issues in Project Management (PM) planning is that the planner idealizes a resource and the activities performed by it much like a machine. Unlike machines, whose uptime can approach 100%, a human resource can work at most 90% of the available time (aka utilization time), the remaining 10% being typically associated with interruptions – internal emails and meetings, casual communications, pauses, etc. For resources split between projects or operations the utilization time can be at most 70%, however a realistic value is in general between 40% and 60% on average. What does this mean for a project?
So, if a resource has a volume of work W, the amount of time needed to complete the work would be at best W/UT, where UT is the utilization time of the respective resource. "At best" because in each project there is additional idle time resulting from waste-related activities – waiting for sign-off, for information, for another resource to complete their task, etc.
The utilization time is not the only factor to consider. Upon case, the delivered work can reach maybe on average 80% of the expected quality. This applies to documentation and concepts as well as to written code, bug testing and other project activities. To reach the range of 100% one will more likely need 4 times the effort associated with reaching 80% of the expected quality, however this value depends also on people’s professionalism and the degree to which the requirements were understood and are achievable. Therefore, the values used here can be regarded as "boundary" values.
Let’s consider a quality factor (QF) which has a value of 1 for 80%, with an increase of 0.25 for each 5% of quality increase. Thus, with an initial effort estimation of 100 days, this is how the resulting effort changes for various UT and QF values:

Considering that a project can target between 60% and 95% UT, and between 80% and 95% quality, for an initial estimation of 100 days the actual project duration can range between 117 and 292 days, where the lower, respectively the upper bound values are more realistic.
The model is simplistic as it doesn’t reflect the nonlinear aspect of the factors involved and the dependencies existing between them. It also doesn’t reflect the maturity of an organization in handling projects and the tasks involved. However, it can be used to increase the awareness of how the utilization time and expected quality can affect a project’s timeline, and to check whether one’s planning is realistic.
For example, at a project’s start one can target a UT of 70% and a quality of 85%, which for 100 days of estimated effort will result in about 178 days of actual effort. Now, dividing the value by the number of resources involved, e.g. 4, it results that the project could be finished in about 44.5 days. This value can then be compared with the actual plan in which the activities are listed.
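Spelled out as a simple calculation (assuming the model multiplies the estimated effort by QF and divides it by UT, which reproduces the figures above): actual effort = 100 x 1.25 / 0.70 ≈ 178 days, which divided by the 4 resources gives the circa 44.5 days.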
During the project it would be useful to look at how the UT changed and by how much, to understand the impact the change has on the project. For example, a decrease of 5% in utilization time can delay the project by 2.5 days, which is not much, though for a project of 1000 days we talk already about one month. Similarly, it will be helpful to check how much the quality deviated from the expectation, because a decrease in quality by 5% can result in an additional effort of 8 days, which for 1000 days would mean almost 4 months of delay.

13 May 2019

Software Engineering: Programming (Good Programmer, Bad Programmer)

Software Engineering
Software Engineering Series

The use of denominations like 'good' or 'bad' related to programmers and programming carries with it a thin separation between these two perceptional poles that represent the end results of the programming process, reflecting the quality of the code delivered, respectively the quality of a programmer’s effort and behavior as a whole. This means that the usage of the two denominations is often contextual, 'good' and 'bad' being moving points on an imaginary value scale with a wide range of values within and outside the interval determined by the two.

The 'good programmer' label is an idealization of the traits associated with being a programmer – analyzing and understanding the requirements, filling the gaps when necessary, translating the requirements into robust designs, developing quality code with a minimum of overwork, delivering on-time, being able to help others, to work as part of a (self-organizing) team and alone, when the project requires it, to follow methodologies, processes or best practices, etc. The problem with such a definition is that there's no fixed limit, considering that a programmer’s job description can include an extensive range of requirements.

The 'bad programmer' label is used in general when programmers (repeatedly) fail to reach others’ expectations, occasionally the labeling being done independently of one’s experience in the field. The volume of bugs and mistakes, the fuzziness of designs and of the code written, the lack of comments and documentation, the lack of adherence to methodologies, processes, best practices and naming conventions are often considered as indicators for such labels. Sometimes even the smallest mistakes or the wrong perceptions of one’s effort and abilities can trigger such labels.

Labeling people as 'good' or 'bad' has the tendency of reinforcing one's initial perception, in extremis leading to self-fulfilling prophecies - predictions that directly or indirectly cause themselves to become true, by the very terms on which the predictions came into being. Thus, when somebody labels another as 'good' or 'bad' he will more likely look for signs that reinforce his previous beliefs. This leads to situations in which 'good' programmers’ mistakes are more easily overlooked than 'bad' programmers' mistakes, even if the mistakes are similar.

A good label can in theory motivate, while a bad label can easily demotivate, though their effects vary from person to person. Such labels can easily become a problem for beginners, because they can easily affect beginners' perception of themselves. It’s so easy to forget that programming is a continuous learning process in which knowledge is relative and highly contextual, each person having strengths and weaknesses.

Each programmer has a particular set of skills that differentiate him from other programmers. Each programmer is unique, an aspect reflected in the code one writes. Expecting programmers to fit an ideal pattern is unrealistic. Instead of using labels one should attempt to strengthen the weaknesses and make adequate use of a person’s strengths. In this approach reside the seeds of personal growth and excellence.

There are also programmers who excel in certain areas - conceptual creativity, ability in problem identification, analysis and solving, speed, ingenuity of design and of making best use of the available tools, etc. Such programmers, as Randall Stross formulates it, “are an order of magnitude better” than others. The experience and skills harnessed with intelligence have this transformational power that is achievable by each programmer in time.

Even if we can’t always avoid such labeling, it’s important to become aware of the latent force the labels carry with them, the effect they have on our colleagues and teammates. A label can easily act as a boomerang, hitting us back long after it was thrown away.

12 May 2019

Software Engineering: Programming (Misconceptions about Programming II)

Software Engineering

Continuation

One of the organizational stereotypes is having a big room full of cubicles filled with employees. Even if programmers can work in such settings, improperly designed environments restrict to a certain degree the creativity and productivity, making employees' collaboration and socialization more difficult. Despite having dedicated meeting rooms, an important part of the communication occurs ad-hoc. In open spaces each transient interruption can easily lead inadvertently to loss of concentration, which leads to wasted time, as one needs to retake the thread of thought and review the last written code, and occasionally to bugs.

Programming is expected to be a 9 to 5 job with an effective working time of 8 hours. Subtracting the interruptions and the pauses one needs to take, the effective working time decreases to about 6 hours. In other words, to reach 8 hours of effective productivity one needs to work about 10 hours or so. Therefore, unless adequately planned, each project starts with about 20% overtime. Moreover, even if a task is planned to take 8 hours, given the need for information, the allocated time is split over multiple days. The higher the need for further clarifications, the higher the chances for the effort to expand. In extremis, the effort can double itself.

Spending extensive time in front of the computer can have adverse effects on programmers’ physical and psychological health. Time pressure and some of the negative behavior that occurs in working environments have the same effect. Also, the communication skills can suffer when they are not properly addressed. Unfortunately, few organizations give importance to these aspects, few offer a balance between work and free time, even if a programmer’s job best fits and requires such an approach. What’s even more unfortunate is when organizations ignore the overtime, taking it as part of the job description. It’s also one of the main reasons why programmers leave, why competent workforce is lost. In the end everyone’s replaceable, however what’s the price one must pay for it?

Trainings are typically offered within running projects as they can be easily billed. Besides the fact that this behavior takes time unnecessarily from a project’s schedule, it can easily make trainings ineffective when the programmers can’t immediately use the new knowledge. Moreover, considering resources that come and go, the unwillingness to invest in programmers can have incalculable effects on an organization's performance, respectively on programmers' personal development.

Organizations typically look for self-motivated resources, this request often encompassing the organization’s whole motivational strategy. Long projects feel like a marathon in which it is difficult to sustain the same rhythm for the whole duration of the project. Managers and team leaders need to work on programmers’ motivation if they need sustained performance. They must act as mentors and leaders altogether, not only control tasks’ status and rave and storm each time deviations occur. It’s easy to complain about the status quo without doing anything to address the existing issues (challenges).

Especially in dysfunctional teams, programmers believe that management can’t contribute much to a project’s technical aspects, while management sees little or no benefit in making developers an integral part of the project's decisional process. Moreover, the lack of transparency and communication leads to a wide range of frictions between the various parties.

Probably the most difficult to understand is people’s stubbornness in expecting different results while following the same methods, and in ignoring common sense. It’s bewildering the easiness with which people ignore technological and Project Management principles and best practices. It resides in human nature to learn the hard way despite the warnings of the experienced, however, despite the negative effects there’s often minimal learning in the process...

To be eventually continued…

Software Engineering: Programming (Misconceptions about Programming - Part I)

Software Engineering
Software Engineering Series

Besides equating the programming process with a programmer’s capabilities and minimizing the importance of programming and programmers’ skills in the whole process (see the previous post), there are several other misconceptions about programming that influence the process' outcomes.


Having a deep knowledge of a programming language allows programmers to easily approach other programming languages, however each language has its own learning curve, ranging from a few weeks to half a year or more. The learning curve depends on the complexity of the languages known and of the language to be learned, the same applying to frameworks and architectures, the scenarios in which the languages are used, etc. One unrealistic expectation is that programmers are capable of learning a new programming language or framework overnight, this expectation pushing more pressure on programmers’ shoulders as they need to compensate in a short time for the knowledge gap. No, the programming languages are not the same, even if there’s high resemblance between them!

There’s a lot of code available online, and many of the programming tasks involve writing similar code. This makes people assume that programming can be reduced to copy-paste activities and, in extremis, that there’s no creativity in the act of programming. Besides the fact that using others’ code comes with certain copyright limitations, copy-pasting code is in general a way of introducing bugs in software. One can learn a lot from others’ code, though programmers' challenge resides in writing better code, in reusing code while finding the right level of abstraction.
 
There’s a tendency on the market to build whole applications using wizard-like functionality and to generate source code based on data or ontological models. Such approaches work in a range of (limited) scenarios, and even if the trend is to automate as much as possible in the process, this is not what programming is about. Each such tool comes with its own limitations that sooner or later will push back. Changing the generated code in order to build new functionality or to optimize it is often not a feasible solution as it imposes further limitations.

Programming is not only about writing code. It also involves problem-solving abilities and a certain understanding of the business processes, in which conceptual creativity and ingenuity of design can prove to be good assets. Modelling and implementing processes help programmers gain a unique perspective within a business.

For a programmer the learning process never stops. The release cycle for the known tools becomes smaller, each release bringing a new set of functionalities. Moreover, there are always new frameworks, environments, architectures and methodologies to learn. There’s a considerable amount of effort in expanding one's (necessary) knowledge, effort usually not planned in projects or outside of them. Trainings help in the process, though they hardly scratch the surface. Often the programmer is forced to fill the knowledge gap in his free time. This adds up to the volume of overtime one must do on projects. In the long run it becomes challenging to find the needed time for learning.

In resource planning there’s the tendency to add or replace resources on projects, while neglecting the influence this might have on the project and its timeline. Each new resource needs some time to accommodate to the role, to understand the project requirements, to take over the work of another. Moreover, resources are replaced on projects with minimal or even without the knowledge transfer necessary for the job ahead. Unfortunately, the same behavior occurs in consultancy as well, consultants being moved from one known functional area into another unknown area, changing resources like engines between different types of cars, expecting that everything will work like magic.

10 May 2019

Data Warehousing: Data Warehousing and Microsoft Dynamics 365

Data Warehousing

With Dynamics 365 (D365) Online Microsoft made an important strategic move on the ERP market, however in what concerns the BI & Data Warehousing (BI/DW) area Microsoft changed the rules of the game by allowing no direct SQL access to the production environment. This primarily means that it will become challenging for organizations to use the existing DW infrastructure to access the D365 data, and for Vendors and Service Providers to provide BI/DW solutions integrated within the D365 platform.

D365 includes its own data warehouse (actually a data mart) designed for financial reporting, however for now it can’t be extended to support other business areas. The solution favored by Microsoft for DW seems to be the use of an Azure SQL Database, aka BYOD (Bring Your Own Database), to which entity-based data can be exported incrementally (aka incremental push) or fully (aka full push) via the Data Management Framework (DMF) packages.

Because many of the D365 tables (e.g. Inventory Transactions, Products, Customers, Vendors) were overnormalized over the years and other tables were added as part of new functionality, to hide this complexity Microsoft introduced a new layer of abstraction formed from data entities organized within an entity store. Data entities are view-like encapsulations of the underlying D365 table schema, the data import/export from and to D365 being performed extensively over these data entities via the DMF, which extends the Data Import/Export Framework (DIXF).

One can thus use a BYOD as a direct source for other reporting tools as long as they support a connection to Azure, otherwise the data can be loaded further into a database in the cloud, which seems to be the best option so far, as long as the organization has other data that need to be consolidated for reporting. From here on, one deals with the traditional way of reporting and the available infrastructure can be extended to use an additional data source.

The BYOD solution comes with several restrictions: a package needs to be created for each business unit, no composite data entities can be exported, data entities that don’t have a unique key can’t be exported via an incremental push, data entities can change over time (new versions becoming available), while during synchronization no active locks should be on the database. In addition, organizations which followed this path report also some bugs that needed to be addressed via Microsoft support. Even if the about 1700 available data entities facilitate data consumption to some degree, they seem to be more appropriate for data migrations and data integrations than for DW workloads.

In the absence of direct SQL connectivity, in theory organizations can still use SSIS or similar integration tools to connect to D365 production environments and consume data entities via the Open Data Protocol (OData), a standard that defines a set of best practices for building and consuming RESTful APIs. Besides some architectural challenges, loading big tables with transactional data is reportedly slow and impracticable for loading a data warehouse. Therefore, the usability of such an architecture becomes limited in time.

Microsoft imposed a hard limitation upon its D365 architecture by making its production database inaccessible. Of course, there’s still time for Microsoft to do some magic and pull new solutions from the technology stack hat. Unfortunately, the constraints imposed on the production environments limit organizations’ choices in building a modern and flexible data warehouse. For the future it would be great if the DMF could be used directly with standard SQL Server databases, thus avoiding the need for the intermediary Azure database, or if a real-time operational solution could be provided out-of-the-box. We’ll see what the future brings...

07 May 2019

Business Intelligence: How Big Is Your Report?

Business Intelligence

How big are your reports? How big do reports need to be? Do your reports really reflect your needs? Have they become too cluttered with data? Do you have too many reports on the same topic? How many is too many? These are a few of the questions BI developers and users should ask themselves together from time to time.

A report is any document with textual and/or graphical formatted output of data from one or more data sources, (previously) designed to convey a basis for decision making or operational activities. A report is characterized by the amount of data it holds (the datasets), the amount of data it is based on (the source data), the number and complexity of the queries on which the report is based, the number of data sources, the manner in which data are structured (tabular, matrix, graphical), the filtering and sorting possibilities, as well as by the navigability possibilities (drilldown, drill-through, slice-and-dice, etc.).

On the other side, important characteristics for users are the report’s performance, the amount of useful information it conveys, the degree to which it helps address a business need, the quality of data, the degree to which it satisfies the various policies, the look and feel, and the possibility of exporting the data to standard file formats.

A report’s size is typically defined by the product of the columns and records the report displays, plus the formatting and the various types of graphical content, however this depends on the filter criteria used. Usually the average size of a report based on the typical filters is considered. Nowadays networks and database-specific techniques allow displaying fairly big reports (20-50 MB) in a fair amount of time (10-20 seconds) without affecting the network’s, respectively the database’s performance, which for most of the requests should be enough. When the users need bigger volumes of data then a direct data dump (extract) from the database should be considered, when possible. (A data export is not a report and they should be differentiated as such.)

The number of records that can be shown in a report depends on the reporting framework’s capabilities, e.g. there are reporting tools that cope well with showing a few thousand records but have difficulties in showing or exporting tens of thousands of records. The best example in this respect is Excel and its well-known limitation of 65536 records (2^16) and 256 columns (2^8), which in the meantime has been addressed in Excel 2007 and enlarged to about 1 million records (2^20), respectively 16384 columns (2^14). Even so, reporting tools that use older drivers can fail exporting all the data to Excel when the former limitation is reached.

In general, reports with too many columns tend to obfuscate data's understanding and are more difficult to navigate. The more the user needs to scroll horizontally, the higher in general the obfuscation. If the users really need 50 columns then they should be provided, however in general 20-25 should be enough for an operational report. Tactical and strategic reports need a restrained focus and the information should be provided on a single screen without the need of scrolling.

When reports get too big it is recommended to split them into two or more reports to address specific requirements, however this can lead to too many distinct reports, and further to duplication of the effort for creating and documenting them, and to duplication of logic and data. Therefore, the challenge is to find the right balance between the volume of reports, their usability and the effort needed to manage them. In certain scenarios it even makes sense to consolidate similar reports.

22 April 2019

Project Management: The Choice of Tools in Project Management

Mismanagement

"Beware the man of one book" (in Latin, "homo unius libri") is a warning generally attributed to Thomas Aquinas and having a twofold meaning. In its original interpretation it referred to people mastering a single chosen discipline, however the meaning degenerated into expressing the limitations of people who master just one book, and thus have a limited toolset of perspectives, mental models or heuristics. This later meaning is better reflected in Abraham Maslow’s adage: "If the only tool you have is a hammer, you tend to see every problem as a nail", as people tend to use the tools they are used to also in situations in which other tools are more appropriate.

The stubbornness of people and even organizations in using the same tools in totally different scenarios while expecting the same results, as well as in similar scenarios while expecting different results, is sometimes admirable. It’s true, Mathematics has proven that the same techniques can be used successfully in different areas, however a mathematician’s universe and models are idealistically detached to a certain degree from reality, full of simplified patterns and never-ending approximations. In contrast, the universe of Software Development and Project Management has a texture of complex patterns with multiple levels of dependencies and constraints, constraints highly sensitive to the initial conditions.

Project Management has managed to successfully derive tools like methodologies, processes, procedures, best practices and guidelines to address the realities of projects, however their use in praxis seems to be quite challenging. Probably, the challenge resides in the stubbornness of not adapting the tools to the difficulties and tasks met. Even if the same phases and multiple similarities seem to exist, the process of building a house or another tangible artefact is quite different from the approaches used in the development and implementation of software.

Software projects have high variability and are often explorative in nature. The end-product looks totally different than the initial scaffold. The technologies used come with opportunities and limitations that are difficult to predict in the planning phase. What on paper seems to work often doesn’t work in praxis as the devil lies typically in details. The challenges and limitations vary between industries, businesses and even projects within the same organization.

Even if for each project type there’s a methodology more suitable than another, in the end the project's particularities might pull the choice in one direction or another. Business Intelligence projects, for example, can benefit from agile approaches as they make it possible to better manage and deliver value by adapting the requirements to business needs as the project progresses. An agile approach works almost always better than a waterfall process. In contrast, ERP implementations seldom benefit from agile methodologies given the complexity of the project, which makes planning a real challenge, however this depends also on an organization’s dynamics.
Especially when an organization has good experience with a methodology there’s the tendency to use the same methodology across all the projects run within the organization. This results in chopping down a project to fit an ideal form, which might be fine as long as the particularities of each project are adequately addressed. Even if one methodology is not appropriate for a given scenario it doesn’t mean it can’t be used for it, however in the final equation enter also the cost, time, effort, and the quality of the end-results.
In general, one can cope with complexity by leveraging a broader set of mental models, heuristics and tools, and this can be done only through experimentation, through training and exposing employees to new types of experiences, through openness, through adapting the tools to the challenges ahead.

28 October 2018

Data Science: Limits (Just the Quotes)

"Whatever lies beyond the limits of experience, and claims another origin than that of induction and deduction from established data, is illegitimate." (George H Lewes, "The Foundations of a Creed", 1875)

"It is difficult to understand why statisticians commonly limit their inquiries to Averages, and do not revel in more comprehensive views. Their souls seem as dull to the charm of variety as that of the native of one of our flat English counties, whose retrospect of Switzerland was that, if its mountains could be thrown into its lakes, two nuisances would be got rid of at once. An Average is but a solitary fact, whereas if a single other fact be added to it, an entire Normal Scheme, which nearly corresponds to the observed one, starts potentially into existence." (Sir Francis Galton, "Natural Inheritance", 1889)

"Physical research by experimental methods is both a broadening and a narrowing field. There are many gaps yet to be filled, data to be accumulated, measurements to be made with great precision, but the limits within which we must work are becoming, at the same time, more and more defined." (Elihu Thomson, "Annual Report of the Board of Regents of the Smithsonian Institution", 1899)

"Statistics may, for instance, be called the science of counting. Counting appears at first sight to be a very simple operation, which any one can perform or which can be done automatically; but, as a matter of fact, when we come to large numbers, e.g., the population of the United Kingdom, counting is by no means easy, or within the power of an individual; limits of time and place alone prevent it being so carried out, and in no way can absolute accuracy be obtained when the numbers surpass certain limits." (Sir Arthur L Bowley, "Elements of Statistics", 1901)

"The usefulness of the models in constructing a testable theory of the process is severely limited by the quickly increasing number of parameters which must be estimated in order to compare the predictions of the models with empirical results" (Anatol Rapoport, "Prisoner's Dilemma: A study in conflict and cooperation", 1965)

"A real change of theory is not a change of equations - it is a change of mathematical structure, and only fragments of competing theories, often not very important ones conceptually, admit comparison with each other within a limited range of phenomena." (Yuri I Manin, "Mathematics and Physics", 1981)

"Models are often used to decide issues in situations marked by uncertainty. However statistical differences from data depend on assumptions about the process which generated these data. If the assumptions do not hold, the inferences may not be reliable either. This limitation is often ignored by applied workers who fail to identify crucial assumptions or subject them to any kind of empirical testing. In such circumstances, using statistical procedures may only compound the uncertainty." (David A Greedman & William C Navidi, "Regression Models for Adjusting the 1980 Census", Statistical Science Vol. 1 (1), 1986)

"[…] an honest exploratory study should indicate how many comparisons were made […] most experts agree that large numbers of comparisons will produce apparently statistically significant findings that are actually due to chance. The data torturer will act as if every positive result confirmed a major hypothesis. The honest investigator will limit the study to focused questions, all of which make biologic sense. The cautious reader should look at the number of ‘significant’ results in the context of how many comparisons were made." (James L Mills, "Data torturing", New England Journal of Medicine, 1993)

"In spite of the insurmountable computational limits, we continue to pursue the many problems that possess the characteristics of organized complexity. These problems are too important for our well being to give up on them. The main challenge in pursuing these problems narrows down fundamentally to one question: how to deal with systems and associated problems whose complexities are beyond our information processing limits? That is, how can we deal with these problems if no computational power alone is sufficient?"  (George Klir, "Fuzzy sets and fuzzy logic", 1995)

"The larger, more detailed and complex the model - the less abstract the abstraction – the smaller the number of people capable of understanding it and the longer it takes for its weaknesses and limitations to be found out." (John Adams, "Risk", 1995)

"[...] the NFL theorems mean that if an algorithm does particularly well on average for one class of problems then it must do worse on average over the remaining problems. In particular, if an algorithm performs better than random search on some class of problems then in must perform worse than random search on the remaining problems. Thus comparisons reporting the performance of a particular algorithm with a particular parameter setting on a few sample problems are of limited utility. While such results do indicate behavior on the narrow range of problems considered, one should be very wary of trying to generalize those results to other problems." (David H Wolpert & William G Macready, "No free lunch theorems for optimization", IEEE Transactions on Evolutionary Computation 1 (1), 1997)

"No comparison between two values can be global. A simple comparison between the current figure and some previous value and convey the behavior of any time series. […] While it is simple and easy to compare one number with another number, such comparisons are limited and weak. They are limited because of the amount of data used, and they are weak because both of the numbers are subject to the variation that is inevitably present in weak world data. Since both the current value and the earlier value are subject to this variation, it will always be difficult to determine just how much of the difference between the values is due to variation in the numbers, and how much, if any, of the difference is due to real changes in the process." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"[…] an obvious difference between our best classifiers and human learning is the number of examples required in tasks such as object detection. […] the difficulty of a learning task depends on the size of the required hypothesis space. This complexity determines in turn how many training examples are needed to achieve a given level of generalization error. Thus the complexity of the hypothesis space sets the speed limit and the sample complexity for learning." (Tomaso Poggio & Steve Smale, "The Mathematics of Learning: Dealing with Data", Notices of the AMS, 2003)

"Every number has its limitations; every number is a product of choices that inevitably involve compromise. Statistics are intended to help us summarize, to get an overview of part of the world’s complexity. But some information is always sacrificed in the process of choosing what will be counted and how. Something is, in short, always missing. In evaluating statistics, we should not forget what has been lost, if only because this helps us understand what we still have." (Joel Best, "More Damned Lies and Statistics : How numbers confuse public issues", 2004)

"Statistics depend on collecting information. If questions go unasked, or if they are asked in ways that limit responses, or if measures count some cases but exclude others, information goes ungathered, and missing numbers result. Nevertheless, choices regarding which data to collect and how to go about collecting the information are inevitable." (Joel Best, "More Damned Lies and Statistics: How numbers confuse public issues", 2004)

"The Bayesian approach is based on the following postulates: (B1) Probability describes degree of belief, not limiting frequency. As such, we can make probability statements about lots of things, not just data which are subject to random variation. […] (B2) We can make probability statements about parameters, even though they are fixed constants. (B3) We make inferences about a parameter θ by producing a probability distribution for θ. Inferences, such as point estimates and interval estimates, may then be extracted from this distribution." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"A population that grows logistically, initially increases exponentially; then the growth lows down and eventually approaches an upper bound or limit. The most well-known form of the model is the logistic differential equation." (Linda J S Allen, "An Introduction to Mathematical Biology", 2007)

"Humans have difficulty perceiving variables accurately […]. However, in general, they tend to have inaccurate perceptions of system states, including past, current, and future states. This is due, in part, to limited ‘mental models’ of the phenomena of interest in terms of both how things work and how to influence things. Consequently, people have difficulty determining the full implications of what is known, as well as considering future contingencies for potential systems states and the long-term value of addressing these contingencies. " (William B. Rouse, "People and Organizations: Explorations of Human-Centered Design", 2007)

"The methodology of feedback design is borrowed from cybernetics (control theory). It is based upon methods of controlled system model’s building, methods of system states and parameters estimation (identification), and methods of feedback synthesis. The models of controlled system used in cybernetics differ from conventional models of physics and mechanics in that they have explicitly specified inputs and outputs. Unlike conventional physics results, often formulated as conservation laws, the results of cybernetical physics are formulated in the form of transformation laws, establishing the possibilities and limits of changing properties of a physical system by means of control." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"All graphics present data and allow a certain degree of exploration of those same data. Some graphics are almost all presentation, so they allow just a limited amount of exploration; hence we can say they are more infographics than visualization, whereas others are mostly about letting readers play with what is being shown, tilting more to the visualization side of our linear scale. But every infographic and every visualization has a presentation and an exploration component: they present, but they also facilitate the analysis of what they show, to different degrees." (Alberto Cairo, "The Functional Art", 2011)

"There are limits on the data we can gather and the kinds of experiments we can perform."(Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Learning theory claims that a machine learning algorithm can generalize well from a finite training set of examples. This seems to contradict some basic principles of logic. Inductive reasoning, or inferring general rules from a limited set of examples, is not logically valid. To logically infer a rule describing every member of a set, one must have information about every member of that set." (Ian Goodfellow et al, "Deep Learning", 2015)

"Science’s predictions are more trustworthy, but they are limited to what we can systematically observe and tractably model. Big data and machine learning greatly expand that scope. Some everyday things can be predicted by the unaided mind, from catching a ball to carrying on a conversation. Some things, try as we might, are just unpredictable. For the vast middle ground between the two, there’s machine learning." (Pedro Domingos, "The Master Algorithm", 2015)

"To make progress, every field of science needs to have data commensurate with the complexity of the phenomena it studies. [...] With big data and machine learning, you can understand much more complex phenomena than before. In most fields, scientists have traditionally used only very limited kinds of models, like linear regression, where the curve you fit to the data is always a straight line. Unfortunately, most phenomena in the world are nonlinear. [...] Machine learning opens up a vast new world of nonlinear models." (Pedro Domingos, "The Master Algorithm", 2015)

"Repeated observations of the same phenomenon do not always produce the same results, due to random noise or error. Sampling errors result when our observations capture unrepresentative circumstances, like measuring rush hour traffic on weekends as well as during the work week. Measurement errors reflect the limits of precision inherent in any sensing device. The notion of signal to noise ratio captures the degree to which a series of observations reflects a quantity of interest as opposed to data variance. As data scientists, we care about changes in the signal instead of the noise, and such variance often makes this problem surprisingly difficult." (Steven S Skiena, "The Data Science Design Manual", 2017)

"Regularization is particularly important when the amount of available data is limited. A neat biological interpretation of regularization is that it corresponds to gradual forgetting, as a result of which 'less important' (i.e., noisy) patterns are removed. In general, it is often advisable to use more complex models with regularization rather than simpler models without regularization." (Charu C Aggarwal, "Neural Networks and Deep Learning: A Textbook", 2018)

"The no free lunch theorems set limits on the range of optimality of any method. That is, each methodology has a ‘catchment area’ where it is optimal or nearly so. Often, intuitively, if the optimality is particularly strong then the effectiveness of the methodology falls off more quickly outside its catchment area than if its optimality were not so strong. Boosting is a case in point: it seems so well suited to binary classification that efforts to date to extend it to give effective classification (or regression) more generally have not been very successful. Overall, it remains to characterize the catchment areas where each class of predictors performs optimally, performs generally well, or breaks down." (Bertrand S Clarke & Jennifer L. Clarke, "Predictive Statistics: Analysis and Inference beyond Models", 2018)

"Unless we’re collecting data ourselves, there’s a limit to how much we can do to combat the problem of missing data. But we can and should remember to ask who or what might be missing from the data we’re being told about. Some missing numbers are obvious […]. Other omissions show up only when we take a close look at the claim in question." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Despite their predictive power, most analytics and data science practices ignore relationships because it has been historically challenging to process them at scale." (Jesús Barrasa et al, "Knowledge Graphs: Data in Context for Responsive Businesses", 2021)

"Visualisation is fundamentally limited by the number of pixels you can pump to a screen. If you have big data, you have way more data than pixels, so you have to summarise your data. Statistics gives you lots of really good tools for this." (Hadley Wickham)

28 November 2016

Strategic Management: Limits (Just the Quotes)

"Weak character coupled with honored place, meager knowledge with large plans, limited powers with heavy responsibility, will seldom escape disaster." ("I Ching" ["Book of Changes"], cca. 600 BC)

"[...] authority for given tasks is limited to that for which an individual may properly held responsible." (Harold Koontz & Cyril O Donnell, "Principles of Management", 1955)

"Another approach to management theory, undertaken by a growing and scholarly group, might be referred to as the decision theory school. This group concentrates on rational approach to decision-the selection from among possible alternatives of a course of action or of an idea. The approach of this school may be to deal with the decision itself, or to the persons or organizational group making the decision, or to an analysis of the decision process. Some limit themselves fairly much to the economic rationale of the decision, while others regard anything which happens in an enterprise the subject of their analysis, and still others expand decision theory to cover the psychological and sociological aspect and environment of decisions and decision-makers." (Harold Koontz, "The Management Theory Jungle," 1961)

"The concept of leadership has an ambiguous status in organizational practice, as it does in organizational theory. In practice, management appears to be of two minds about the exercise of leadership. Many jobs are so specified in content and method that within very broad limits differences among individuals become irrelevant, and acts of leadership are regarded as gratuitous at best, and at worst insubordinate." (Daniel Katz & Robert L Kahn, "The Social Psychology of Organizations", 1966)

"Good mission statements focus on a limited number of goals, stress the company's major policies and values, and define the company's major competitive scopes." (Philip Kotler, "Marketing Management", 1967)

"Taking no action to solve these problems is equivalent of taking strong action. Every day of continued exponential growth brings the world system closer to the ultimate limits of that growth. A decision to do nothing is a decision to increase the risk of collapse." (Donella Meadows et al, "The Limits to Growth", 1972) 

"Leadership is lifting a person's vision to higher sights, the raising of a person's performance to a higher standard, the building of a personality beyond its normal limitations." (Peter Drucker, "Management: Tasks, Responsibilities, Challenges", 1973)

"The greater the uncertainty, the greater the amount of decision making and information processing. It is hypothesized that organizations have limited capacities to process information and adopt different organizing modes to deal with task uncertainty. Therefore, variations in organizing modes are actually variations in the capacity of organizations to process information and make decisions about events which cannot be anticipated in advance." (John K Galbraith, "Organization Design", 1977)

"In business as on the battlefield, the object of strategy is to bring about the conditions most favorable to one's own side, judging precisely the right moment to attack or withdraw and always assessing the limits of compromise correctly. Besides the habit of analysis, what marks the mind of the strategist is an intellectual elasticity or flexibility that enables him to come up with realistic responses to changing situations, not simply to discriminate with great precision among different shades of gray." (Kenichi Ohmae, "The Mind Of The Strategist", 1982)

"The key mission of contemporary management is to transcend the old models which limited the manager's role to that of controller, expert or morale booster. These roles do not produce the desired result of aligning the goals of the employees and the corporation. [...] These older models, vestiges of a bygone era, have served their function and must be replaced with a model of the manager as a developer of human resources." (Michael Durst, "Small Systems World", 1985)

"[…] new insights fail to get put into practice because they conflict with deeply held internal images of how the world works [...] images that limit us to familiar ways of thinking and acting. That is why the discipline of managing mental models - surfacing, testing, and improving our internal pictures of how the world works - promises to be a major breakthrough for learning organizations." (Peter Senge, "The Fifth Discipline: The Art and Practice of the Learning Organization", 1990)

"A process perspective sees not individual tasks in isolation, but the entire collection of tasks that contribute to a desired outcome. Narrow points of view are useless in a process context. It just won't do for each person to be concerned exclusively with his or her own limited responsibility, no matter how well these responsibilities are met. When that occurs, the inevitable result is working at cross–purpose, misunderstanding, and the optimization of the part at the expense of the whole. Process work requires that everyone involved be directed toward a common goal; otherwise, conflicting objectives and parochial agendas impair the effort."  (James A Champy & Michael M Hammer, "Reengineering the Corporation", 1993)

"Strategy renders choices about what not to do as important as choices about what to do. Indeed, setting limits is another function of leadership. Deciding which target group of customers, varieties, and needs the company should serve is fundamental to developing a strategy. But so is deciding not to serve other customers or needs and not to offer certain features or services. Thus strategy requires constant discipline and clear communication. Indeed, one of the most important functions of an explicit, communicated strategy is to guide employees in making choices that arise because of trade-offs in their individual activities and in day-to-day decisions." (Michael E Porter, "What is Strategy?", Harvard Business Review, 1996)

"Ideas about organization are always based on implicit images or metaphors that persuade us to see, understand, and manage situations in a particular way. Metaphors create insight. But they also distort. They have strengths. But they also have limitations. In creating ways of seeing, they create ways of not seeing. There can be no single theory or metaphor that gives an all-purpose point of view, and there can be no simple 'correct theory' for structuring everything we do." (Gareth Morgan, "Imaginization", 1997)

"Faced with the overwhelming complexity of the real world, time pressure, and limited cognitive capabilities, we are forced to fall back on rote procedures, habits, rules of thumb, and simple mental models to make decisions. Though we sometimes strive to make the best decisions we can, bounded rationality means we often systematically fall short, limiting our ability to learn from experience." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Strategy is about stretching limited resources to fit ambitious aspirations." (Coimbatore K Prahalad, "Don Soderquist", 2005)

"Leadership is seeing the possibilities in a situation while others are seeing the limitations." (John C Maxwell, "The 21 Irrefutable Laws of Leadership", 2007)

"The other element of systems thinking is learning to influence the system with reinforcing feedback as an engine for growth or decline. [...] Without this kind of understanding, managers will hit blockages in the form of seeming limits to growth and resistance to change because the large complex system will appear impossible to manage. Systems thinking is a significant solution." (Richard L Daft, "The Leadership Experience" 4th Ed., 2008)

"Good mission statements have five major characteristics. (1) They focus on a limited number of goals. (2) They stress the company’s major policies and values. (3) They define the major competitive spheres within which the company will operate. (4) They take a long-term view. (5) They are as short, memorable, and meaningful as possible." (Philip Kotler & Kevin L Keller, "Marketing Management" 15th Ed., 2016)

14 December 2014

Systems Engineering: Exponential Growth (Just the Quotes)

"However, and conversely, our models fall far short of representing the world fully. That is why we make mistakes and why we are regularly surprised. In our heads, we can keep track of only a few variables at one time. We often draw illogical conclusions from accurate assumptions, or logical conclusions from inaccurate assumptions. Most of us, for instance, are surprised by the amount of growth an exponential process can generate. Few of us can intuit how to damp oscillations in a complex system." (Donella H Meadows, "Limits to Growth", 1972) 

"Taking no action to solve these problems is equivalent of taking strong action. Every day of continued exponential growth brings the world system closer to the ultimate limits of that growth. A decision to do nothing is a decision to increase the risk of collapse." (Donella Meadows et al, "The Limits to Growth", 1972) 

"Every day of continued exponential growth brings the world system closer to the ultimate limits of that growth." (Mihajlo D Mesarovic, "Mankind at the Turning Point", 1974)

"It has long been appreciated by science that large numbers behave differently than small numbers. Mobs breed a requisite measure of complexity for emergent entities. The total number of possible interactions between two or more members accumulates exponentially as the number of members increases. At a high level of connectivity, and a high number of members, the dynamics of mobs takes hold. " (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Mathematics says the sum value of a network increases as the square of the number of members. In other words, as the number of nodes in a network increases arithmetically, the value of the network increases exponentially. Adding a few more members can dramatically increase the value of the network." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"It is in the nature of exponential growth that events develop extremely slowly for extremely long periods of time, but as one glides through the knee of the curve, events erupt at an increasingly furious pace. And that is what we will experience as we enter the twenty-first century." (Ray Kurzweil, "The Age of Spiritual Machines: When Computers Exceed Human Intelligence", 1999)

"Periods of rapid change and high exponential growth do not, typically, last long. A new equilibrium with a new dominant technology and/or competitor is likely to be established before long. Periods of punctuation are therefore exciting and exhibit unusual uncertainty. The payoff from establishing a dominant position in this short time is therefore extraordinarily high. Dominance is more likely to come from skill in marketing and positioning than from superior technology itself." (Richar Koch, "The Power Laws", 2000)

"There is a strong tendency today to narrow specialization. Because of the exponential growth of information, we can afford (in terms of both economics and time) preparation of specialists in extremely narrow fields, the various branches of science and engineering having their own particular realms. As the knowledge in these fields grows deeper and broader, the individual's field of expertise has necessarily become narrower. One result is that handling information has become more difficult and even ineffective." (Semyon D Savransky, "Engineering of Creativity", 2000)

"Evolution moves towards greater complexity, greater elegance, greater knowledge, greater intelligence, greater beauty, greater creativity, and greater levels of subtle attributes such as love. […] Of course, even the accelerating growth of evolution never achieves an infinite level, but as it explodes exponentially it certainly moves rapidly in that direction." (Ray Kurzweil, "The Singularity is Near", 2005)

"The first idea is that human progress is exponential (that is, it expands by repeatedly multiplying by a constant) rather than linear (that is, expanding by repeatedly adding a constant). Linear versus exponential: Linear growth is steady; exponential growth becomes explosive." (Ray Kurzweil, "The Singularity is Near", 2005)

"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly.  A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"A characteristic of such chaotic dynamics is an extreme sensitivity to initial conditions (exponential separation of neighboring trajectories), which puts severe limitations on any forecast of the future fate of a particular trajectory. This sensitivity is known as the ‘butterfly effect’: the state of the system at time t can be entirely different even if the initial conditions are only slightly changed, i.e., by a butterfly flapping its wings." (Hans J Korsch et al, "Chaos: A Program Collection for the PC", 2008)

"A quantity growing exponentially toward a limit reaches that limit in a surprisingly short time." (Donella Meadows, "Thinking in systems: A Primer", 2008)

"In physical, exponentially growing systems, there must be at least one reinforcing loop driving growth and at least one balancing feedback loop constraining growth, because no system can grow forever in a finite environment." (Donella H Meadows, “Thinking in Systems: A Primer”, 2008)

"[...] a high degree of unpredictability is associated with erratic trajectories. This not only because they look random but mostly because infinitesimally small uncertainties on the initial state of the system grow very quickly - actually exponentially fast. In real world, this error amplification translates into our inability to predict the system behavior from the unavoidable imperfect knowledge of its initial state." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)

"In chaotic deterministic systems, the probabilistic description is not linked to the number of degrees of freedom (which can be just one as for the logistic map) but stems from the intrinsic erraticism of chaotic trajectories and the exponential amplification of small uncertainties, reducing the control on the system behavior." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)

"Standard economists don't seem to understand exponential growth. Ecological economics recognizes that the economy, like any other subsystem on the planet, cannot grow forever. And if you think of an organism as an analogy, organisms grow for a period and then they stop growing. They can still continue to improve and develop, but without physically growing, because if organisms did that you’d end up with nine-billion-ton hamsters." (Robert Costanza, "What is Ecological economics", 2010)

"Cyberneticists argue that positive feedback may be useful, but it is inherently unstable, capable of causing loss of control and runaway. A higher level of control must therefore be imposed upon any positive feedback mechanism: self-stabilising properties of a negative feedback loop constrain the explosive tendencies of positive feedback. This is the starting point of our journey to explore the role of cybernetics in the control of biological growth. That is the assumption that the evolution of self-limitation has been an absolute necessity for life forms with exponential growth." (Tony Stebbing, "A Cybernetic View of Biological Growth: The Maia Hypothesis", 2011)

"Exponentially growing systems are prevalent in nature, spanning all scales from biochemical reaction networks in single cells to food webs of ecosystems. How exponential growth emerges in nonlinear systems is mathematically unclear. […] The emergence of exponential growth from a multivariable nonlinear network is not mathematically intuitive. This indicates that the network structure and the flux functions of the modeled system must be subjected to constraints to result in long-term exponential dynamics." (Wei-Hsiang Lin et al, "Origin of exponential growth in nonlinear reaction networks", PNAS 117 (45), 2020) 

More quotes on "Exponential Growth" at the-web-of-knowledge.blogspot.com.
