
11 September 2024

Data Management: Data Culture (Part IV: Quo vadis? [Where are you going?])

Data Management Series

The people working for many years in the fields of BI/Data Analytics, Data and Process Management have probably met many reactions that at first sight seem funny, though they reflect bigger issues existing in organizations: people don’t always understand the data they work with, how data are brought together as part of the processes they support, respectively how data can be used to manage and optimize the respective processes. Moreover, occasionally people torture the data until it confesses something that doesn’t necessarily reflect reality. It’s even more deplorable when the conclusions are used for decision-making, managing or optimizing the process. In extremis, the result is an iterative process that creates more and bigger issues than those it was supposed to solve.

Behind each blunder there are probably bigger understanding issues that need to be addressed. Many of the issues revolve around understanding how data are created, how they are brought together, how the processes work and what data they need, use and generate. Moreover, few business and IT people look at the full lifecycle of data and try to optimize it, or they optimize it in the wrong direction. Data Management is supposed to help, and it does this occasionally, though a methodology, its processes and practices are only as good as people’s understanding about data and their use! No matter how good a data methodology is, it’s as weak as the weakest link in its use, and typically the issues revolving around data and data understanding are that weakest link.

Besides technical people, few businesspeople understand the full extent of managing data and their lifecycle. Unfortunately, even if some of the topics are treated in books, they are too dry and need hands-on experience and some thought in corroborating practices with theories. Without this, people will do things mechanically, processes being only as good as the people using them, their value becoming suboptimal and hindering the business. That’s why training on Data Management is not enough without some hands-on experience!

The most important impact is however in the BI/Data Analytics areas - how the various artifacts are created and used as support in decision-making, process optimization and other activities rooted in data. Ideally, some KPIs and other metrics should be enough for managing and directing a business; however, when basing decisions on a set of KPIs without understanding the bigger picture, without having a feeling for the data and their quality, the whole architecture, no matter how splendid, can break down like a sandcastle on a shore meeting the first powerful wave!

Sometimes it feels like organizations do things out of inertia, driven by the forces of the moment, by initiatives and business issues for which temporary and later permanent solutions are needed. The best chance for solving many of the issues would have been long ago, when the issues were still too small to create any powerful waves within the organizations. Therefore, a lot of effort is sometimes spent on solving the consequences of decisions not made at the right time, and that can be painful and costly!

Building a good business also needs a solid foundation. In the past it was enough to have a good set of products that were profitable. However, during the past decade(s) the rules of the game changed, driven by fierce competition across geographies, with inefficiencies, especially in the data and process areas, costing organizations in the short and long term. Data Management in general and Data Quality in particular, even if they’re challenging to quantify, have the power to address by design many of the issues existing in organizations, if given the right chance.


21 August 2024

Business Intelligence: Data Modeling (Part IV: From Data to Storytelling II)

Business Intelligence Series

Being snapshots in people’s and organizations’ lives, data arrive to tell a story, even if the story might not be worth telling or might be important only in certain contexts. In fact, each record in a dataset has the potential of bringing a story to life, though business people are more interested in the hidden patterns and “stories” the data reveal through more or less complex techniques. Therefore, data are usually tortured until they confess something, and unfortunately people stop analyzing the data with the first confession(s).

Even if it looks like torture, data need to be processed to reveal certain characteristics, trends or patterns that could help us in sense-making, decision-making or similar specific business purposes. Unfortunately, the volume of data increases with an incredible velocity, to which further characteristics like variety, veracity, value and variability add up.

The data in a dashboard, presentation or even a report should ideally tell a story, otherwise the data might not be worth looking at, at least from some people’s perspective. Probably that’s one of the reasons why many dashboards remain unused shortly after they were made available, even if considerable time and money were invested in them. Seeing the same dull numbers gives the illusion that nothing changed, that nothing is worth reviewing, revealing or considering, which might be occasionally true, though one can’t take this as a rule! Lots of important facts could remain hidden or not considered.

One can suppose that there are businesses in which something important seldom happens and an alert can do a better job than reviewing a dashboard or a report frequently. Probably an alert is a better choice than reporting metrics nobody looks at! 

Organizations usually define a set of KPIs (key performance indicators) and other types of metrics they (intend to) review periodically. Ideally, the numbers collected should define and reflect the critical points (aka pain points) of an organization, if they can be known in advance. Unfortunately, in dynamic businesses the focus can change considerably from one day to another. Moreover, in systemic contexts critical points can remain undiscovered in time if the set of metrics defined doesn’t consider them adequately. 

Typically, only one’s experience and current or past issues can tell what one should consider or ignore, which are the critical/pain points or important areas that must be monitored. Ideally, one should implement alerts for the critical points that require an immediate response and use KPIs for the recurring topics (though the two approaches may overlap).

Following the flow of goods, money and other resources one can look at the processes and identify the areas that must be monitored, prioritize them and identify the metrics that are worth tracking, respectively that reflect strengths, weaknesses, opportunities, threats and the risks associated with them. 

One can start with what changed by how much, what caused the change(s) and what further impact is expected directly or indirectly, by what magnitude, respectively why nothing changed in the considered time unit. Causality diagrams can help in the process even if the representations can become quite complex. 

The deeper one dives and the more questions one attempts to answer, the higher the chances to find a story. However, can we find a story that’s worth telling in any set of data? At least this is the point some proponents of storytelling try to make. Conversely, the data can be dull, especially when one doesn’t track or consider the right data. There are many aspects of a business that may look boring, and many metrics seem to track the boring but probably important aspects.

21 March 2021

Strategic Management: The Impact of New Technologies III (Checking the Vital Signs)

Strategic Management

An organization which went through a major change, like the replacement of a strategic system (e.g. ERP/BI implementations), needs to go through a period of attentive supervision to address the inherent issues that ideally need to be handled as they arise, to minimize their future effects. Some organizations might even go through a convalescence period, which risks being prolonged if the appropriate remedies aren’t found. Therefore, one needs an entity with the skills to recognize the symptoms, understand what’s happening and why, respectively to identify the appropriate actions.

Given technologies’ multi-layered complexity and the volume of knowledge needed for understanding them, the role of the doctor can seldom be taken by one person. Moreover, the patient is an organization, each person in the organization usually having only local knowledge about the patient. The needed knowledge is dispersed throughout the organization, and one needs to tap into that knowledge, identify the people close to the technologies and business areas, respectively allow such people to exchange information on a regular basis.

The people who should know the organization best are in theory the management, however they are usually too far away from the technologies and often too busy with management topics. IT professionals are close to the technologies, though sometimes too far away from the patient. The users have too narrow an overview, while for logistical and economic reasons the number of people involved should be kept to a minimum. A compromise is to designate one person from each business area who works with any of the strategic systems, and ensure that they have the technical and business knowledge required. It’s nothing but the key-user concept, though for it to work the key-users need not only knowledge but also the empowerment to act when the symptoms appear.

Big organizations also have a product owner for each application who supervises the application through its entire lifecycle, and who needs to coordinate with IT, the business and the service providers. This is probably a good idea in order to ensure that the ROI is reached over time, respectively that the needs of the system are considered within the IT operations context. In small organizations, the role can be taken by a technical or a business resource with deeper skills than the average user, usually a key-user. However, unless joined with the key-user role, the product owner’s focus will be the product and seldom the business themes.

The issues that need to be overcome after major changes are usually cross-functional, making it imperative for people to work together and find solutions. Unfortunately, it’s also in human nature to wait until the issues are big enough to get proper attention. Unless the key-users already have time allocated for such topics, the issues will be lost in the heap of operational and tactical activities. This time must be allocated for all key-users and the technical resources needed to support them.

Some organizations build temporary working parties (groups of experts working together to achieve specific goals) or similar groups. However, the status of such a group needs to be permanent if the organization wants to continuously have its health in check, to build the needed expertise and awareness about occurred or potential issues. Centers of excellence/expertise (CoE) or competency centers (CC) are such working groups with permanent status, having defined roles, responsibilities, and processes for supporting and promoting the effective use of technologies within the organization, respectively for monitoring and systematically addressing the risks and opportunities associated with them.

There’s also the null hypothesis, doing nothing, relying solely on employees’ professionalism, though without defined responsibility, accountability and empowerment, it can get messy.


21 June 2020

SSRS (& Paginated Reports): Poor Design of Parameters

Introduction

The handling of parameters in SSRS (and Paginated Reports) can become a problem for reports' usability when certain best practices are not considered for populating the controls behind the parameters. This post attempts to address some common cases.

Dropdowns Constrained on the Source Query 

Typically the values needed in dropdown parameters are stored in tables or are predefined like in the previous post, though it's not always the case. Supposing that there's no table to store the countries, I have seen reports in which the developers identified the values by using queries like the following:

-- Countries: distinct values derived at report runtime from a (potentially large) transactional view
SELECT DISTINCT SIC.CountryRegionCode Id
, SIC.CountryRegionName Name
FROM Sales.vIndividualCustomer SIC

Just imagine that the above query runs over hundreds of millions of records. Even if the query was optimized, it might take minutes to run.

It is true that the query limits the displayed values only to the ones used, however as soon as the volume of data increases this can slow down the report considerably, to the degree that the report becomes unusable. Unfortunately, some developers favor this approach even when there is a table available. Just imagine that to run a report with complex logic, the logic will be run a few times to populate the selection controls for parameters before actually being able to run the report. This slowly puts a burden on the source system, and when similar reports are run, this can bring the source system to its knees. In extreme situations users might not be able to run the report at all (e.g., when the report times out).

Building the queries for parameters' population in this way also adds some synchronization issues between the parameters, which are not so easily traceable as the values are not available for selection.

Therefore, to fix this, if no tables are available in the source system for the dropdowns, then it's a good idea to create some tables independently of the source system and populate them periodically. One can do this with a script that runs with the data refresh, or even maintain the values manually. This works best for data warehouses, but also for OLTP solutions when reports are built on top of the databases behind them.

If the users indeed need only the used values, one can introduce a flag in each table which shows whether the value is used or not.
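Below is a minimal sketch of this approach, assuming an AdventureWorks-like source as in the query above; the table name, the flag and the refresh script are illustrative:

-- parameter table maintained independently of the source query
CREATE TABLE dbo.T_Countries (
  CountryRegionCode nvarchar(3) NOT NULL PRIMARY KEY
, CountryRegionName nvarchar(50) NOT NULL
, IsUsed bit NOT NULL DEFAULT (0)
);

-- refresh script run together with the data refresh (e.g. as a scheduled job);
-- the expensive DISTINCT runs once per refresh, not once per report execution
MERGE dbo.T_Countries T
USING (
    SELECT DISTINCT SIC.CountryRegionCode
    , SIC.CountryRegionName
    FROM Sales.vIndividualCustomer SIC
) S ON T.CountryRegionCode = S.CountryRegionCode
WHEN NOT MATCHED THEN
    INSERT (CountryRegionCode, CountryRegionName, IsUsed)
    VALUES (S.CountryRegionCode, S.CountryRegionName, 1);

-- the report's parameter dataset then reads the small table instead
SELECT CountryRegionCode Id
, CountryRegionName Name
FROM dbo.T_Countries
WHERE IsUsed = 1
ORDER BY Name;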

Too Many Values

If a dropdown has more than a few hundred values then consider using free text instead of a dropdown. It's much easier to type some values than to search for them in a list. It's true that SSRS lacks a search functionality, which would make such searches easier. One can in theory use a passthrough report, however this has limited functionality.
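For illustration, a free-text parameter can drive the filtering along these lines (hypothetical table name; an empty value matches all non-null names):

-- free-text parameter: the user types (part of) the value instead of
-- scrolling through a long list
SELECT C.CustomerId
, C.CustomerName
FROM dbo.Customers C
WHERE C.CustomerName LIKE ISNULL(@Search, N'') + N'%'
ORDER BY C.CustomerName;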

Wildcards Everywhere

Implementing wildcards in each text attribute is in general not a good idea. Please note that a wildcard used in front of a search value prevents the use of the available indexes.
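A schematic comparison (hypothetical table and parameter names):

-- sargable: a trailing wildcard can seek on an index on CustomerName
SELECT C.CustomerId, C.CustomerName
FROM dbo.Customers C
WHERE C.CustomerName LIKE @Search + N'%';

-- not sargable: the leading wildcard forces a scan of the entire index/table
SELECT C.CustomerId, C.CustomerName
FROM dbo.Customers C
WHERE C.CustomerName LIKE N'%' + @Search + N'%';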

Too Many Dependent Dropdowns

There are structures which involve more than two dependencies of dropdowns on one another (e.g., the user needs to select the Country before selecting a Zip code). Each dropdown implies a roundtrip to the data source and back, and upon case the user has to wait a considerable time until the parameters are filled. This, in combination with the first mentioned issues, can increase the damage.

Building such dependencies might still be a good idea when compared with the alternative, showing in the dependent control all the values available. 
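For reference, a minimal sketch of the dataset behind a dependent dropdown (hypothetical names); in SSRS, @Country would be mapped to the Country report parameter, so the query runs again on each selection:

-- Zip codes constrained by the previously selected Country
SELECT Z.ZipCode Id
, Z.ZipCode + N' ' + Z.City Name
FROM dbo.ZipCodes Z
WHERE Z.CountryRegionCode = @Country
ORDER BY Z.ZipCode;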

Use one Table with Dropdown Values

Oracle e-Business Suite and probably other applications store the values for dropdowns in a single table across the system or per business area. That's useful for maintaining the parameters, though it can affect the performance when the number of records in such tables is high. Usually that shouldn't be the case, but it happens (e.g., financial dimensions, having values for each supported language).
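A sketch of what querying such a shared lookup table can look like (hypothetical names, loosely modeled on this kind of design, not any actual schema):

-- one table holds the values for all dropdowns; every parameter query
-- filters the same (potentially large) table by type and language
SELECT L.LookupCode Id
, L.Meaning Name
FROM dbo.LookupValues L
WHERE L.LookupType = N'COUNTRY'
  AND L.LanguageCode = N'US'
  AND L.EnabledFlag = 'Y'
ORDER BY L.Meaning;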

Populating the dropdowns should happen with minimal wait, otherwise the users will complain. The retrieval from the database might be fast, though one needs to consider also how long it takes to render the data, especially when searching for a specific value. In many cases, the table(s) behind should be optimized for the purpose. On the other hand, having individual tables for the dropdowns with many values could be a better approach.

Frankly, this way of storing the dropdown values has a bigger impact when the tables are involved in joins.

Hardcoding Values for Dropdowns

Instead of hardcoding the same values used in dropdowns across multiple reports, one should consider using a table, a view or even a stored procedure for storing the respective values. The overhead of an additional roundtrip to the database might be negligible when considering the overhead of maintaining the values across multiple reports.

Populating the dropdowns through stored procedures could upon case offer better performance at the expense of usability (e.g., being able to select a subset of the data without using parameters). Besides the Ids and Names needed to populate the parameter(s), one can maintain further attributes that can be used later in filtering.
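A minimal sketch of such a procedure (illustrative names, reusing the parameter table sketched above):

-- one procedure maintained in the database instead of the same list
-- hardcoded across multiple reports
CREATE PROCEDURE dbo.pGetCountries
  @OnlyUsed bit = 1
AS
BEGIN
  SET NOCOUNT ON;

  SELECT CountryRegionCode Id
  , CountryRegionName Name
  FROM dbo.T_Countries
  WHERE IsUsed = 1
     OR @OnlyUsed = 0
  ORDER BY Name;
END;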


Post reviewed on 29-Sep-2023


28 November 2014

Systems Engineering: Issues (Just the Quotes)

"[Disorganized complexity] is a problem in which the number of variables is very large, and one in which each of the many variables has a behavior which is individually erratic, or perhaps totally unknown. However, in spite of this helter-skelter, or unknown, behavior of all the individual variables, the system as a whole possesses certain orderly and analyzable average properties. [...] [Organized complexity is] not problems of disorganized complexity, to which statistical methods hold the key. They are all problems which involve dealing simultaneously with a sizable number of factors which are interrelated into an organic whole. They are all, in the language here proposed, problems of organized complexity." (Warren Weaver, "Science and Complexity", American Scientist Vol. 36, 1948)

"The fundamental problem today is that of organized complexity. Concepts like those of organization, wholeness, directiveness, teleology, and differentiation are alien to conventional physics. However, they pop up everywhere in the biological, behavioral and social sciences, and are, in fact, indispensable for dealing with living organisms or social groups. Thus a basic problem posed to modern science is a general theory of organization. General system theory is, in principle, capable of giving exact definitions for such concepts and, in suitable cases, of putting them to quantitative analysis." (Ludwig von Bertalanffy, "General System Theory", 1968)

"Technology can relieve the symptoms of a problem without affecting the underlying causes. Faith in technology as the ultimate solution to all problems can thus divert our attention from the most fundamental problem - the problem of growth in a finite system." (Donella A Meadows, "The Limits to Growth", 1972)

"When a mess, which is a system of problems, is taken apart, it loses its essential properties and so does each of its parts. The behavior of a mess depends more on how the treatment of its parts interact than how they act independently of each other. A partial solution to a whole system of problems is better than whole solutions of each of its parts taken separately." (Russell L Ackoff, "The future of operational research is past", The Journal of the Operational Research Society Vol. 30 (2), 1979)

"The world is a complex, interconnected, finite, ecological–social–psychological–economic system. We treat it as if it were not, as if it were divisible, separable, simple, and infinite. Our persistent, intractable global problems arise directly from this mismatch." (Donella Meadows,"Whole Earth Models and Systems", 1982)

"The real leverage in most management situations lies in understanding dynamic complexity, not detail complexity. […] Unfortunately, most 'systems analyses' focus on detail complexity not dynamic complexity. Simulations with thousands of variables and complex arrays of details can actually distract us from seeing patterns and major interrelationships. In fact, sadly, for most people 'systems thinking' means 'fighting complexity with complexity', devising increasingly 'complex' (we should really say 'detailed') solutions to increasingly 'complex' problems. In fact, this is the antithesis of real systems thinking." (Peter M Senge, "The Fifth Discipline: The Art and Practice of the Learning Organization", 1990)

"Systemic problems trace back in the end to worldviews. But worldviews themselves are in flux and flow. Our most creative opportunity of all may be to reshape those worldviews themselves. New ideas can change everything." (Anthony Weston, "How to Re-Imagine the World", 2007)

"All forms of complex causation, and especially nonlinear transformations, admittedly stack the deck against prediction. Linear describes an outcome produced by one or more variables where the effect is additive. Any other interaction is nonlinear. This would include outcomes that involve step functions or phase transitions. The hard sciences routinely describe nonlinear phenomena. Making predictions about them becomes increasingly problematic when multiple variables are involved that have complex interactions. Some simple nonlinear systems can quickly become unpredictable when small variations in their inputs are introduced." (Richard N Lebow, "Forbidden Fruit: Counterfactuals and International Relations", 2010)

"The problem of complexity is at the heart of mankind's inability to predict future events with any accuracy. Complexity science has demonstrated that the more factors found within a complex system, the more chances of unpredictable behavior. And without predictability, any meaningful control is nearly impossible. Obviously, this means that you cannot control what you cannot predict. The ability ever to predict long-term events is a pipedream. Mankind has little to do with changing climate; complexity does." (Lawrence K Samuels, "The Real Science Behind Changing Climate", 2014)

"Because the perfect system cannot be designed, there will always be weak spots that human ingenuity and resourcefulness can exploit." (Paul Gibbons, "The Science of Successful Organizational Change",  2015)


25 March 2010

Business Intelligence: A Reporting Guy’s Issues

Business Intelligence
Business Intelligence Series

Introduction

For more than 6 years, among many other tasks (software development, data migrations, support), I was also the "reporting guy", taking care of ad-hoc reporting requirements, building a data warehouse and a reporting solution for the customer I worked for. 99% of the reports were based on the two ERP systems (Oracle e-Business Suite & IFS-iV) the customer had in place during this time, a fact that helped me learn a lot about the data architecture of such systems, about processes, data, data quality and many other issues related to data, how to do and how not to do things. In this post I just want to highlight some of the issues I was confronted with, and I don’t intend to point the finger at anybody, so I apologize if anybody is offended!

The Lack of Knowledge about the Business

Even if it’s hard to believe, the main issue revolved around the lack of relevant documentation on applications, especially on database models and processes, or, even if there were such documents, they were not updated and the truthfulness of the information contained had to be always checked against the data. Of course, there are always knowledge workers from whom valuable information can be elicited, though they are not always available and many of them are highly specialized in their field. Therefore, one needs to interview multiple users to build a close to complete picture, and even then, one must check the newly acquired information against the data! From time to time, one may even find out that the newly acquired information doesn’t entirely match the reality, that there are always exceptions and (business) rules forgotten or not known.

Sometimes it’s easier to derive knowledge directly from the data, the table structures and other developers’ experience (e.g. blogs, books, forums) rather than hunting down the knowledge workers. Things aren't that bad; despite the reengineering part, in the end one manages to get the job done, though it takes more time. Sometimes it took even 2-3 times longer to accomplish a task, time for which one could have found better use. However, in time, accumulating more experience, I became proactive by exploring (and mapping) the unknown "territories" in the breaks between tasks, a fact that allowed me to fulfill users’ reporting requests more easily.

Oracle e-Business Suite  

During the past 3 years I have been supporting mainly Oracle e-Business Suite (EBS) users with reports and knowledge about the system, and therefore most of the issues were related to it. In addition to its metadata system implemented in the system's table structure, Oracle tried to build an external metadata system for its ERP system, namely Metalink (I think it was replaced last year), though there were/are still many gaps in the documentation. It’s true that many such gaps derive from the customizations performed when implementing the ERP system, though I would estimate that they account for only 20% of the gaps and refer mainly to the Flex Fields (customer-defined fields) used for various purposes.

A second important issue was related to the Oracle database engine, where several unpatched bugs didn't allow me to use SQL ANSI 92 syntax for linking more than 6-7 tables to a parent table, a fact that made me abuse inline views to work around the limitation; even if Oracle had long had a patch to address this, it wasn’t deployed by the admins, maybe for well-founded reasons. A third issue was related to the different naming conventions used for the same attributes in the source system, mainly a result of the fact that solutions coming from acquired vendors were integrated with a minimum of changes. A fourth issue is related to the poor UI and navigation, basic if we consider the advances made in web technologies during the past years. Conversely, given the complexity of an ERP system, it’s challenging to change the UI at this scale.
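To illustrate the inline-view workaround mentioned above, here is a schematic sketch with hypothetical table names, not the actual EBS schema:

-- instead of linking Items and Vendors directly to the parent via the lines,
-- the joins are consolidated in the inline view D, so fewer tables are
-- joined directly to the parent table
SELECT H.Order_No
, D.Item_No
, D.Vendor_Name
FROM Order_Headers H
JOIN (
    SELECT L.Header_Id
    , I.Item_No
    , V.Vendor_Name
    FROM Order_Lines L
    JOIN Items I ON I.Item_Id = L.Item_Id
    JOIN Vendors V ON V.Vendor_Id = L.Vendor_Id
) D ON D.Header_Id = H.Header_Id;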

Self-Service BI

There's the belief that everybody can write a query and/or build an ad-hoc report. It’s true that writing a query is a simple task and in theory anybody could learn it without much effort, though there are other aspects related to Software Engineering and Project Management, respectively to a data professional's experience, that need to be considered. There are aspects like choosing the right data source, the right attributes and level of detail, designing the query or solution for the best performance (eventually building an index or using database objects that allow better performance) and reuse, using adequate business rules (e.g. ignoring inactive records or special business cases), synchronizing the logic with other reports, otherwise two people will show the management distinct numbers, mitigating the Data Quality and Deliverables Quality issues, identifying the gaps between reports, etc.

In addition, having users create personal solutions instead of using a more standardized approach is quite risky because the result is a web of such siloed solutions over which there's no control from a strategic and/or data security point of view. Conversely, the users need the possibility to analyze the data by themselves (aka self-service BI), to join data coming from multiple sources. Therefore, special focus should be given also to such requirements, though once their reporting needs have been stabilized, they should be moved, if possible, to a more standardized solution.

When multiple developers are approaching reporting requirements, they should work as a team and share knowledge not only on the legacy system but also on users’ requirements, the techniques used and best practices. Especially when they are dispersed all over the globe, I know it’s difficult to bring cohesion to a team and make people produce deliverables as if they were written by the same person, though it's not impossible to achieve with a little effort from all the parties involved.

Why am I mentioning this? The problem is that the more variability is introduced in deliverables, the greater the risk is to have the quality of deliverables questioned, in time leading to users not adopting a system or preferring to use one resource at the expense of another. Moreover, one must also consider the effort needed to find the gaps between reports, to modify deliverables to expectations, etc. From this perspective it's always a good idea to document all deliverables at least minimally, detailing the scope and particularities of the respective request. I know that many believe that code is self-explanatory and needs no additional documentation, though when the basic needed documentation is not available, it's occasionally challenging to intuit the context and identify the problem, respectively why a technique or a certain level of detail was preferred, or why some constraints were used.

Outsourcing  

Outsourcing is a hot topic these days, when in the context of the current economic crisis organizations are forced to reduce headcount and cut costs, and thus this has inevitably touched the reporting area as well. Outsourcing makes sense when the interfaces between service providers and the customers are well designed and implemented. Beyond the many benefits and issues outsourcing approaches come with, people have to consider that for a developer to create a report, what's needed is not only knowledge about the legacy systems and the tools used to extract, transform and prepare the data, but also knowledge about the business itself, about users’ expectations and the organization’s culture, the latter two points being difficult to experience in a disconnected and distributed environment.

Of course, even if delivering the same result and quality is possible as if the developers were onsite, in the end outsourcing implies additional iterations and overwork; the users need to be trained to specify the reporting requirements adequately, or a special resource needs to be available to translate the requirements between the parties involved, with a lot of back-and-forth communication and all the other issues deriving from it.

Outsourcing makes sense from a reporting perspective, though it might take time to become efficient. Anyway, the decisions for this approach are usually taken at upper management level. From a reporting guy's perspective, if I consider the amount of additional effort and time spent to deliver comparable quality, I would say "No" to an outsourcing model when the time used to build something is just shifted into managing the communication with the outsourcer, writing email after email for issues that could have been solved in a 10-minute meeting. Probably the time and money can be invested in other resources that better enable the process.


Written: Mar-2010, Last Reviewed: Apr-2024

02 December 2008

Business Intelligence: General Issues in Business Intelligence

Business Intelligence
Business Intelligence Series

Introduction

BI projects are noble in intent though many managers and data professionals ignore their implications and prerequisites – data quality (incl. availability), cooperation, maturity, infrastructure, adequate tools and knowledge.

Data Quality

The problem with data usually starts at the source - the ERP and other information systems (IS). In theory the systems should cover all the basic reporting requirements existing in an enterprise, though that's seldom the case. Therefore, basic reporting needs end up being covered by ad-hoc developed tools, which often include MS Excel/Access solutions that are difficult to integrate and manage across the organization.

Data Quality (DQ) is maybe the most ignored component in the attempt to build flexible, secure and reliable BI solutions. DQ is based on the validation implemented in source systems and the mechanisms used to cleanse the data before being reported, respectively on the efficiency and effectiveness of existing business processes and best practices.

DQ must be guaranteed for accurate decisions. If the quality is not validated and reviewed periodically, users will be reluctant to use the reports! The reports must be validated as part of the UAT process. Aggregated BI reports need detailed reports that can be used for validation, while the logic and data need to be synchronized accordingly.

The quality of decisions is based on the degree to which data were understood and presented to the decision makers, though that’s not enough; a complete perspective is also needed, and maybe that’s why some business users prefer to prepare and aggregate data by themselves, the process allowing them in theory to get a deeper understanding of what’s happening.

Cooperation

A BI initiative doesn’t depend only on the effort of a department (usually IT), but on the business as a whole. Unfortunately, the so-called partnership is more a theoretical term than a fact, while managers’ and business users' involvement is often suboptimal.

BI implementations are also dependent on consultants’ skills and the degree to which they understood the business’ requirements, on the team’s cohesion and other project (management) related prerequisites, respectively on knowledge transfer and training.

Tools

Most of the BI tools available on the market don’t satisfy all business, respectively users’, requirements. Even if they excel in some features, they lack in others. Usually, more than one BI tool is needed to cover (most of) the requirements. When features are not available, are not mature enough, or are difficult to learn, users will prefer to use the tools they already know.

Another important consideration is that BI tools rely on data models that are often inflexible in terms of the data they provide, lacking the ability to integrate additional datasets, algorithms and customizations. More recently, the overall requirements need to be considered also from the perspective of cloud computing technologies, which are steadily becoming a requirement for today's business dynamics.

Maturity 

Besides the fact that Capability Maturity Models (CMMs) are difficult to implement, organizations lack the knowledge of transforming data into knowledge, respectively of understanding data and evolving them further into wisdom and competitive advantage.

Most of the fancy words used by salesmen to sell a product don’t become reality overnight. Of course, a BI tool might have the potential of fulfilling the various technical and nontechnical goals, though between a theoretical potential and harnessing the respective potential lies a long road that needs to be addressed at strategic, tactical and operational levels.

Infrastructure

Infrastructure refers to the human and technical components and the way they interact in getting the job done. It's not only about "breaking habits" and using the best tools, but about aligning people and technologies to the desired level of performance, of retaining and diffusing knowledge.

