
11 September 2024

🗄️Data Management: Data Culture (Part IV: Quo vadis? [Where are you going?])

Data Management Series

People who have worked for many years in the fields of BI/Data Analytics, Data and Process Management have probably met many reactions that at first sight seem funny, though they reflect bigger issues existing in organizations: people don’t always understand the data they work with, how data are brought together as part of the processes they support, or how data can be used to manage and optimize the respective processes. Moreover, occasionally people torture the data until it confesses something that doesn’t necessarily reflect the reality. It’s even more deplorable when the conclusions are used for decision-making, managing or optimizing the process. In extremis, the result is an iterative process that creates more and bigger issues than those it was supposed to solve!

Behind each blunder there are probably bigger understanding issues that need to be addressed. Many of the issues revolve around understanding how data are created, how they are brought together, how the processes work and what data they need, use and generate. Moreover, few business and IT people look at the full lifecycle of data and try to optimize it, or they optimize it in the wrong direction. Data Management is supposed to help, and it does this occasionally, though a methodology, its processes and practices are only as good as people’s understanding about data and its use! No matter how good a data methodology is, it’s as weak as the weakest link in its use, and typically the issues revolving around data and data understanding are the weakest link.

Besides technical people, few businesspeople understand the full extent of managing data and its lifecycle. Unfortunately, even if some of the topics are treated in the books, they are too dry, requiring hands-on experience and some thought in corroborating practices with theories. Without this, people will do things mechanically, processes being only as good as the people using them, their value becoming suboptimal and hindering the business. That’s why training on Data Management is not enough without some hands-on experience!

The most important impact is however in the BI/Data Analytics areas - how the various artifacts are created and used as support in decision-making, process optimization and other activities rooted in data. Ideally, some KPIs and other metrics should be enough for managing and directing a business; however, when decisions are based just on a set of KPIs, without understanding the bigger picture, without having a feeling for the data and their quality, the whole architecture, no matter how splendid, can break down like a sandcastle on a shore meeting the first powerful wave!

Sometimes it feels like organizations do things from inertia, driven by the forces of the moment, initiatives and business issues for which temporary and later permanent solutions are needed. The best chance for solving many of the issues would have been long ago, when the issues were still too small to create any powerful waves within the organizations. Therefore, a lot of effort is sometimes spent in solving the consequences of decisions not made at the right time, and that can be painful and costly!

Building a good business also requires a solid foundation. In the past it was enough to have a good set of products that were profitable. However, during the past decade(s) the rules of the game have changed, driven by fierce competition across geographies, with inefficiencies, especially in the data and process areas, costing organizations in the short and long term. Data Management in general and Data Quality in particular, even if they’re challenging to quantify, have the power to address by design many of the issues existing in organizations, if given the right chance!

Previous Post <<||>> Next Post

02 September 2024

🗄️Data Management: Data Culture (Part III: A Tale of Two Cities)


One of the curious things is that, as part of their change of culture, organizations try to adopt a new language, to give new names to things, to make a distinction between the "AS IS" and "TO BE" states, insisting on how the new image will replace the previous one. Occasionally, they even stress how bad things were in the past and how great they will be in the future, trying to depict the future in vivid images.

Even if this might work occasionally, it tends to confuse people, and not necessarily because of the language and the metaphors used, or the fact that the same people remained in the same positions, but because of the lack of belief or conviction, the half-hearted enthusiasm personified by the parties. To "convert" people to new philosophies one needs to believe in them, or at least mimic that convincingly. The lack of conviction can easily have a false effect that spreads within the organization.

Dissociation from the past, from what an organization was, tends to increase the resistance against the new because two different images are involved. On one side there’s the attachment to the past, and even if mistakes were made, or things didn’t go optimally, the experiences and decisions made are part of the organization, of the people who made them. People as individuals and as an organization should embrace their mistakes and good deeds altogether, learn from them, improve what is to be improved and move forward. On the other side there’s the resistance to the new, to the change, to words they don’t believe in yet, while the bigger picture is still fuzzy in their minds, and there can be many other reasons that don’t agree with one’s understanding.

There are images, memories, views, decisions, objectives of the past, and people need to recognize the road from what was to what should be. One can hypothesize that embracing one’s mistakes and understanding the chain of reasoning from then and from now will help an organization transition toward the new. Awareness of one’s situation will most probably help in the transition process. Unfortunately, leaders and technology gurus tend to depict the past as negative, creating thus more negative emotions and reactions in the process. The past is still part of the people, of the organization, and will continue to be.

Likewise, the disassociation from the past can create more resistance to the new, and probably more unnecessary barriers. It would probably be easier for the gurus to build the new if the past weren’t there! Forgetting the past would be an error, though, because there are many lessons that can still be useful. All the experience needs to be redirected in new directions. It’s more important to help people see the vision of the future, understand their missions, the paths to be followed and the challenges ahead.

It may sound like rambling from a psychology course, though organizations do have an image they want to change, to bring forth to cope with the various challenges, an image they want to reflect when needed. There are also organizations that want to change but keep their image intact, which leads to deeper conflicts. Unfortunately, changes of image involve conflicts that can become complex through what they bring forth.

A data culture should increase people’s awareness of the present and of the future, of what it takes to bridge the gap, the challenges ahead, how to embrace change, how to keep a realistic perspective, how to do a reality check, etc. Methodologies can increase people’s awareness and provide the theoretical basis, though walking the path will be a different story for everyone.

01 September 2024

🗄️Data Management: Data Governance (Part I: No Guild of Heroes)

Data Management Series

Data governance appeared as a topic around the 1980s, though it gained popularity in the early 2000s [1]. Twenty years later, organizations still miss the mark, failing to understand and implement it in a consistent manner. As usual, the reasons for failure are multiple, and they vary from misunderstanding what governance is all about to poor implementation of methodologies and inadequate management or leadership.

Moreover, methodologies tend to idealize the various aspects, though what organizations need is pragmatism. For example, data governance is not about heroes and heroism [2], even if the label can give the impression that heroic actions are involved, which is not the case! Actions for the sake of action don’t necessarily lead to change by themselves. Organizations, especially big ones, are in general good at creating meaningless action without results, particularly when people preoccupy themselves, miss or ignore the mark.

People do talk to each other, though they try to solve their own problems and optimize their own areas without necessarily thinking about the bigger picture. The problem is not necessarily communication or a lack of depth in the business issues; people do communicate and know the issues, even without a business impact assessment. The challenge usually lies in convincing the upper management that the effort needs to be consolidated and supported, and the needed resources made available.

Probably one of the issues with data governance is the attempt of creating another structure in the organization focused on quality, a structure which has good chances to fail, and unfortunately does fail. Many issues appear when the structure gains weight and becomes a separate entity instead of being the backbone of the organization.

As soon as organizations separate data governance from the key users, management and the other important decision-makers in the organization, it takes on a life of its own that is likely to diverge from the initial construct. Then organizations need "alignment" and probably other big words to coordinate the effort. Such constructs can work, but they are suboptimal because the forces will always pull in different directions.

Making each manager and the upper management responsible for governance is probably the way to go, though they’ll need the time for it. In theory, this can be achieved when many of the issues are solved at the lower levels, when automation and further aspects allow them to supervise things rather than hide behind every issue.

When too much micromanagement is involved, people tend to busy themselves with topics rather than solve the issues they are confronted with. The actual actors need to be empowered to take decisions and optimize their work when needed. Kaizen, the philosophy of continuous improvement, has proven that it works when applied correctly. They’ll need the knowledge, skills, time and support to do it, though. One of the dangers is however that this becomes a full-time responsibility, which tends to create a separate entity again.

The challenge for organizations lies probably in the friction between where they are and what they must do to move forward toward the various objectives. Moving in small, rapid steps is probably the way to go, though each person must be aware when something doesn’t work as expected and react. That’s probably the most important aspect.

So, the more functions are created that diverge from the actual organization, the higher the chances for failure. Unfortunately, failure becomes visible only in the later phases, and thus self-awareness, self-control and other similar “qualities” are needed, like small actors that keep the system in check and react whenever needed. Ideally, the employees are the best resources to react whenever something doesn’t work as per design.

Previous Post <<||>> Next Post 

Resources:
[1] Wikipedia (2023) Data Management [link]
[2] Tiankai Feng (2023) How to Turn Your Data Team Into Governance Heroes [link]


03 October 2023

🧮ERP: Implementations (Part III: Simplifying the Implementation Project)

 

ERP Implementation

ERP implementations are complex projects, and one way to manage their complexity is to attempt reducing it (instead of answering complexity with complexity). The implementation methodology is probably the most important area that allows a project’s simplification, though none of the available methodologies seems to work well with such projects.

The point that differentiates the various methodologies is the solution’s conceptualization. In general, the expectation is to have a set of functional design documents (FDDs) that describe how the system operates and that can be used for programming the customizations, if any. The customer must review and sign off the FDDs before the setup is done and the development starts. Moreover, given the dependencies between documents, they often need to be signed off together.

Unfortunately, FDDs reflect the degree of understanding of the target system and business requirements, and the gaps in that understanding can prove to be a challenge for the parties involved, requiring many iterations until the documents are brought to the expected quality level. The higher the accuracy expected, the more iterations are needed. FDDs tend to consume a considerable percentage of the available financial resources, in extremis the whole budget being exhausted just for 'printed paper'. Moreover, the key users see the working functionality late in the project.

In agile methodologies, FDDs are replaced by user stories, which, if still needed, can be written as part of the sprints or later. Unfortunately, agile methodologies have their own challenges and constraints in ERP implementations. As functionality is explored, understood and negotiated with the customer during the implementation, it’s seldom possible to provide a realistic cost estimation upfront. Given that most ERP implementations exceed their budget, starting a journey without an idea of how much the project will cost seems a prohibitive approach for many customers. Moreover, the negotiations take the character of Change Requests, which can easily become a bottleneck for the project.

On the other hand, agile methodologies involve the customer earlier, and the development can start earlier as well. The earlier the customer is involved, the earlier the key users understand how the system works, and thus they can be more efficient in performing their activities and in identifying the gaps in understanding, trapping functional issues early in the process, at least in theory. Some projects address this need by having the key users trained, though the training environment usually has a different setup and data than needed by the customer. Wouldn’t it be a good idea to have the key users trained in an environment that reflects, to a higher or lower degree, the customer’s data and setup requirements?

In theory, the setup for such an environment can be done upfront, based on a standard configuration frequently met in the customer’s industry. With this, the functional consultants can start to configure the system together with the key users, exploring the data and setup existing in the legacy system(s). This would allow increasing the depth of understanding on both sides and has the potential of speeding up the implementation. It can be started in the early phases, during the time in which the requirements are gathered. Ideally, a basic setup can already exist when the requirements are signed off. It’s true that this approach would mean a higher investment upfront, though the impact could be considerable. Excepting Data Migration and customizations, the customer already has a good basis for Go-Live.

Of course, there can be further challenges, though the customer can thus make sure that the financial resources are well spent – having a usable system and a good system understanding outweighs by far the extreme alternative of having high-quality unimplemented FDDs!

Previous <<||>> Next

07 March 2021

💼Project Management: Methodologies (Part I: Agile Manifesto Reloaded I - An Introduction)

 

Project Management

There are so many books written on agile methodologies, each attempting to depict the realities of software development projects. There are many truths considered in them, though they seem to blend into a complex texture in which the writer usually takes the position of a preacher, contrasting the sins of the traditional methodologies with the agile principles. In extremis, everything done in the past seems to be wrong, while the agile methods seem to be a panacea, which is seldom the case.

It’s already 20 years since the agile manifesto was published, and the methodologies adhering to its principles don’t seem to provide the expected success, suffering from the same chronic symptoms as their predecessors - they are poorly understood and implemented, they tend to function on the hammer principle (when all one has is a hammer, everything looks like a nail), and the software development projects still deliver poor results. Moreover, more and more professionals are raising their voices against agile practices.

Frankly, the principles behind the agile manifesto make sense. A project should by definition satisfy stakeholders’ requirements, ideally through regular deliveries that incorporate the needed functionality, gradually seeking early feedback from customers and involving the customer throughout the project’s duration, working together to deliver a feasible product. Moreover, self-organizing teams, face-to-face meetings, constant pace and technical excellence should allow minimizing the waste and maximizing the efficiency in the project. Further aspects like simplicity, good design and architecture should establish a basis for success.

Re-reading the agile manifesto, even if each read pulls from experience more and more pros and cons, the manifesto continues to look like a Christmas wish list. Even if the represented ideas make sense and satisfy a specific need, they are difficult to achieve in a project’s context and setup. Each wish introduces a constraint that brings with it its own limitations. Unfortunately, each policy introduced by a methodology follows the same pattern, no matter the methodology considered. Moreover, the wishes cover only a small subset of a project’s texture, are general and leave a lot of space for interpretation and implementation, though the same can be said about any principles that don’t provide a coherent worldview or a conceptual model.

The software development industry needs a coherent worldview that reflects its assumptions, models, characteristics, laws and challenges. Software Engineering (SE) attempts to provide such a worldview, though unfortunately it is too complex for many, and there seems to be a big divide between it and the worldviews introduced by the various Project Management (PM) methodologies. Studying one or two PM methodologies, learning a few programming languages and even the hands-on experience of a few projects won’t fill the gaps in knowledge associated with the SE worldview.

Organizations don’t seem to see the need for professionals to have a formal education in SE. On the other side, employees are expected to have by default some of the skillset required, which is not the case. Besides understanding and implementing a technology, there is a set of knowledge areas in which IT professionals must have at least high-level knowledge if they are expected to think critically about the respective areas. Unfortunately, the lack of such knowledge sometimes leads to situations which can impact projects negatively.

Almost each important word from the agile manifesto pulls with it a set of concepts from an SE worldview – customer satisfaction, software delivery, working software, requirements management, change management, cooperation, teamwork, trust, motivation, communication, metrics, stakeholder management, good design, good architecture, lessons learned, performance management, etc. The manifesto needs to be regarded through SE eyeglasses if one expects value from it.

Previous Post <<||>>  Next Post

04 March 2021

💼Project Management: Project Execution (Part III: Projects' Dynamics - An Introduction)

Despite the considerable collection of books on Project Management (PM) and related methodologies, and the fact that projects are inherent endeavors in professional as well as personal life (setups that would in theory give people the environment and exposure to different project types), people’s understanding of what it takes to plan and execute a project sometimes seems narrow and questionable. Moreover, their understanding diverges considerably from common sense. It’s also true that knowledge and common sense are relative when considering any human endeavor in which there are multiple roads to the same destination, or when learning requires time, effort and skills, and implies certain prerequisites; however, the lack of such knowledge can hurt when the endeavor’s success is a must and a team effort.

Even if the lack of understanding about PM can be considered minor when compared with other challenges/problems faced by a project, when one’s running fast to finish a race even a small pebble in one’s running shoes can hurt a lot, especially when one doesn’t have the luxury to stop and remove the stone, as it would make sense to do.

It resides in human nature to resist change, to seek information that only confirms one’s own opinions, to follow the same approach in handling challenges even if the attempts are far from optimal, even if people who walked the same path tell you that there’s a better way, sketch the path and provide information about what it takes to get there. As it seems, there’s a predisposition to learn the hard way, if there’s significant learning involved at all. Unfortunately, such situations occur in projects, and the solutions often overrun the boundaries of PM, where social and communication skills must be brought into play.

On the other side, there’s still hope that change can be managed optimally once the facts are explained to a level that facilitates understanding. However, such an attempt can prove to be quite a challenge, given the various setups in which PM takes place. The intersection between technologies and organizational setups leads to complex scenarios which make such work more difficult, even if projects’ challenges are of an organizational rather than technological nature.

When the knowledge we have about the world doesn’t fit our expectations, a simple heuristic is to return to the basics. A solid edifice can be built only on a solid foundation, and the best foundation for coping with reality is to establish common ground with other people. One can achieve this by identifying their suppositions and expectations, by closing the gap in perception and understanding, by establishing a basis for communication, in which feedback is a must if one wants to make significant progress.

Despite being explorative and time-consuming, establishing common ground can be challenging when addressing an imaginary audience, which is quite often the situation. Practice shows however that progress can be made by starting with a set of well-formulated definitions, simple models, principles and heuristics that have the potential of helping in sense-making.

The goal is thus to identify first the definitions that reflect the basic concepts that need to be considered. Once the concepts are defined, they can be related to each other with the help of a few models. Even if fictitious, as simplifications of the reality, the models should allow playing with the concepts, facilitating their understanding. Principles (sets of rules for reasoning) can be used together with heuristics (rule-of-thumb methods or techniques) for explaining the ‘known’ and approaching the ‘unknown’. Even if maybe not perfect, these tools can help build theories or explanatory constructs.

||>>Next Post

30 October 2020

Data Science: Data Strategy (Part II: Generalists vs Specialists in the Field)

Data Science

Division of labor favors the tasks done repeatedly, where knowledge of the broader processes is not needed, where aspects such as creativity are needed only at a small scale. Division invaded the IT domains as tools, methodologies and demands increased in complexity, and Data Science and BI/Analytics are no exception to this.

The scale of this development sometimes gains humorous expression, or reveals misbeliefs, as when one hears headhunters asking potential candidates whether they are front-end or back-end experts, when a good understanding of both aspects is needed for providing adequate results. The development gains tragicomical implications when one is limited in action to only a given area despite extended expertise, or when a generalist seems to step on the feet of specialists, sometimes for rightly entitled reasons.

Headhunters’ behavior is maybe rooted in a poor understanding of the domain of expertise and the implications of the job descriptions. It’s hard to understand how people claim to have knowledge about a domain just because they heard the words flying around and got some glimpse of the connotations associated with them. Unfortunately, this extends to management and further into the business environment, with all the implications deriving from it.

As Data Science finds itself at the intersection between Artificial Intelligence, Data Mining, Machine Learning, Neurocomputing, Pattern Recognition, Statistics and Data Processing, the center of gravity is hard to determine. One way of dealing with the unknown is requiring candidates to have a few years of trackable experience in the respective fields or in the use of a few tools considered important in the respective domains. Of course, the usage of tools and techniques is important, though there’s a big difference between using a tool and understanding the how, when, why, where, in which ways and by what means a tool can be used effectively to create value. This can be gained only when one is exposed to different business scenarios across industries, and it is a tough thing to demand from a profession still in its baby steps.

Moreover, being a good data scientist involves having a deep insight into the businesses, being able to understand data and the demands associated with data – the various qualitative and quantitative aspects. Seeing the big picture is important in defining, approaching and solving problems. The more one is exposed to different techniques and business scenarios, the more one can, with the right understanding and some problem-solving skills, transpose and solve problems across domains. However, the generalist will find his limitations as soon as a certain depth is reached, and the collaboration with a specialist is then required. A good collaboration between generalists and specialists is important in complex projects which overreach the boundaries of one person’s knowledge and skillset.

Complexity is addressed when one can focus on the important characteristics of the problem, respectively when the models built can reflect the demands. The most important skillset besides the use of technical tools is the ability to model problems and root the respective problems into data, to elaborate theories and check them against reality.

Complex problems can require specialization in certain fields, though seldom does a problem depend on only one aspect of the business, as problems occur in overreaching contexts that sometimes span the borders of an organization. In addition, the ability to solve problems seems to be impacted by the diversity of the people involved in the task, sometimes even people with backgrounds not directly related to the organization’s activity. As in evolution, a team’s diversity is an important factor in achievement and learning, the most being gained when knowledge is shared and harnessed beyond the borders of teams.

Note:
Written as an answer to a Medium post on Data Science generalists vs specialists.

16 June 2020

💼Project Management: Project Planning (Part IV: Planning Correctly Misunderstood IV)

Mismanagement

The relatively big number of Project Management (PM) methodologies considered nowadays makes it more and more difficult to understand the world of PM and to make oneself understood, in a context in which terminology is used in explanations that defy logic, in which people stubbornly persist that their understanding is the ultimate truth and that between white and black there are no degrees of gray. Among all PM concepts, project planning seems to be the most misunderstood, probably because all the activities revolve around it, while each methodology brings its own planning philosophy. Each methodology comes with its own story, its own imaginative description of what a perfect plan is about.

Independently of the methodology used, there are three levels of planning. At the highest level, the strategic one, the project is put in the context of other strategic activities – other projects and initiatives, as well as business operations, all competing for the same financial and human resources. At this level the goals are identified and the basis for the successful execution of the project is laid, including establishing the ground and integrating the main aspects of a project – risk, quality and communication. Here it is decided which projects will be considered, in which sequence, and how and when resources will be assigned.

A project plan is typically written and further executed with the tactical horizon in mind – the individual engagement of resources and actions, the actual means to reach the objectives set at the strategic level. It’s the level where the actual project plan is detailed, where activities are sequenced and prioritized. Here each methodology has its own approach – whether the planning is done per deliverable, work package or any other approach used to partition the activities. It’s the level at which the various teams are coordinated toward specific targets. Thus, the manageable unit is the team and not the individual, the deliverables or the work packages and not the individual tasks.

The operational level equates with the execution of a project’s activities. Even if the project manager oversees the activities, it’s the team’s duty to plan the activities with the set deliveries in mind. The project manager doesn’t need to know all the details, though he should be updated in a timely manner on the progress and on the eventual risks and opportunities that arise in each area. This requires continuous coordination at the vertical as well as the horizontal level.

The project manager typically oscillates between the strategic and tactical views of a project, while the operational level appears in the view only when operational themes are escalated or further coordination is needed. Even if this delimitation is clear in big projects, in small projects the three levels melt into each other. Therefore, the jump from small to big projects and vice versa can create issues when the approach is not tailored to the project’s size and further characteristics.

Attempting to plan each activity in the project at the lowest level of detail obscures the view, the complexity of the project kicking back sooner or later. Maintaining such a detailed plan can become a waste of time in the long term. In extremis, a resource is used to update a plan which can easily become obsolete by the time all activities have been reviewed. This doesn’t mean that the project plan doesn’t need to be updated regularly, though the pace can be decided based on each project’s specifics.

Therefore, one of the most important challenges in projects is finding the appropriate level of detail for planning, and there’s no general rule that works for all projects. Typically the choices alternate between work packages and deliverables. 

21 May 2020

💼Project Management: Project Planning (Part III: Planning Correctly Misunderstood III)

Mismanagement

One of the most misunderstood topics in Project Management seems to be that of planning, and this probably because everyone has a good idea of what it means to plan an activity – we do it daily and most of the time (hopefully) we hit a bull’s-eye (or we have the impression we did). You must do this and that, you have that dependency, you must coordinate with a few people, you must first reach that milestone before going further, you do one step at a time, and so on. It’s pretty easy, isn’t it?

From a bird’s-eye view, project planning is like planning every other activity, though there are several important differences. The most important one is of scale – the number of activities and resources involved, the level of coordination and communication, as well as the quality with which they occur, the level of uncertainty and control, respectively manageability. All these create a complexity that is hardly manageable by just one person.

Another difference is the detail needed for the planning and the targets’ reachability. Some believe that the plan needs to be done down to the lowest level of detail, which, even if possible, can prove to be an impediment to planning. A project’s environment shares some important characteristics with a battlefield in terms of the complexity of interactions, their dynamics and logistical requirements. Within an army’s structure there are levels of organization that require different mindsets and levels of planning. A general thinks primarily at the strategic level, at which troops and actions are seen as aggregations at the level of abstraction that makes their organization and planning manageable. The strategy is however done in collaboration with other generals and upper structures, while, having defined the strategic goals, the general must devise the tactics together with the immediate subalterns. In theory, the project manager must regard the project from the same perspective. Thus result three levels of planning – strategic, done with the upper management; tactical, done with the team members; and logistical, done within the team. That’s a way of breaking down the complexity and dividing the responsibilities within the project.

A project’s final destination seems to have the character of a wish list more or less anchored in reality. From a technical point of view the target can be achievable, though in big projects the most important challenges are of an organizational nature – being able to allocate and coordinate effectively the resources as needed by the project. The wish-like character is reflected also by the cost, scope, time triangle in respect to the expected quality – at some point in time one is forced to choose between two of them. On the other side, there’s the tendency to see the targets and milestones as fixed, with little room for deviation. One can easily forget that a strategic plan’s purpose is to set the objectives and identify the challenges and the possible lines of action, while a tactical plan’s objective is to devise the means to reach the objectives. Bringing everything together can easily obscure the view and, in extremis, the plan loses its actuality as soon as it was created (and approved).

The most confusing aspect is probably the adherence of a plan to a given methodology, dicing a project and thus a plan to fit a methodology by blindly following the rules and principles imposed by it, instead of fitting the methodology to the project. Besides the fact that methodologies are best practices but not necessarily good practices for what fits an organization, they tend to be either too general, specifying the what and not the how, or too restrictive (as interpreted).

30 January 2020

💼☯Project Management: Methodologies (The Good, the Bad and the Ugly)

Mismanagement

The Good: Nowadays there are several Project Management (PM) methodologies to choose from to address a project’s specifics and, when adapted and applied accordingly, a methodology can enable projects to be run and brought under control.

The Bad: Even if the theoretical basis of PM methodologies has been proven and perfected over the years, projects continue to fail at a disturbing rate. Of course, the reasons behind their failure are multiple, though often the failure reasons are rooted in how PM methodologies are taught, understood and implemented.

Just as a theoretical course in cooking won’t make one a good cook, a theoretical course in PM won’t make one a good Project Manager or a knowledgeable team member in applying the learned methodology. Surprisingly, the expectation is exactly that – the team member got a training and is good to go. Moreover, people believe that managing a software project is like coordinating the building of a small treehouse. To some degree there are many similarities, though the challenges typically lie in the details, and these details often escape a standard course.

To bridge the gap between theory and practice, time is needed for the learner to grow into the role, to learn the dos and don’ts and, most important, to learn how to use the tools at hand efficiently. The methodology is itself a tool making use of further tools in its processes – project plans, work breakdown structures, checklists, charters, reports, records, etc. These can be learned only through practice, hopefully with some help (aka mentoring) from a person experienced in the respective methodology, be it the Project Manager himself, a trainer or another team member. Just as one can’t be thrown into the water and expected to swim across the English Channel, one can’t do that with a newbie.

There’s a natural fallacy to think that we’ve understood more than we have. We can observe our understanding’s limits when we are confronted with the complexities involved in handling PM activities. A second fallacy is not believing other people’s warnings against using a tool or performing an activity in a certain way. A newbie’s mind sometimes has the predisposition of a child to try touching a hot stove even if warned against it. It’s part of the learning process, though some persist in such behavior without learning much. What’s even more dangerous is a newbie pretending to be an expert, and this almost always ends badly.

The Ugly appears when the bad is brought to the extreme, when methodologies are misused for the wrong purposes to the degree that they destroy anything in their way. Of course, a pool can be dug by using a spoon, but does it make sense to do that? Just because a tool can be used for something, it doesn’t mean it should be used for it as long as there are better tools for the same purpose. It seems a pretty logical thing, though the contrary happens more often than we’d like. It starts with the preconception that one should use the tool one knows best, ignoring in the process the fit-for-purpose condition. What’s even more deplorable is breaking down a project to fit a methodology while ignoring the technical and logistical aspects.

Any tool can lead to damage when used excessively, in the wrong places, at the wrong point in time or by the wrong person. Like the instruments in an orchestra, when an instrument plays the wrong note, it clashes with the rest. When more instruments play wrongly, the piece becomes unrecognizable. It’s the role of the bandmaster to make the players hit the right notes at the right time.

29 July 2019

💻IT: Best Practices (Definitions)

"A preferred and repeatable action or set of actions completed to fulfill a specific requirement or set of requirements during the phases within a product-development process." (Clyde M Creveling, "Six Sigma for Technical Processes: An Overview for R&D Executives, Technical Leaders, and Engineering Managers", 2006)

"A process or method that is generally recognized to produce superior results. The application of these should result in a positive, measurable change." (Tilak Mitra et al, "SOA Governance", 2008)

"A technique or methodology that, through past experience and research, has proven to reliably lead to a desired result. A commitment to using the best practices in any field (for example, in the domain of IT Architecture) ensures leveraging past experience and all of the knowledge and technology at one’s disposal to ensure success." (Allen Dreibelbis et al, "Enterprise Master Data Management", 2008)

"An effective way of doing something. It can relate to anything from writing program code to IT governance." (Judith Hurwitz et al, "Service Oriented Architecture For Dummies" 2nd Ed., 2009)

"A best practice is commonly understood to be a well-proven, repeatable, and established technique, method, tool, process, or activity that is more certain in delivering the desired results. This indicates that a best practice typically has been used by a large number of people or organizations and/or over a long time, with significant results that are clearly superior over other practices. Knowledge patterns can be used to formalize the description of a best practice." (Jörg Rech et al, "Knowledge Patterns" [in "Encyclopedia of Knowledge Management" 2nd Ed.], 2011)

"A specific method that improves the performance of a team or an organization and can be replicated or adapted elsewhere. Best practices often take the form of guidelines, principles, or ideas that are endorsed by a person or governing body that attests to the viability of the best practice." (Gina Abudi & Brandon Toropov, "The Complete Idiot's Guide to Best Practices for Small Business", 2011)

"A technique, method, process, discipline, incentive, or reward generally considered to be more effective at delivering a particular outcome than by other means." (Craig S Mullins, "Database Administration", 2012)

"In general, Best Practices refer to the methods, currently recognized within a given industry or discipline, to achieve a stated goal or objective. In the OPM3 context, Best Practices are achieved when an organization demonstrates consistent organizational project management processes evidenced by successful outcomes." (Project Management Institute, "Organizational Project Management Maturity Model (OPM3)" 3rd Ed, 2013)

"An effective way of doing something. It can relate to anything from writing program code to IT governance." (Marcia Kaufman et al, "Big Data For Dummies", 2013)

"Those methods, processes, or procedures that have been proven to be the most effective, based on real-world experience and measured results." (Robert F Smallwood, "Information Governance: Concepts, Strategies, and Best Practices", 2014)

"Best practices are defined as commercial or professional procedures that are accepted or prescribed as being effective most of the time. It can also be considered a heuristic, in that is a rule of thumb that generally succeeds but is not guaranteed to always work in every instance." (Michael Winburn & Aaron Wheeler, "Cloud Storage Security", 2015)

"A 'benchmarking' approach where organisations determine who the leader in a particular practice is and then copy that approach. Useful for achieving efficiencies but may diminish differentiation if not used with caution at the strategic level." (Duncan Angwin & Stephen Cummings, "The Strategy Pathfinder" 3rd Ed., 2017)

"A proven activity or process that has been successfully used by multiple enterprises." (ISACA) 

"A superior method or innovative practice that contributes to the improved performance of an organization, usually recognized as best by other peer organizations." (American Society for Quality)

15 July 2019

💻IT: Search Engine Optimization [SEO] (Definitions)

"The set of techniques and methodologies devoted to improving organic search rankings (not paid search) for a Web site." (Mike Moran & Bill Hunt, "Search Engine Marketing, Inc", 2005)

"The process and strategy of presenting a business on the web to improve the ability of potential customers finding it through natural searches on search engines such as Google, Yahoo!, and Bing." (Gina Abudi & Brandon Toropov, "The Complete Idiot's Guide to Best Practices for Small Business", 2011)

"The process of improving the volume or quality of traffic to a Web site from search engines via unpaid search results." (Linda Volonino & Efraim Turban, "Information Technology for Management 8th Ed", 2011)

"techniques to help ensure that a web site appears as close to the first position on a web search results page as possible." (Bill Holtsnider & Brian D Jaffe, "IT Manager's Handbook" 3rd Ed., 2012)
"The process of writing web content so as to increase a page's ranking in online search results." (Faithe Wempen, "Computing Fundamentals: Introduction to Computers", 2015)

"its main function is to increase website visibility. The main search engines use algorithms to rank a website’s position and hence its overall position in the search results. In some instances it can be as simple as structuring the words on a website in a way the search engine operates. " (BCS Learning & Development Limited, "CEdMA Europe", 2019)

15 May 2019

#️⃣Software Engineering: Programming (Part XIV: Rapid Prototyping - Introduction)

Software Engineering
Software Engineering Series

Rapid (software) prototyping (RSP) is a group of techniques applied in Software Engineering to quickly build a prototype (aka mockup, wireframe) to verify the technical or factual realization and feasibility of an application architecture, process or business model. A similar notion is that of Proof-of-Concept (PoC), which attempts to demonstrate, by building a prototype, starting an experiment or a pilot project, that a technical concept, business proposal or theory has practical potential. In other words, in Software Engineering RSP encompasses the techniques by which a PoC is led.

In industries that build physical products, a prototype is typically a small-scale object made from inexpensive material that resembles the final product to a certain degree, with some characteristics, details or features completely ignored (e.g. the inner design, some components, the finishing, etc.). Building several prototypes is much easier and cheaper than building the end product, allowing one to play with a concept or idea until it gets close to the final product. Moreover, this approach reduces the risk of ending up with a product nobody wants.

A similar approach and reasoning are used in Software Engineering as well. Building a prototype allows focusing at the beginning on the essential characteristics or aspects of the application, process or (business) model under consideration. Depending on the case, one can focus on the user interface (UI), database access, integration mechanisms or any other feature that involves a challenge. As in the case of the UI, one can build several prototypes that demonstrate different designs or architectures. The initial prototype can go through a series of transformations until it reaches the desired form, after which more functionality is integrated and the end product refined gradually. This iterative and incremental approach is known as rapid evolutionary prototyping.

A prototype is especially useful when dealing with uncertainty, e.g. when adopting (new) technologies or methodologies, when mixing technologies within an architecture, when the details of the implementation are not known, when exploring an idea, when the requirements are expected to change often, etc. Rapidly building a prototype allows validating the requirements, responding agilely to change, getting customers’ feedback and sign-off as early as possible, showing them what’s possible and what the future application can look like, and all this without investing too much effort. It’s easier to change a design or an architecture in the concept and design phases than later.

In BI, prototyping usually comes down to building queries to identify the source of the data, reengineer the logic from the business application, and prove whether the logic is technically feasible, feasibility being translated into robustness, performance and flexibility. In projects that have a broader scope one can attempt building the needed infrastructure for several reports, to make sure that the main requirements are met. Similarly, one can use prototyping to build a data warehouse or a data migration layer. Thus, one can build all or most of the logic for one or two entities, resolving the challenges for them, and once the challenges are solved one can go ahead and integrate the other entities gradually.
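
To make the idea concrete, below is a minimal sketch of such a query prototype in Python, using the standard sqlite3 module as an in-memory stand-in for the legacy data source (in practice one would connect to the source system, e.g. via ODBC); the tables, columns and the cumulated-amounts logic are hypothetical examples, not a prescription:

import sqlite3

# Stand-in for the legacy data source; in a real prototype this would be
# a connection to the source system. Tables and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Customers (CustomerId INTEGER PRIMARY KEY, Name TEXT, Status TEXT);
    INSERT INTO Customers VALUES (1, 'Alpha', 'Active'), (2, 'Beta', 'Blocked');
    CREATE TABLE Orders (OrderId INTEGER PRIMARY KEY, CustomerId INTEGER, Amount REAL);
    INSERT INTO Orders VALUES (10, 1, 250.0), (11, 1, 120.0);
""")

# Prototype the logic for one entity first: customers with their
# cumulated order amounts, as reengineered from the business application.
prototype_query = """
    SELECT c.CustomerId, c.Name, c.Status,
           COALESCE(SUM(o.Amount), 0) AS TotalAmount
    FROM Customers c
    LEFT JOIN Orders o ON o.CustomerId = c.CustomerId
    GROUP BY c.CustomerId, c.Name, c.Status
"""

# Quick feasibility check: do the volumes and values meet the expectations
# before investing in the full infrastructure?
rows = conn.execute(prototype_query).fetchall()
assert len(rows) == 2, "unexpected row count - the joins or logic need review"
for row in rows:
    print(row)

Once such a query proves robust and fast enough against realistic data volumes, the same logic can be ported to the target report, data warehouse or migration layer.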

Rapid prototyping can be used also in the implementation of a strategy or management system to prove the concepts behind. One can start thus with a narrow focus and integrate more functions, processes and business segments gradually in iterative and incremental steps, each step allowing to integrate the lesson learned, address the risks and opportunities, check the progress and change the direction as needed.

Rapid prototyping can prove to be a useful tool when given the chance to show its benefits. Through its iterative and incremental approaches it allows reaching the targets efficiently.



13 May 2019

#️⃣Software Engineering: Programming (Part XIII: Good Programmer, Bad Programmer)

Software Engineering
Software Engineering Series

The use of denominations like 'good' or 'bad' related to programmers and programming carries with it a thin separation between these two perceptional poles that represent the end results of the programming process, reflecting the quality of the code delivered, respectively the quality of a programmer’s effort and behavior as a whole. This means that the usage of the two denominations is often contextual, 'good' and 'bad' being moving points on an imaginary value scale, with a wide range of values within and outside the interval determined by the two.

The 'good programmer' label is an idealization of the traits associated with being a programmer – analyzing and understanding the requirements, filling the gaps when necessary, translating the requirements into robust designs, developing quality code with a minimum of overwork, delivering on time, being able to help others, to work as part of a (self-organizing) team and alone, when the project requires it, to follow methodologies, processes or best practices, etc. The problem with such a definition is that there’s no fixed limit, considering that a programmer’s job description can include an extensive range of requirements.

The 'bad programmer' label is used in general when programmers (repeatedly) fail to reach others’ expectations, occasionally the labeling being done independently of one’s experience in the field. The volume of bugs and mistakes, the fuzziness of the designs and of the code written, the lack of comments and documentation, the lack of adherence to methodologies, processes, best practices and naming conventions are often considered indicators for such labels. Sometimes even the smallest mistakes, or wrong perceptions of one’s effort and abilities, can trigger such labels.

Labeling people as 'good' or 'bad' has the tendency of reinforcing one’s initial perception, in extremis leading to self-fulfilling prophecies - predictions that directly or indirectly cause themselves to become true, by the very terms on which the predictions came into being. Thus, when somebody labels another as 'good' or 'bad', he will more likely look for signs that reinforce his previous beliefs. This leads to situations in which 'good' programmers’ mistakes are more easily overlooked than 'bad' programmers’ mistakes, even if the mistakes are similar.

A good label can in theory motivate, while a bad label can easily demotivate, though their effects differ from person to person. Such labels can easily become a problem for beginners, because they can easily affect beginners’ perception of themselves. It’s so easy to forget that programming is a continuous learning process in which knowledge is relative and highly contextual, each person having strengths and weaknesses.

Each programmer has a particular set of skills that differentiates him from other programmers. Each programmer is unique, an aspect reflected in the code one writes. Expecting programmers to fit an ideal pattern is unrealistic. Instead of using labels, one should attempt to strengthen the weaknesses and make adequate use of a person’s strengths. In this approach reside the seeds of personal growth and excellence.

There are also programmers who excel in certain areas - conceptual creativity, ability in problem identification, analysis and solving, speed, ingenuity of design and of making the best use of the available tools, etc. Such programmers, as Randall Stross formulates it, “are an order of magnitude better” than others. Experience and skills harnessed with intelligence have this transformational power, which is achievable by each programmer in time.

Even if we can’t always avoid such labeling, it’s important to become aware of the latent force the labels carry with them, of the effect they have on our colleagues and teammates. A label can easily act as a boomerang, hitting us back long after it was thrown.



12 May 2019

#️⃣Software Engineering: Programming (Part XI: Misconceptions about Programming - Part I)

Software Engineering
Software Engineering Series

Besides equating the programming process with a programmer’s capabilities and minimizing the importance of programming and programmers’ skills in the whole process (see previous post), there are several other misconceptions about programming that influence the process’ outcomes.


Having a deep knowledge of a programming language allows programmers to more easily approach other programming languages, however each language has its own learning curve, ranging from a few weeks to half a year or more. The learning curve depends on the complexity of the languages known and the language to be learned, the same applying to frameworks and architectures, the scenarios in which the languages are used, etc. One unrealistic expectation is that programmers are capable of learning a new programming language or framework overnight, this expectation putting more pressure on programmers’ shoulders as they need to compensate in a short time for the knowledge gap. No, the programming languages are not the same, even if there’s a high resemblance between them!

There’s a lot of code available online, and many programming tasks involve writing similar code. This makes people assume that programming can be reduced to copy-paste activities and, in extremis, that there’s no creativity in the act of programming. Besides the fact that using others’ code comes with certain copyright limitations, copy-pasting code is in general a way of introducing bugs in software. One can learn a lot from others’ code, though programmers’ challenge resides in writing better code, in reusing code while finding the right level of abstraction, as the sketch below illustrates.
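
As an illustration of what finding the right level of abstraction can mean, here is a minimal Python sketch (the function and field names are hypothetical): instead of copy-pasting the same computation into near-identical routines, the shared logic is extracted once and the variation becomes a parameter.

# Copy-pasted variants of the same logic, as often found in codebases
# (field names and business rule are hypothetical):
def monthly_revenue_report(orders):
    total = sum(o["amount"] for o in orders if o["status"] == "invoiced")
    return f"Monthly revenue: {total:.2f}"

def yearly_revenue_report(orders):
    total = sum(o["amount"] for o in orders if o["status"] == "invoiced")
    return f"Yearly revenue: {total:.2f}"

# The same logic at a better level of abstraction: the shared computation
# is extracted once, while the variation (the label) becomes a parameter.
def revenue_report(orders, period):
    total = sum(o["amount"] for o in orders if o["status"] == "invoiced")
    return f"{period} revenue: {total:.2f}"

orders = [{"amount": 100.0, "status": "invoiced"},
          {"amount": 50.0, "status": "draft"}]
print(revenue_report(orders, "Monthly"))  # prints: Monthly revenue: 100.00

A bug fixed in the duplicated versions must be fixed in every copy; in the factored version it is fixed once, which is exactly what copy-paste programming tends to break.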
 
There’s a tendency on the market to build whole applications using wizard-like functionality and to generate source code based on data or ontological models. Such approaches work in a range of (limited) scenarios, and even if the trend is to automate as much as possible in the process, this is not what programming is about. Each such tool comes with its own limitations that sooner or later will push back. Changing the generated code in order to build new functionality or to optimize it is often not a feasible solution, as it imposes further limitations.

Programming is not only about writing code. It also involves problem-solving abilities and a certain understanding of the business processes, in which conceptual creativity and ingenuity of design can prove to be good assets. Modeling and implementing processes help programmers gain a unique perspective within a business.

For a programmer the learning process never stops. The release cycles of the known tools become shorter, each release bringing a new set of functionalities. Moreover, there are always new frameworks, environments, architectures and methodologies to learn. There’s a considerable amount of effort in expanding one’s (necessary) knowledge, effort usually not planned in projects or outside of them. Trainings help in the process, though they hardly scratch the surface. Often the programmer is forced to fill the knowledge gap in his free time. This adds up to the volume of overtime one must do on projects. In the long run it becomes challenging to find the needed time for learning.

In resource planning there’s the tendency to add or replace resources on projects while neglecting the influence this might have on the project and its timeline. Each new resource needs some time to accommodate to the role, to understand the project requirements, to take over the work of another. Moreover, resources are replaced on projects with minimal or even no knowledge transfer for the job ahead. Unfortunately, the same behavior occurs in consultancy as well, consultants being moved from a known functional area into another, unknown one, swapping resources like engines between different types of cars and expecting that everything will work like magic.



11 May 2019

#️⃣Software Engineering: Programming (Part X: The Dark Side)

Software Engineering
Software Engineering Series

As member of programmers' extended community, it’s hard to accept some of the views that inconsiderate programmers and their work. In some contexts, maybe the critics reveal some truths. It’s in human nature to generalize some of the bad experiences people have or to oversimplify some of programmers’ traits in stereotypes, however the generalizations and simplifications with pejorative connotations bring no service to the group criticized, as well to the critics.

The programmer finds himself at the end of the chain of command and is therefore the easiest to blame for the problems existing in software development (SD). One of the reasoning fallacies is equating the process of programming with programmers’ capabilities, when the problems reside in the organization itself – the way it handles each step of the processes involved, the way it manages projects, the way it’s organized, the way it addresses cultural challenges, etc.

The meaningful part of SD starts with requirements’ elicitation, the process of researching and discovering the requirements upon which a piece of software is built. The results of the programming process are as good as the inputs provided – the level of detail, accuracy and completeness with which the requirements were defined. It’s the well-known GIGO (garbage in, garbage out) principle. Even if the programmer questions some of the requirements, for example when they are contradictory or incomplete, each question adds more delay to the process, because clarifying the open issues often involves several iterations. Thus, one must choose between being on time and delivering the expected quality. Another problem is that the pay-off and the perception of the two differ between the managerial and the customers’ perspectives.

A programmer’s work, the piece of software he developed, is seen late in the process, when it may be too late to change something in due time. This happens especially in the waterfall methodology, an aspect addressed by more modern methodologies by involving the customers and getting constructive feedback early in the process, and by developing the software in iterations.

Being at the end of the chain of command, programming is often seen as a lowly endeavor, its importance minimized, maybe because it seems so obvious. Some even consider that anybody can program, and it’s true that, as with any activity, anyone can learn to program, the same as anyone can learn another craft; however, as with any craft, it takes time and skill to master. The simple act of programming doesn’t make one a programmer, just as the act of singing doesn’t make one a singer. A programmer needs on average several years to achieve an acceptable level of mastery and depth. This can be done only by mastering one or more programming languages and frameworks, getting a good understanding of the SD processes and of what the customers want, and gaining hands-on experience on a range of projects that allow programmers to learn and grow.

There are also affirmations that contain some degree of truth. Overconfidence in one’s skills results in programmers not testing their own work adequately. Programmers attempt to use the minimum of effort in achieving a task, the development environments and frameworks, the methodologies and other tools playing an important part in this. In extremis, through the hobbies, philosophies, behaviors and quirks they have, not necessarily good or bad, programmers seem to isolate themselves.

In the end the various misconceptions about programmers have influence only to the degree they can pervade a community or an organization’s culture. The bottom line is, as Bjarne Stroustrup formulated it, “an organization that treats its programmers as morons will soon have programmers that are willing and able to act like morons only” [1].



References:
[1] "The C++ Programming Language" 2nd Ed., by Bjarne Stroustrup, 1991

29 April 2019

💼Project Management: Planning Correctly Misunderstood - Part I

Mismanagement

It is sometimes helpful to take a step back, observe, and then logically generalize the extremes of the observed facts; if possible, without judging people’s behavior, as there’s more to it than the eye can perceive. In some cases, however, one can feel that the observed situations are really close to the extreme. It’s the case of some tendencies met in project planning: not planning, planning for the sake of planning, expecting a plan to be perfect, setting a plan in stone without the possibility of changing it in due time, respectively changing the plan too often.

There are situations in which it’s better to be spontaneous and go with the flow. Managing a project isn’t one of them. As Lakein’s Law formulates it succinctly: “failing to plan is planning to fail”; or, paraphrasing Eisenhower (1) and Clausewitz (2), plans are useless, as no plan ever survived contact with the enemy (reality), but planning is indispensable, as a plan increases awareness of the project’s scope, actions, challenges, risks and opportunities, and allows devising the tactics and logistics needed to reach the set goals. Even if the plan no longer reflects reality, it can still be adapted to fit the new requirements. The more planning experience one has, the more natural it becomes to close the gap between the initial plan and reality, and to adapt the plan as needed.

There’s an important difference between doing something because one is forced to do it and doing it because one sees and understands the value of planning. There’s a tendency to plan for the sake of planning, because of the compulsion to do it. Besides documenting the what, when, why and who, and serving as a basis for action, the plan must reflect the project’s current status and the activities planned for the next reporting cycle. As soon as a plan can no longer reflect these aspects, it becomes in time unusable.

The enemy of a good plan can prove to be the dream of a perfect plan (3). Some may think that the holy grail of planning is the perfect plan, and that the project can’t start until all the activities have been listed to the lowest detail and the effort thoroughly assigned. Few plans actually survive contact with reality, and a lot of energy can be lost in working on the perfect plan.

Another similar behavior, rooted mainly in the methodologies used, is that of not allowing a plan to be changed for part of or the whole duration of the project. Publilius Syrus recognized more than two millennia ago that a plan that admits no modification is a bad plan (4) per se. Methodologies and practices that don’t allow changing the plan in a flexible way do no service to projects. Often changes need to occur immediately and not at an ideal point in time, when maybe their effect is lost.

Modern Project Management tools allow building the dependencies between the various activities, and it’s inevitable that a change in one place will cause a chain reaction leading to a contraction or dilatation of the plan, and this can happen with each planning iteration. In extremis, the end date will oscillate like the line of a seismograph during an earthquake. It’s natural for this to happen in a project’s first phase; however, it’s in the Project Manager’s remit to mitigate such variations.
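As a minimal illustration of that chain reaction, consider the following Python sketch (task names and durations are hypothetical): the finish date of each activity is derived from its predecessors, so changing a single estimate shifts every successor and, with it, the end date.

from datetime import date, timedelta

# Tasks as name: (duration in days, predecessors) - hypothetical values,
# listed in dependency order.
tasks = {
    "design": (5, []),
    "build":  (10, ["design"]),
    "test":   (4, ["build"]),
    "deploy": (1, ["test"]),
}

def finish_dates(tasks, start):
    # Forward pass: each task starts when all its predecessors have finished.
    finish = {}
    for name, (duration, preds) in tasks.items():
        task_start = max((finish[p] for p in preds), default=start)
        finish[name] = task_start + timedelta(days=duration)
    return finish

start = date(2019, 4, 29)
print(finish_dates(tasks, start)["deploy"])  # baseline end date
tasks["build"] = (14, ["design"])            # one changed estimate...
print(finish_dates(tasks, start)["deploy"])  # ...moves the whole end date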

The project plan is a reflection of the project and of how it’s managed; therefore, one needs to give it the proper focus, as often and in as much detail as required.

Referenced quotes:
(1) “In preparing for battle I have always found that plans are useless, but planning is indispensable” (Eisenhower quoted by Nixon)
(2) “No plan ever survived contact with the enemy.” (Carl von Clausewitz)
(3) “The enemy of a good plan is the dream of a perfect plan.” (Carl von Clausewitz)
(4) "It's a bad plan that admits of no modification." (Publilius Syrus)

22 April 2019

💼Project Management: Tools (Part I: The Choice of Tools in Project Management)

Mismanagement

“Beware the man of one book” (in Latin, “homo unius libri”) is a warning generally attributed to Thomas Aquinas, and it has a twofold meaning. In its original interpretation it referred to people mastering a single chosen discipline; however, the meaning degenerated into expressing the limitations of people who master just one book, and who thus have a limited toolset of perspectives, mental models or heuristics. This later meaning is better reflected in Abraham Maslow’s adage: “If the only tool you have is a hammer, you tend to see every problem as a nail”, as people tend to use the tools they are used to even in situations in which other tools are more appropriate.

The stubbornness with which people and even organizations use the same tools in totally different scenarios, expecting the same results, as well as in similar scenarios, expecting different results, is sometimes almost admirable. It’s true, Mathematics has proven that the same techniques can be used successfully in different areas; however, a mathematician’s universe and models are idealistically detached to a certain degree from reality, full of simplified patterns and never-ending approximations. In contrast, the universe of Software Development and Project Management has a texture of complex patterns with multiple levels of dependencies and constraints, constraints highly sensitive to the initial conditions.

Project Management has managed to derive tools like methodologies, processes, procedures, best practices and guidelines to address the realities of projects; however, their use in praxis seems to be quite challenging. Probably the challenge resides in the stubbornness of not adapting the tools to the difficulties and tasks met. Even if the same phases and multiple similarities seem to exist, the process of building a house or another tangible artefact is quite different from the approaches used in the development and implementation of software.

Software projects have high variability and are often explorative in nature. The end-product looks totally different from the initial scaffold. The technologies used come with opportunities and limitations that are difficult to predict in the planning phase. What seems to work on paper often doesn’t work in praxis, as the devil typically lies in the details. The challenges and limitations vary between industries, businesses and even between projects within the same organization.

Even if for each project type there’s a methodology more suitable than another, in the end the project’s particularities might pull the choice in one direction or another. Business Intelligence projects, for example, can benefit from agile approaches, as these make it possible to better manage and deliver value by adapting the requirements to business needs as the project progresses. An agile approach almost always works better than a waterfall process. In contrast, ERP implementations seldom benefit from agile methodologies, given the complexity of such projects, which makes planning a real challenge; however, this depends also on an organization’s dynamics.

Especially when an organization has good experience with a methodology, there’s a tendency to use the same methodology across all the projects run within the organization. This results in chopping a project down to fit an ideal form, which might be fine as long as the particularities of each project are adequately addressed. Even if a methodology is not appropriate for a given scenario, it doesn’t mean it can’t be used for it; however, the cost, time, effort and quality of the end-results enter into the final equation as well.

In general, one can cope with complexity by leveraging a broader set of mental models, heuristics and tools, and this can be done only through experimentation, through training and exposing employees to new types of experiences, through openness, through adapting the tools to the challenges ahead.

21 April 2019

💼Project Management: Project Planning (Part II: Planning Correctly Misunderstood II)

Mismanagement

Even if planning is the most critical activity in Project Management, it seems to be also one of the most misunderstood concepts. Planning is critical because it charts the road ahead in terms of what, when, why and who, being used as a basis for action and communication, for determining the current status with respect to the initial plan, as well as the critical activities ahead.

The misunderstandings derive maybe also from the fact that each methodology introduces its own approach to planning. PMI, as a traditional approach, talks about baseline planning with respect to scope, schedule and costs, and about management plans which, besides the themes covered in the baseline, focus also on quality, human resources, risks, communication and procurement; separate plans can be developed for requirements, change and configuration management, respectively process improvement. To these one can add action and contingency planning.

In Prince2, product-based planning is done at three levels – project, stage, respectively team level – while separate exception plans are produced in case of deviations from any of these plans; in addition, there are plans for communication, quality and risk management. Scrum uses an agile approach, looking at the product and sprint backlogs, the progress being reviewed in stand-up meetings with the help of a burn-down chart. There are also other flavors of planning, like the planning game considered in Extreme Programming (XP), with an open, elastic and nondeterministic approach. In Lean planning the focus is on maximizing value while minimizing waste, which is done by focusing on the value stream, the complete list of activities involved in delivering the end-product, the value stream’s flow being mapped with the help of visualization techniques such as Kanban boards, flowcharts or spaghetti diagrams.
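To make the burn-down chart less cryptic, here is a minimal Python sketch with hypothetical figures: the ideal line burns the sprint’s effort down evenly, while the actual line follows the work remaining at the end of each day; comparing the two shows whether the team is on track.

# Burn-down for a 10-day sprint of 40 story points (hypothetical figures).
total_points = 40
sprint_days = 10
remaining = [40, 38, 35, 35, 30, 24, 20, 18, 10, 4, 0]  # measured at each day's end

for day, actual in enumerate(remaining):
    ideal = total_points * (1 - day / sprint_days)  # the even, "ideal" burn-down
    status = "behind" if actual > ideal else "on track"
    print(f"day {day:2}: ideal {ideal:5.1f}, actual {actual:3} -> {status}")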

With so many types of planning, nothing can go wrong, can it? However, just imagine the customers’ confusion when dealing with a change of methodology, especially when the concepts sound fuzzy and cryptic! Unfortunately, programmers and consultants also seem to be bewildered by the various approaches and the philosophies supporting the methodologies used, their insecurity doing no service to the project or to the customers’ peace of mind. A military strategist would more likely look puzzled at the whole unnecessary plethora of techniques. In the field an army has to act with the utmost concentration and speed, to which add principles like directedness, maneuver, unity, economy of effort, collaboration, flexibility, simplicity and sustainability. It’s what Project Management often fails to deliver.

Similarly to projects, the plan made before the battle seldom matches the reality in the field. Planning is an exercise needed to divide the strategy into steps, echelon and prioritize them, evaluate the needed resources and coordinate them, understand the possible outcomes and risks, evaluate solutions and devise actions for them. With good training, planning and coordination, each combatant knows his role in the battle and has a rough idea about the difficulties, the targets and the possible ways to achieve them, while a good combatant always knows the next action. At the same time, the leader must have visibility over how the fight unfolds, know the situation in the field and how much it has diverged from the initial plan; thus, when the variation is considerable, he must change the plan, shifting the priorities and making better use of the available resources.

Even if there are multiple differences between the two battlefields, projects follow the same patterns of engagement at different scales. Probably Project Managers could learn a great deal by studying the classical combat strategists, and hopefully the management of projects would be more effective and efficient if the imperatives of planning, respectively management, were better understood and addressed.

09 June 2018

📦Data Migrations (DM): Guiding Principles

Data Migration
Data Migrations Series

Introduction

“An army of principles can penetrate where an army of soldiers cannot.”
Thomas Paine

In life, as well as in IT, principles serve as patterns of advice in the form of general or fundamental ideas, truths or values stated in a context-independent manner. They can be used as guidelines in understanding and modeling the reality, the world we live in. With the invasion of technologies into our lives, principles serve as solid ground on which we can build castles – solutions to our problems. Each technology comes with its own set of principles that define its usage in general terms. That’s why most IT books attempt to capture these sets of principles. Unfortunately, few technical writers manage to define meaningful principles and showcase their usage.

Many of the ideas considered as principles in papers on Data Migration (DM) are at best just practices, and some can be considered best/good practices. Just because something worked well in a previous migration doesn’t mean that the idea behind the respective decision automatically turns into a principle. Some of the advice advanced is just lessons learned in disguise. Principles, through their generality, apply to a broad range of cases, while practices are more activity-specific.

A DM, through its nature, finds its characteristics at the intersection of several areas – database-based architecture design, ETL workflows, data management, project management (PM) and services. From these areas one can pull a set of principles that can be used in building DM architectures.

Architecture Principles

“Architecture starts when you carefully put two bricks together.”
Ludwig Mies van der Rohe

There are several general principles that apply to the architecture of applications, independently of the technologies used or the industry, e.g. research first, keep it simple/small, start with the end in mind, model first, design to handle failure, secure by design (aka safety first), prototype, progress iteratively, focus on value, reuse (aka don't reinvent the wheel), test early, early feedback, refactor, govern, validate, document, right tool – right people, make it to last, make it sustainable, partition around limits, scale out, defensive coding, minimal intervention, use common sense, process orientation, follow the data, abstract, anticipate obsolescence, benchmark, single-responsibility, single dispatch, separation of concerns, right perspective.

To these one can add a range of application design characteristics that can be considered principles as well: extensibility, modularity, adaptability, reusability, repeatability, performance, revocability, auditability, subject-orientation, traceability, robustness, locality, heterogeneity, consistency, atomicity, increased cohesion, reduced coupling, monitoring, usability, etc. Several principles can also be transported from problem solving into design – divide and conquer, prioritize, systems approach, take inventory, and so on.

A DM’s architecture has more in common with a data warehouse, as it relies heavily on ETL tasks and data needs to be stored for various purposes. Besides the principles of good database design, a few other principles apply: model (the domain) first, denormalize, design for performance, maintainability and security, validate continuously. From the ETL area the following principles can be considered: single point of processing, each step must have a purpose, minimize touch points, rest data for checkpoints, leverage existing knowledge, automate the steps, batch processing.
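A minimal Python sketch of two of these principles, “each step must have a purpose” and “rest data for checkpoints” (file and field names are hypothetical): each step persists its output, so a failed run can resume from the last checkpoint instead of reprocessing everything from the source.

import csv
import os

def cleanse(row):
    # Single purpose: trim fields and normalize empty discount values to "0"
    # (assumes the source file has a "discount" column, possibly empty).
    clean = {k: v.strip() for k, v in row.items()}
    clean["discount"] = clean.get("discount") or "0"
    return clean

def run_step(in_file, out_file, transform):
    # Rest data for checkpoints: if the output already exists,
    # the step was completed in a previous run and can be skipped.
    if os.path.exists(out_file):
        return
    with open(in_file, newline="") as src, open(out_file, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            writer.writerow(transform(row))

run_step("extracted_customers.csv", "cleansed_customers.csv", cleanse)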

In addition, considering its data-specific character, a DM can be regarded as one or several data products, though in contrast with typical data products, a DM usually has a limited purpose. From this area the following principles could be considered: build trust with transparency, blend in, visualize the complex.

Data Management Principles

Considering that a DM’s focus is an organization’s data, some principles need to focus on the management and governance of data. Data Governance, together with Data Quality, Data Architecture, Metadata Management and Master Data Management, are functions of Data Management. The focus is on data, metadata and their lifecycle, on processes, ownership, roles and their responsibilities. With this in mind, several principles can be defined that are supposed to facilitate the functions of Data Management: manage data as an asset, manage the data lifecycle, the business owns the data, integration across the organization, make data/metadata accessible, transparent and auditable processes, one source of truth.

As part of a DM there is customer, employee and vendor information, which falls under the General Data Protection Regulation (GDPR) EU 2016/679, the regulation defining the legal framework for data protection and privacy for all individuals within the European Union (EU) and the European Economic Area (EEA), as well as for the export of personal data outside the EU and EEA. The regulation defines a set of principles that form its backbone: fairness, lawfulness and transparency, purpose limitation, data minimization, accuracy, storage limitation, integrity and confidentiality, accountability [6].

Overseas, the US Federal Trade Commission (FTC) issued in 2012 a report recommending that organizations design and implement their own privacy programs based on a set of best practices. The report reaffirms the FTC’s focus on the Fair Information Processing Principles, which include notice/awareness, choice/consent, access/participation, integrity/security and enforcement/redress [6].


Project Management (PM) Principles

"Management is doing things right […]"
Peter Drucker

A DM, through its characteristics, is a project and, to increase the chances of success, it needs to be managed as a project. Managing a DM as a project is one of the most important principles to consider. The usage of a PM framework will further increase the chances of success, as long as the framework is adequate for the purpose and the organization’s team is able to use it. PMI, Prince2 and Agile/Scrum/Kanban are probably the most used PM methodologies, and they come with their own sets of principles.

In general, all or some of the following PM principles apply, independently of whether a methodology is used alone or in combination with others: a single project manager, an informed and supportive management, a dedicated team of qualified people to do the work of the project, clearly defined goals addressing stakeholders’ priorities, an integrated plan and schedule, as well as a budget of costs and/or resources required [1].

On the other side, an agile approach could prove to be a better match for a DM, given that requirements change a lot, frequent and continuous deliveries are needed, collaboration is necessary, and agile processes as well as self-organizing teams can facilitate the migration. These are just a few of the catchwords that form the backbone of the Agile Manifesto (see [3]).

An agile form of Prince2 could be something to consider as well, especially when Prince2 is used as the methodology for other projects. For Prince2 the following principles are to be considered: continued business justification, learn from experience, defined roles and responsibilities, manage by stages, management by exception, focus on products, tailor to suit the project environment [2].

All these PM principles reveal important aspects to ponder upon, and maybe with a few exceptions, all can be incorporated in the way the DM project is managed.


Service Principles

Considering the dependencies existing between the DM and Data Quality, as well as with the broader project, a DM can have the characteristics of a service. It’s not an IT Service per se, as IT supports the project only technically and eventually from a PM perspective. Even if a DM is not an ITSM service, some of the ITIL principles can still apply: focus on value, design for experience, start where you are, work holistically, progress iteratively, observe directly, be transparent, collaborate, and keep it simple [4].


Conclusion

“Obey the principles without being bound by them.”
Bruce Lee

Within a DM all the above principles can be considered, though the network of implications they create can easily shift the focus from the solution to the philosophical aspects, and that’s a marshy road to follow. Even if all the principles are noble, not all can be considered; it would be utopian to consider every possible principle. The trick is to identify the most “important” principles (the principles that make sense) and prioritize them according to the existing requirements. In theory, this is a one-time process that involves establishing a “framework” of best/good practices for the DM, subsequent migrations needing only to consider the new facts and aspects.


References:
[1] J. A. Bing (1994) “Principles of project management”, PM Network (link)
[2] Axelos (2018) “What is PRINCE2?” (link)
[3] Agile Manifesto (2001) “Principles behind the Agile Manifesto” (link)
[4] Axelos (2018) “ITIL® Practitioner: 9 Guiding Principles” (link)
[5] The Data Governance Institute (2018) “Goals and Principles for Data Governance” (link)
[6] Laura Sebastian-Coleman (2018) “Navigating the Labyrinth: An Executive Guide to Data Management”, DAMA International / Technics Publications (link)
