By Anthony Woollacott at October 03 2018 16:17:39
Knowledge processes commonly take the form of projects to manage their execution. If the output of the process is a unique product, managing the work as a project offers obvious advantages. An organization that wants to improve its knowledge processes can follow these guidelines:

Provide a process description of how to approach the work. Try to identify the best way to carry out a knowledge process by making the best practices that already exist in your organization (or in your industry) explicit. Publish process definitions in a format that is easy to consult and understand.

Provide tools that facilitate and standardize the work. Decide which tools best help knowledge workers carry out their work. Involving all affected knowledge workers in the decision about which tools will be used is well worth the effort, because it secures user buy-in.
The key to process improvement is clearly communicating process definitions (the way the company wants the processes to be carried out) to the people in charge of their execution, through training, published process descriptions, and similar means. The better process participants understand the process definition, the higher the probability that the process is carried out accordingly. Knowledge processes are better implemented by obtaining buy-in than by imposing directives.
Knowledge processes are harder to implement through discipline than administrative, human-centric processes are (although some discipline is still needed). It is better to focus on obtaining buy-in from the people affected by the processes through early involvement, communication, and expectation management. Knowledge workers are notoriously reluctant to change their habits. Some say knowledge workers dislike following procedures because they feel procedures limit their creativity; but most of the time they will happily follow a procedure as long as they see value in it, that is, as long as they perceive that it helps them work better and produce a better process output.
The possibility of developing an artificially intelligent machine has intrigued human beings since ancient times. With the growth of modern science, the search for AI has taken two major directions: psychological and physiological research into the nature of human thought, and the technological development of increasingly sophisticated computing systems. In the latter sense, the term AI has been applied to computer systems and programs capable of performing tasks more complex than straightforward programming, although still far from the realm of actual thought. The most important fields of research in this area are information processing, pattern recognition, game-playing computers, and applied fields such as medical diagnosis.