When faced with a problem, a programmer ponders the cause and effect of the situation. They examine it from various sides, even while it may be rapidly growing, and look for its nature and its likely outcome. Programmers understand that once you start a process it has rippling effects, and even after you kill it, that process leaves a trace of itself. So making a rash decision, or acting on assumptions without data, is a bad idea.
After the problem is understood, the next step is to outline a solution. The solution is a set of goals or objectives the program needs to meet; with luck, meeting them will lead to other, unintended successes as well. The success of the initial goals measures the success of the program. The main point to understand here is that compromising on even a single goal risks total failure, because each piece of a solution usually depends on the next. The integrity of a program is therefore also judged by how much compromise was allowed, and integrity is what keeps a program alive when it is pushed to its predicted limits.
The next phase is to start working. Modules are built that each focus on one aspect of the solution, then connected so they can share information and influence one another based on a set of rules. Testing, and more testing, happens in an environment that simulates the real world. The idea is to get as close to reality as possible without causing any damage outside the simulation. This is a natural trial-and-error process, and it mimics the most basic of all human learning.
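The module-and-simulation idea above could be sketched like this. All of the names and thresholds here are hypothetical, chosen only to illustrate two small modules sharing information under a rule, exercised by simulated input that never touches anything real:

```python
def sensor_module(raw_reading: float) -> float:
    """Normalize a raw reading into the 0-1 range the rest of the system expects."""
    return max(0.0, min(1.0, raw_reading / 100.0))

def decision_module(normalized: float, threshold: float = 0.8) -> str:
    """Apply a rule to the shared value: alert only above the threshold."""
    return "alert" if normalized > threshold else "ok"

def simulate(readings):
    """Trial-and-error loop: feed simulated readings through the connected modules."""
    return [decision_module(sensor_module(r)) for r in readings]

# Simulated data stands in for the real world; no external systems are touched.
results = simulate([42.0, 95.0, 120.0])
print(results)  # ['ok', 'alert', 'alert']
```

Each module can be tested on its own, and the simulation tests them connected, which is exactly the trial-and-error loop the paragraph describes.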
The objectives are reviewed as the prototype emerges from the simulation, each one checked off after the next. In the end, at least on paper, the goal is that every objective was met and that the data from the simulation suggests the solution will work in the real world.
Normally, additional refinements are then considered, such as making the modules easier to access and easier to use, and, if time permits, applying aesthetics.
The program emerges into the world and begins solving the problem set it was designed for. Some parts are autonomous; others require human interaction. Data about its performance is constantly collected, in the form of logs sent by the program and feedback from the users. This data is studied, and further changes are made. Changes come from data, and the best solution is chosen based on that data. This may mean the programmer has to learn something new or apply an unfamiliar approach. Forcing a solution just to stay inside a technological comfort zone will always compromise integrity: it weakens the entire system and eventually affects users.
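Letting the data choose the next change, rather than intuition, might look like this minimal sketch. The log format, status names, and error budget are all hypothetical, invented only to show the shape of the idea:

```python
from collections import Counter

def summarize(log_entries):
    """Count the outcomes reported by the running program's logs."""
    return Counter(entry["status"] for entry in log_entries)

def next_action(summary, error_budget=0.05):
    """Choose a change based on the measured error rate, not on assumptions."""
    total = sum(summary.values())
    error_rate = summary.get("error", 0) / total if total else 0.0
    return "investigate and patch" if error_rate > error_budget else "no change needed"

# Hypothetical collected logs: 95 successes, 5 errors.
logs = [{"status": "ok"}] * 95 + [{"status": "error"}] * 5
print(next_action(summarize(logs)))  # prints "no change needed" (rate equals budget)
```

The point of the sketch is the discipline: the decision function takes only the measured numbers as input, so the conclusion cannot come from anywhere but the data.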
Remember that the process began with an understanding of the problem, and the solution was focused not on brands or technology but on concepts and theory. There was always a goal of not compromising, in order to preserve the integrity of the solution. The prototype was tested in an environment as close to reality as possible, and data drove every change.
I think more processes need to follow this pattern. The method of thinking in code can apply to every aspect of education, school management, and day-to-day problem solving. It is not always possible to collect data for as long as you want, to test ideas for as long as you want, or even to get feedback as quickly as you want. But adhering to the concept, and developing the discipline to pursue a data-driven solution, would filter out the noise and emotion that often cloud the best answer.