Chris Anderson, the editor of Wired, has written an excellent article entitled "The End of Theory: The Data Deluge Makes the Scientific Method Obsolete" in which he convincingly argues that massive amounts of data, in combination with sophisticated algorithms and super-powerful computers, offer mankind a whole new way of understanding the world.

Anderson believes that our technological tools have now progressed to the point where the "old way" of doing science — hypothesize, model and test — is becoming obsolete. In its place, a new paradigm is now emerging whereby scientists, researchers and entrepreneurs simply allow statistical algorithms to find patterns where science cannot.
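To make this concrete, here is a minimal sketch of what "letting a statistical algorithm find the pattern" can look like in practice. Everything in it is an illustrative assumption rather than anything from Anderson's article: the synthetic dataset, the choice of k-means clustering, and the number of clusters are all stand-ins. The point is simply that the analyst hands over a table of unlabeled measurements and asks the algorithm what structure it sees, without specifying a hypothesis up front.

```python
# Illustrative sketch (not from Anderson's article): hypothesis-free pattern finding.
# We build an unlabeled table of synthetic "measurements" and ask an off-the-shelf
# clustering algorithm what structure it sees, without telling it what groups to
# expect or why they might exist.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=42)

# Synthetic stand-in for a large observational dataset: three hidden groups,
# but the analyst supplies no model of what the groups mean.
group_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2))
group_b = rng.normal(loc=[4.0, 1.0], scale=0.5, size=(200, 2))
group_c = rng.normal(loc=[2.0, 5.0], scale=0.5, size=(200, 2))
data = np.vstack([group_a, group_b, group_c])

# Let the algorithm partition the data purely on statistical structure.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(data)

# Report what was found: cluster sizes and centers, discovered rather than hypothesized.
for cluster_id in range(3):
    members = data[labels == cluster_id]
    print(f"cluster {cluster_id}: {len(members)} points, "
          f"center ~ {members.mean(axis=0).round(2)}")
```

In the "old way" the analyst would first propose why three groups should exist and then test that model; here the grouping is simply whatever the data yields, and interpretation comes afterward, if at all.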

If Anderson is correct — and I believe he very well could be — this will take unlearning to a whole new level. In short, instead of having to do the unlearning ourselves (e.g. when a hypothesis doesn't work out or is proved wrong), scientists and researchers can rely on intelligent algorithms to do the heavy lifting.

The implication for a field such as biology, which, as Anderson points out, is actually becoming more difficult to model as we learn more about it (due to our limited understanding of how genetics, microbes, personal behavior, the environment and a host of other factors work in partnership to determine a person's health), could be profound.

More specifically for the field of unlearning, this means we will be able to analyze data without allowing hypotheses (which may be wrong) to cloud our view of what the data is really showing us.

Before this vision can be achieved, however, a great many brilliant scientists will have to unlearn the idea that their "model-based" method of trying to make sense of today's increasingly complex world is the only way to search for new meaning. Increasingly, we will be able to allow computers to crunch the data and extract the meaning for us.