This idea connects with some of what has already been said, but it does provide a useful label, I think.
McNamara's fallacy http://en.wikipedia.org/wiki/McNamara_fallacy
i.e., ignoring what can't be measured and assuming it isn't important.
Some discussion from last year of this fallacy as it applies to analytics and the idea of engagement
http://davidtjones.wordpress.com/2010/08/09/the-mcnamara-fallacy-and-pass-rates-academic-analytics-and-engagement/
Or in a similar vein:
Learning and Learning Analytics
"Not everything that counts can be counted, and not everything that can be counted counts." (Sign hanging in Einstein's office at Princeton)
David,
Thanks for suggesting we listen to David Snowden's podcasts in your weblog entry "Analytics--Creating too much Transparency".
One of his presentations is particularly relevant in this thread.
Judgment and Resilience
Mary