Wednesday, March 18, 2009
The Electronic Frontier Foundation's recent decision to create an online repository of all of the Freedom of Information Act documents it has collected over the years got me to thinking about other such efforts in both the government and private sectors.
Archives such as the EFF's are full of primary source documents and are excellent teaching and research resources. Here is the list of such archives I could think of off the top of my head:
EFF's FOIA Document Archive. Primarily designed to "shed light on controversial government surveillance programs, lobbying practices, and intellectual property initiatives."
George Washington University's National Security Archive. One of the oldest and best sources for formerly classified documents, along with expert analysis of their import to historical events.
Federation of American Scientists' Secrecy Project. While its material is not always FOIA-generated, the FAS keeps tabs on the policies and practices surrounding classified information and often posts primary source documents relevant to intelligence community operations.
Wikileaks. Actually, this is NOT FOIA-generated material, as much of what appears here has not yet been declassified or authorized for distribution. Still, very much worth knowing about.
CIA Reading Room. Most people do not know that the various intelligence agencies in the US often maintain a FOIA reading room on their websites (I list the CIA here as a particularly good, but by no means the only, example).
The National Archives' Checklist. From an intelligence and classified-documents standpoint, this is hit or miss. I always find something interesting, but I always wish the search were a bit easier or that I could find a bit more.
Do you have any others? List 'em in the comments!
Monday, March 16, 2009
The fundamental question for an intelligence analyst is, "What do I think is likely to happen?" Analytic confidence, on the other hand, answers the question "How likely is it that my answer to the first question is incorrect?"
I have argued that intelligence analysts need to answer both of these questions to have a theoretically complete intelligence estimate -- an estimate that is as transparent as possible to the decisionmaker the intelligence analyst supports.
Tristan de Frondeville, in his article for Edutopia called Ten Steps To Better Student Engagement, makes a similar case -- and for surprisingly similar reasons -- when it comes to assessing a student's work.
While de Frondeville's article covers a variety of techniques for increasing student engagement in learning (and is well worth the read), the part that caught my eye was about halfway down, under the subtitle: Teach Self Awareness About Knowledge.
De Frondeville includes the little graphic at the left (which he designed) with all of the questions on his tests. He writes, "After the students answer a question, have them place an X on the line to represent how sure they are that their answer is correct. This approach encourages them to check their answer and reflect on their confidence level. It is informative when they get it wrong but marked 'for sure' or when they do the opposite and mark 'confused' yet get the answer right." This approach is also remarkably similar to the one developed (independently) by Josh Peterson for use in his thesis on the appropriate elements of analytic confidence.
For both the analyst and the student, the purpose is largely the same: to give as complete a picture as possible of their current understanding of a topic. A student or analyst who reports low confidence is signaling that, while the answer might be "right", that result may well be due to luck, because one or more of the factors commonly associated with analytic or educational confidence is, in their perception, missing.
From an educational standpoint, this makes an enormous amount of sense. It not only teaches self-reflection, self-awareness, and critical thinking but also provides the teacher with crucial information about a student's own perception of his or her progress. Taken individually, the additional information allows a teacher to develop a more complete assessment of a student's learning. Taken in aggregate, the results could help rapidly identify areas in a lesson plan that need more or less attention.
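To make the aggregation idea concrete, here is a minimal sketch (entirely hypothetical; the function and data names are my own, not de Frondeville's or Peterson's) of how a teacher might pair each answer's correctness with the student's self-reported confidence, then flag topics where the two diverge:

```python
# Hypothetical sketch: aggregate (topic, correct?, confidence) responses,
# where confidence runs from 0.0 ("confused") to 1.0 ("for sure"),
# and surface topics where confidence and accuracy diverge.

from collections import defaultdict

def summarize(responses):
    """responses: list of (topic, correct: bool, confidence: float 0..1)."""
    by_topic = defaultdict(list)
    for topic, correct, confidence in responses:
        by_topic[topic].append((correct, confidence))

    summary = {}
    for topic, pairs in by_topic.items():
        accuracy = sum(c for c, _ in pairs) / len(pairs)
        avg_conf = sum(f for _, f in pairs) / len(pairs)
        summary[topic] = {
            "accuracy": round(accuracy, 2),
            "avg_confidence": round(avg_conf, 2),
            # A large positive gap suggests overconfidence ("for sure" but
            # wrong); a large negative gap suggests underconfidence
            # ("confused" but right). Either flags an area needing attention.
            "gap": round(avg_conf - accuracy, 2),
        }
    return summary

responses = [
    ("fractions", True, 0.9),
    ("fractions", False, 0.8),   # wrong but marked "for sure"
    ("decimals", True, 0.2),     # right but marked "confused"
]
print(summarize(responses))
```

The same summary works at either scale the post describes: run it on one student's responses for an individual assessment, or on a whole class's responses to spot weak areas in the lesson plan.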
I intend to try it as soon as possible in my courses. If anyone else gives it a try (or has already tried a similar method), please drop a comment here about your experience.