The researchers suggest: "The power behind the use of simulations in the life sciences is in the opportunity for students to explore “what-ifs” in ways that
enable the student to build schemas of understanding. The visualization of processes and structures reduces the cognitive load, enabling even novice learners to understand academic complexities."
This is a very bold statement: that using a simulation alone can result in a stronger learning experience than the 'real thing'. However, in an earlier post, Estrella pointed out that "simulations or virtual environments just recreate reality following the laws of physics but we should not forget they are NOT reality. They can take into account as much (variables) as the programmer knowledge allows."
So here is the $1,000,000 question: At what point does 'real' data become essential to the learning of science?
I've been following the discussions in this forum with considerable interest.
Having worked in air traffic control for many years, I'm a keen fan of simulations as a means of providing learning opportunities. Air traffic control has mission-critical, lives-at-stake safety implications, and learning "on the job" using real data is not advisable. It's far preferable to first get "checked out" as having the requisite skills and knowledge via the simulator before tackling real scenarios.
In the context of air traffic control simulation enables learning that is just not possible using "real" data.
I know that air traffic control -- and many of the other learning tasks in aviation -- makes extensive use of simulations. So besides letting the learner explore 'what-ifs' (as described in the Cisco article), simulations enable the learner to practise handling situations that are too dangerous or too critical to jump right into for real. I get the same nice warm feeling when I watch our nursing students practise giving injections to an orange.
What was the transition like, from air traffic simulations to the air control tower? Which skills (or attitudes?) had to be learned on the job?
PS: did I hear a rumour that it is your birthday today????
From there I went to Hughes Aircraft, where I worked as part of a world-class engineering team developing large-scale, mission-critical air traffic control systems (for Canada, Indonesia, Switzerland, China, ...). I was responsible for developing the education/training programs that went along with the sale and deployment of these systems. Simulation was a key learning tool there too (but only one of many methods used).
You ask what the transition was like from simulation to full air control tower. It's a good question. In the early days, the air traffic control tower simulators I worked with were multi-million-dollar systems. Graphics controllers generated full 360-degree views on wrap-around screens that surrounded you. You could choose to simulate any major airport in the world, day or night conditions, weather (rain, snow, fog), type of aircraft, and all the audio communication involved. Needless to say, this kind of simulation is high fidelity and allows for a high degree of realism, but it's expensive, and its massiveness limited its availability and capacity.
A great deal of the work I did in the ensuing years (the '80s and '90s) was around trying to replicate as much of that simulation experience as possible on personal computers for a fraction of the cost (even emulating game systems to try to make it fun). In addition, a great deal of air traffic control is not done in the tower but at remote radar control centres (usually not even located at airports), where the planes are controlled not under visual flight rules but under instrument flight rules. The simulations for this kind of work are entirely different. All air traffic control is recorded, and "incidents" are played back and reviewed. So in devising simulations it was possible to go from easy ones with very little air traffic to complex and difficult ones that replicated real incidents.
I'd suggest that all of the above applies to most science simulations. The extent to which a simulation can replicate reality can be amazingly high. The other thing simulations enable is the accumulation of experience and learning not possible any other way. Simulating air traffic at London Heathrow, Toronto Pearson, and Chicago O'Hare for someone just starting out enables learning that could take place no other way. However, being checked out and graduating from simulations did not mean you were automatically qualified to control air traffic on your own; you were qualified to control with supervision until you had acquired enough "real" experience to be deemed fit for solo control.
And finally, no, today is not my birthday, my birthday is Dec. 30. But we could pretend it is. Presents and cake welcome :)
Jumping in a bit late since this post was last week, but I wanted to get back to it because it interests me on a few levels, from simulating practice opportunities to transitioning to more real-life examples.
Paul, I am interested in your story about the student air traffic controllers who had trouble visualizing in 3D and how you used simulations to teach this skill. I have been involved as an instructional designer with an interdisciplinary team of faculty (engineers, business, architecture, art/media) in a very interesting course for a large group of first-year undergrads. The course is about "spatial thinking and communicating in 2 & 3D" and, as you have pointed out, this is a crucial skill not only for air traffic controllers but for scientists of all descriptions, including medical doctors, pharmacists, engineers, architects, nurses, epidemiologists, and geographers who use GPS.
Although we didn't use computer simulations in the way you have described, we unpacked the numerous skills involved in spatial thinking and communicating and had students practice these skills using real technologies they may find in a workplace: sketching with pen and paper, drawing with Google SketchUp, and building Lego models with digital Lego (this, of course, is not found in the workplace, but it is a great tool for preparing students to use more complex industry tools such as SolidWorks). Finally, students created digital AMTs and models with moving parts using SolidWorks software.
Student teams (interdisciplinary, drawn from engineering, business, communications, and digital arts) were required to "transfer" their digital models into physical objects using materials like paper, cardboard, plastics, and everyday items. Although they had "practiced" the skills involved in conceptualizing the bits and pieces, when it came to putting it all together and moving from digital to physical, they discovered constraints that had not been accounted for in the digital process. In other words, there was no direct transfer from digital to physical. This is similar to your experience, where the air traffic control students became capable in a simulation but needed real practice to develop competency in real situations.
This was a great learning experience for these students, as they had to work together, problem-solve, and come up with a product. There were loads of conversations in the labs with the instructional team and TAs about supporting learning when moving from digital to physical space.
During a quality circle where we debriefed this course, the instructional team decided to reduce the lectures and content delivery and enhance the practical side. The additional practice with other digital and physical tools should strengthen the transfer of concepts to practice. It will be interesting to follow up after the next term to see if the changes made a difference in student learning outcomes.
all for now,
I remember doing a molecule-building exercise in Second Life. The atoms ranged from ping-pong-ball to beach-ball size, & the molecules had to be assembled in the air above me. It was an interesting sensation of 3-D, & I did learn something about molecules, but of course my learning was not tempered by various forces of reality (like super-sub-microscopic size, for example!)
It really does seem that some skills must be learned 'through the fingers'; i.e. just reading about it or practising with a keyboard & 2-D computer screen is not enough. What do you think about the potential of haptic devices (haptics are "any device that provides a touch-based system of interaction with virtual environments")? This SFU Master's student used haptic devices to augment a virtual frog dissection & found that students who used the haptic devices completed the dissections about 20% faster, cut more accurately & learned faster. They are used a lot for surgical training.
Has anyone out there tried a haptic device? Apparently the cost is only about $100...
Yes, sorry, you are correct - it was late last night when I was getting into this description. The greatest learning was in fact in the analysis of what happened between digital and physical. I think that was not really explicit in our design, so it was nice to see this piece of interpretive work emerge in the lab. The resulting conversations between students and faculty were fascinating.
The design team has decided to explore the student artifacts for a paper and, in so doing, unpack the assessment strategies and lab activities to see if they actually did design opportunities for students to really practice spatial thinking.
Thanks for the info on haptic devices, I have never heard of this. It looks interesting.
I don't think haptic devices are necessarily well-known, even to those in the sciences. A haptic device is an input device (as is a mouse or a keyboard) that provides feedback to the user about the tactile properties of a virtual object. For example, if you are manipulating an object on the internet using a haptic device, you could 'feel' if the object was hard or soft, heavy or light, from the pressure you felt back from the device. The link provided in my post earlier this morning provides more information if you want. If you want to see what such a device actually looks like, this link is good (http://haptic.edutechie.com/novint_falcon/).
Hope this helps!
Nothing that sophisticated. The ELMO camera is the next generation of overhead projectors. Its advantage is that it shows objects in 3D and colour; no slide preparation is required. So if, for example, a teacher thought it would be useful to show students a pencil, she'd just pop it under the camera and the object would be projected onto a screen or even a wall. Low-tech compared to the equipment you are using.
As for a link, tomorrow's task. I'll have to think about where best to send you.
I used the term camera and I should have said projector.
When I "googled" ELMO projector, all I picked up were sales sites. Nevertheless, they do show what the projector looks like and explain the basic principles. No one sales site was any better than the others, so I'm not sure what to send to you that'll be helpful.
This link may be useful, though: http://www.elmousa.com/presentation
Imagine a pen, held in the middle by a long arm; one with joints at the top, middle & bottom, a bit like an arm, really, or an Anglepoise lamp. However, the joints can move in all directions, not just limited ones. It's fixed to the desk.
You hold the pen, but look at a computer screen that's got an image on. I've usually used it when it's got a grey block on it. You can move the pen quite easily till you get to the block, then it gets hard. Not impossible; the block isn't solid. The block is like clay, so you can use the pen to gouge lumps out & carve it into whatever shape you want. Move the pen away, and it moves easily. Move back to touching the block, and you get resistance.
Of course, it can be programmed to do things far more complex than creating a bit of art work. For example, the "pen" could be a scalpel & the block of clay a bit of body.
If you go to http://www.ceetee.net/ and click on the Virtual Reality Centre online (it's a Shockwave file), and then click on Facilities (the links move a bit; it was a design student who built it, not one overly worried about accessibility!), you'll find some videos of the VR facilities, including the haptic devices. It includes the pen thing that I was trying to describe!
In the 3D screens video, you can see someone using a headset that gives a 3D view, with Stonehenge in the background. That's really weird; I've used that one. You think you're going to crash into the stones, so, as you're walking across the room, you stop. (Quite handy really, or you might really crash into the expensive screens!) However, it looks rather odd to the onlookers!
When they do the demo (just on the screens with normal 3D glasses, not the special ones for walking about), it includes a massive bee heading straight towards you. Looks (and sounds!) incredibly scary.
The 3-D screens video is really impressive. I don't quite understand how the polarizing filters & various lenses & projections etc. all work together, but the image of the man 'moving' through Stonehenge is very compelling. I think there could be all sorts of educational applications for this. You'd think it could even be effective for air traffic control learning...?
Sounds like a fantastic course you are involved in.
Our air traffic control challenge was enabling students to mentally convert 2D symbols of aircraft depicted on a radar screen into 3D air space mental models. I won't get too techie here but air traffic control involves maintaining vertical and horizontal separation between aircraft. Essentially you need to visualize each airplane as being surrounded by a volume of airspace that maintains separation with other planes a minimum of 1,000 feet above and below the plane as well as a certain amount of space in front of and behind the plane. Imagine a cardboard box (representing protected air space) surrounding each plane with the plane suspended in the middle. This is all fine when there are only a few aircraft but rapidly gets complex when there are many planes. A further complication is that air traffic control requires you to not only see where the plane is now but to also extrapolate forward where it will be in the future so that directives to change a flight path can be given well in advance. This forward extrapolation is complicated by the fact that each plane is traveling at different speeds. OK enough already. So the simulations partly involved creating representations of all the aircraft with their volume of protected air space surrounding them and showing projected paths for each plane based on its speed.
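For anyone curious, the core of that extrapolation logic can be sketched in a few lines of code. This is just an illustrative toy, not anything from a real system: the 1,000-foot vertical minimum comes from my description above, but the 5-nautical-mile horizontal minimum, the aircraft data, and all the names here are made-up examples.

```python
from dataclasses import dataclass

# Assumed minima for the sketch: vertical from the post, horizontal invented.
VERTICAL_MIN_FT = 1000
HORIZONTAL_MIN_NM = 5

@dataclass
class Aircraft:
    name: str
    x_nm: float     # horizontal position (nautical miles)
    y_nm: float
    alt_ft: float   # altitude (feet)
    vx_kt: float    # velocity components (knots = nm per hour)
    vy_kt: float

def project(plane: Aircraft, minutes: float) -> tuple:
    """Extrapolate a plane's horizontal position `minutes` ahead,
    assuming it holds its current speed and heading."""
    hours = minutes / 60.0
    return (plane.x_nm + plane.vx_kt * hours,
            plane.y_nm + plane.vy_kt * hours)

def conflict(a: Aircraft, b: Aircraft, minutes: float) -> bool:
    """True if the projected positions violate BOTH separation minima
    (planes only conflict when neither vertical nor horizontal
    separation is maintained)."""
    if abs(a.alt_ft - b.alt_ft) >= VERTICAL_MIN_FT:
        return False  # vertically separated, no conflict possible
    ax, ay = project(a, minutes)
    bx, by = project(b, minutes)
    horizontal = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
    return horizontal < HORIZONTAL_MIN_NM

# Two planes head-on at nearly the same altitude, 20 nm apart,
# each doing 300 kt: separated now, in conflict two minutes out.
a = Aircraft("A", 0.0, 0.0, 30000, 300, 0)
b = Aircraft("B", 20.0, 0.0, 30400, -300, 0)
print(conflict(a, b, 0))  # False - still 20 nm apart
print(conflict(a, b, 2))  # True - projected paths converge
```

The point of the example is the same point the simulations made: the controller's job is not where the planes are, but where the extrapolated boxes of protected airspace will be.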
You point out that students often have trouble transferring digital experience to physical, but in the case of air traffic control the task really is entirely digital. Most air traffic control is done entirely on the basis of digital data. You also note that decisions were made to reduce lectures and enhance the practical side. We did that too. Essentially, a series of simulations was developed that started out with just a few aircraft. Then, when you mastered that, many more were added until the level of complexity simulated real-world scenarios. So students progressed from basic to advanced.
It's been many years since I did all this work, so I expect things are even more advanced now.