Posts tagged: Conference

What I learned from Twitter this week

June 30, 2009 6:02 pm

Over the last week, EdMedia was taking place in Hawaii, and Twitter was able to deliver some of the key ideas, enabling those who couldn’t be there to participate vicariously. One of my tweets wound up at the top of the screen during one of the keynotes, quickly garnering some tweetbacks from the conference – it certainly felt like I was virtually there.

This meme propagation and participation isn’t anything new for Twitter, but the utility of the backchannel at a conference certainly became real for me as a remote observer. Other conferences have used this, but EdMedia was likely one of the first EdTech conferences to do it and do it well.

Participation in world events like MJ’s death and the Iranian “election” was also fully featured on Twitter this week, both massively stressing the system. Participation in personal events, like Adam Savage’s AT&T bill, also showed how companies can no longer afford to be “cloak and dagger” with their activities, as one popular or motivated follower can bring attention to even the slightest missive.

AALT 2008

May 3, 2008 10:08 am

Thanks to everyone who came out to my presentations. For those who want the slides, please contact me through this blog and let me know which format you would like to have them in – or if you don’t know, just tell me to send it all.

Technology Conference Calendar

March 13, 2008 2:19 pm

I remember it was a bear putting together a list of conferences last year for a funding proposal. This year, I think I’ll use this – the THE EdTech conference listing. You can search for events by location as well, so if you want to “confrenciday” you can find your destinations much more easily.

Why Game

July 6, 2007 2:41 pm

A couple of weeks ago at the CeC, Chris Melissinos – Sun Microsystems’ CTO of Web 2.0 and Chief Gaming Officer, and a self-proclaimed video game addict – gave quite the keynote. While I dutifully took notes in a linear fashion and would normally have provided those notes straight, with a week having passed I’ve calmed down from the hype, and instead offer a reflection on topics raised in the lecture. The core elements are from Chris, but I have fused in some of my own ideas where I found an opportunity. So Chris, if you ever read this, please let me know what you think.

Most human cultures have had games that provide a means of training the young or inexperienced for situations that might occur later in life (either in general or in specific situations). Humans are not unique in this; many animals also play as a way of learning how to gather food, defend against attack, etc. The one common denominator across species is that, with the exception of a few hunting games, games are played with others of one’s own species. In human terms, sports – the common rallying cry of the anti-game movement – are in fact games played to prepare for war. What games have allowed us as a society to do is explore and discuss experiences without having to deal with the unfortunate consequences – like death. Ironically, computer-based games, blamed for so many evils, do the same thing – even allowing for death – but with one key difference: until recently (long ago in ’net time), they were all played alone and discussed in small groups.

The 1970s saw the “bit babies” and the arrival of Pong, and things would never be the same. Now an entire world had arrived, limited not by anything physical but only by what could be displayed on the screen, and games like pinball and, well… anything else were on their way out. Perhaps even parents were on the way out – always telling us to “do something real” – not understanding this device and how it “enslaved” the minds of children. Not having grown up with these devices, most parents were just getting comfortable with TV at the time, so the computer was truly an odd duck. Games that kids played alone were even more so. This enslavement that the parents perceived was actually moving in both directions: children had enslaved their computers, programming them to do their simple bidding, and the computers had enslaved the imaginations of children by providing them freedom and control (how ironic). This control was facilitated through programming languages that empowered this early symbiotic relationship, leading the world down a path with more literacies (both new and old) to master than had ever existed before.

With games, even without the language, users have control over a world, and compared to their own world – where they have to listen to all manner of commands and lack any control – there is a real addictive (maybe not) character to this experience. They have to be cognizant of a range of input stimuli and be literate (more recently) in as many of those as possible (or at the very least semi-lingual in older games) to succeed. When played alone, these games offer an escape from reality, but more recently, any game worth playing is played together with other players around the world. It used to be that old games created communities where participants engaged in their activities and then communed after the fact. These were often small and “nerdy” groups that were easy to disregard.

***ok so I’m getting tired of this post – having let it sit for so long… so this might not make as much sense from here***

Fast forward to 2007: these gaming communities are the games themselves, the communication is all in real time, and the participants are legitimate successes in both the virtual world and “IRL”, with players making the kind of money that some traditional professions could only dream of. I wonder if Richard Garriott has been considered for any awards for starting the modern MMORPG with Ultima Online.

Looking to the future, gamers aged 5-14 are actually driving the industry; all their friends are online at places like Club Penguin, and now, with systems like the Wii, the physical and other barriers to entering gaming are falling (with the Wii, you have three generations of gamers able to collaborate). The ways this generation will change the world are obviously yet to be seen, but we can get an idea based on Got Game (Beck & Wade 2002).

Even though games are great for some instructional purposes, they are not the be-all and end-all. If they are not used in an appropriate context, they have no value. So the trick is to understand those contexts, and these contexts do not include “edutainment” that forcefully grafts “learning” onto a game. Kids don’t like overt learning being forced on them – not many people do. People like to feel that they are figuring things out on their own, or with only a little help.

Some people might think that games can make flashcards interesting because “games are just repetition”. It is true that the brain gets tired of overt repetition – the mode on which much instructional practice for the past few generations has been based – and once a pattern is found, it’s time to move on. Games add to this repetition some novelty and some risk to keep the brain interested. But I say this knowing that there are quite a number of “entitled” or otherwise lazy people out there who want to have others “learn for them” or provide their learning because they have “paid for it”. Edutainment/replacement games fail for this reason: they are candy-coated baby okra/Brussels sprouts/what have you.

But (in my view) if you look at Brain Age (especially on the DS, which could be the ultimate learning machine), it’s a game that is math and matching and nothing more. It’s also freaking addictive, and it’s also accessible – that is, a game that happens to reinforce learning or help generate new connections for those who have not yet achieved mastery of the basic elements of the game. Those elements, by the way, are not that different from flash cards. The difference? Flash cards are… cards that don’t flash; there is no reward of any kind, risk isn’t present, and it’s too easy to “cheat”. There is no “fun” in the system (be sure to check out Koster’s book). These “schooly” games allow for exploration of knowledge, not just its presentation as per one defined view of that knowledge. But games are not the answer for every teaching and learning situation; each medium has its strengths and weaknesses. People are willing to learn if there is some element of entertainment or novelty involved, and time and other resources committed to entertainment are often quite “easy” to rationalize.

Other examples of these exploration-based games are Peacemaker (my post), which takes its material from the Middle East, and Steer Madness, which puts you in the hooves of a steer that has escaped from slaughter. These games, though based on models of commercial games like Command & Conquer and GTA, and for the most part playable single-player, really gain their value by encouraging people to talk about their individual experiences after playing – like book clubs of old. But unlike books, which have a linear presentation model, games are lateral or even non-linear, and this can quickly get people into the mode of exploring and then talking about their explorations. One can find out that something is going to happen, but the spoiler may not be that it’s going to happen, but rather the method by which you arrive there – and the feeling that you got there through some measure of your own ability provides quite the reward. This exploration can allow people to learn faster and earlier than they would have using traditional methods.

*** ok it’s done – I’ll try to get some wheat from this chaff when I get the chance. When you get a chance, check out what Wes wrote on games and passion.

eLearning Conference Wraps

June 26, 2007 1:56 pm

Last week was busy with the move and everything else, so this is going to be a long post covering the gaming keynote (games are here to stay as gamers are raising gamers; “grafted edu-tainment” sucks), the themes and moods of the days of the conference (pondering why advocates of all new ideas give skeptics the impression that they want all training to enter their new mode – all gaming all the time? No! All ports all the time… No!…) and some points on eportfolios (they need serious commitment from more than just one instructor, and students need to care at the same time that instructors need to not care as much; after all, it’s the student’s showcase, not their instructor’s).

Ha… this one isn’t long, I’ll have the three bits as their own posts.

Cheating online

June 19, 2007 3:48 pm

The second pre-service session is by John Krutsch, Senior Director, Distance Education, Utah Valley State College, on academic integrity. Again, this is going to be one of those blog-as-I-go posts. I was also hoping to catch the other session I was interested in – how is your faculty PD – so I’ve sent Dan over there and he’s sent back the link to its wiki.

Personally, I think that cheating is a part of life – depending on what part of life you are looking at and who is looking. The video John started the presentation with seemed to support this idea – there are many factors that influence why students cheat and how they justify their actions (grade curves, academic support, et al.). He went through a section arguing that being able to find answers on demand – once known as cheating in some environments – is now being allowed.

David Wiley of Utah State University’s line, “If your students can cheat on you, then you deserve it!”, led off a discussion on the instructor’s role. Personally, I think that the statement is true – garbage in, garbage out. If the instructor doesn’t create a question that supports a unique answer, then cheating is an option. If the questions never change, same thing.

The theory of “margin” (margin = load/power) suggests something about why cheating might be encouraged. If there is more to deal with than one has the power to handle, margin disappears, and to regain some margin, students might turn to cheating. Instructor-generated load is a major issue, as instructors don’t align what they want the students to do with what the students need to get out of the course – busy work “kills”.
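
To make the ratio concrete, here is a minimal sketch assuming a simple hours-based reading of the formula; the numbers, the function name, and the reading of 1.0 as the breaking point are my own illustration, not anything from the session.

```python
# A rough, hypothetical illustration of margin = load / power.
# As the ratio climbs toward (and past) 1.0, the student's surplus
# capacity vanishes and cutting corners starts to look rational.

def margin(load: float, power: float) -> float:
    """Ratio of demands (load) to capacity (power), both in hours/week."""
    return load / power

print(margin(load=40, power=50))  # 0.8 -- some surplus capacity remains
print(margin(load=60, power=50))  # 1.2 -- overloaded; the margin is gone
```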

Suggested ways to get around cheating are to ensure clear expectations are delivered to students and to manage how assessments are delivered so as to avoid providing “holes” to cheat through. He also suggests varying assessments as a way to avoid cheating.

The video was interesting overall, but I was hoping for more of a “how’d they do that” type of thing.

Buzzwords Worth Knowing

June 19, 2007 11:42 am

The first preconference session for explore 2007 (5th Annual Canadian E-Learning Conference) is “Buzzwords Worth Knowing” by Brian Lamb, Emerging Technologies Discoordinator, Office of Learning Technology, University of British Columbia. It’s looking to be a good session – I’ll be noting it in real time… so this is going to be a different type of post. Thank god for auto-saving in WP2.2.

Getting through the introductions, Brian brought up an interesting point on the professionalizing of Facebook and the like: the last thing an instructor needs is to be “eaten by the natives” as they enter the private world of the students. Part of the popularity of these social sites is that students have their own space, and violating that with “school or work” can be dangerous. So this certainly fits with my thinking about using the system for cohorts – no “school”, only social bits should be put there.

Moving into the second hour, much was made of Delicious, with many people really interested in its RSS abilities. RSS seems to be a real driver for people to come to this session, as they want to find ways to compress, filter and send content to their various people of interest.

After the break, Brian made the interesting point that blogs and RSS are a way to filter noise and create a network of experts that maintains a cloud of knowledge around you as a means of coping. He also mentioned one (of many) issues with Facebook: they don’t allow you to export content from the system, or at least don’t make it easy.
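
As a concrete (and entirely hypothetical) example of the kind of filtering network Brian described – not anything shown in the session – here is a minimal sketch using Python’s feedparser library; the feed URLs and keywords are placeholders.

```python
# A minimal sketch of using RSS to filter noise down to topics you care
# about. Requires the feedparser library (pip install feedparser); the
# feed URLs and keywords below are invented placeholders.
import feedparser

FEEDS = [
    "http://example.edu/edtech-blog/feed",    # a trusted expert's blog
    "http://del.icio.us/rss/tag/elearning",   # a Delicious tag feed
]
KEYWORDS = {"eportfolio", "rss", "facebook"}

for url in FEEDS:
    for entry in feedparser.parse(url).entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if any(word in text for word in KEYWORDS):
            print(entry.get("title"), "->", entry.get("link"))
```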

Overall, fun and interesting – likely a fair number of people got some familiarity with at least the terms of 2.0 and have a slightly larger kernel of interest to start working with.

Some musings on ePorts

June 18, 2007 3:15 pm

I’ve been invited to be part of a panel talking about eportfolios this week at the Canadian eLearning Conference (née the WebCT conference). Taking the role of the instructional designer, I’ve been thinking about eportfolios, and some of the things that I might say are guided by the following:

If they are implemented in more than one class, the eport becomes something that belongs to the student; it enables transfer between the topic areas involved, and there is a “product” at the end of the day that the students can take ownership of.

Students go from “not another blogging assignment” or “not another portfolio system” to “where do I put this to really show off what I can do?”

This of course means that instructors have to understand that they are sharing students with other instructors, and that the student’s evaluation of the elements of their overall portfolio may mean your content never appears on the front page unless there is a really good reason for it. A knock to the ego? Perhaps. Good for the student? Certainly.

STLHE Day 2 – Keynote, Web2 and Technology Integration

June 14, 2007 6:24 pm

Well, today was an interesting day, attending one of the few non-ed education conferences and seeing how the non-ed world is using and “discovering” what many of us with an education background have known and used many times over. So with that in mind, my reflections on the day are not meant to explore the ideas that I want to explore, as in other conference posts, but rather to look at ways that non-ed people use ed language to get ed ideas across. With this in mind, the first part of the day was the keynote by Carl Wieman (Nobel, Wikipedia).

Wieman delivered an entertaining presentation on what could basically be understood as constructivism. While I was listening, I kept thinking to myself how technology and teaching are not really separate – they are one and the same, and if you think one is being added to the other, it’s not really going to get you very far. But back to the presentation, which I’m reflecting on out of sequence.

The talk started out exploring the idea that to really teach, one must change the way that students think. This is something that many grad and IIP advisers have had to face as they receive students who have amazing marks but no knowledge of the field their new environment works in. Frustrated by this, Wieman decided to explore it using the scientific process and discovered that there is a different type of knowledge between the “booksmart” (my term) new student and the experienced “expert in the field” (again, mine). When everything is boiled down, if you want to change how people think, you have to provide an environment for that change to occur. This is pretty easy in the lab, as the expert and the novice share ideas and true problem solving can occur, since there is little definition of what the problem truly is, or of what the answer is. In the classroom, however, this is not as easy. Students get to be experts in “process matching” rather than problem solving, quickly learning to match their answers to what the instructors want (of course this isn’t to say that every student is like this). In the classroom, there is none of the risk, context, transfer or language of the expert environment of the lab, so the results obviously cannot be the same.

That was essentially the crux of the issue when it comes to the science (or any other) education process: experts are trying to cross between the expert and novice worlds, but they fail to notice the details of the environment that make those worlds essentially different. It’s not practical to transform a 300-student lecture into a 4-person lab, so instructors have to think about ways they can communicate with their students, provide context and some element of risk or uncertainty, and all at a rate that won’t “melt brains”.

At the end of the presentation, Wieman mentioned clickers as a way to facilitate communication between the student body and the instructor in the classroom, an attempt to “virtually” reduce the class size so that it might more closely match the lab environment (my brain is melting as I write this… ah!).

The rest of the day for me was about Web2 and tech integration. The Web2.0 presentation explored the way the ’net, once the “great library”, is now the great conversation. Tools like the student-created Otavo and WikiYork are only a couple of examples to add to Flickr and the rest.

The last session was from Memorial and their Instructional Design “people”. For me, this was a great session as it verified the process that I use in the Faculty of Science based on Chickering:

  • identify instructional challenges
  • determine instructional team’s technology comfort level
  • define learning outcomes
  • consider student access
  • assess critical mass/tipping point
  • support through gradual roll out

Curriculum Mapping Pre-Conference Session – STLHE

June 13, 2007 4:08 pm

The session I wanted to go to this morning was canceled, but the session I wanted to attend in the afternoon (Curriculum Mapping: The Path to Improving Student Learning and Faculty Effectiveness Through Curriculum Alignment; John Mahaffy, Gloria Svare and Karen Kopera-Frye, University of Nevada, Reno) was really well done and gave me exactly what I wanted: a starting point for assembling a means to view curriculum in the Faculty with regard to technology (and everything else… we might as well get both answers at the same time) to see where we have to go. But how might this work, and what might it mean, for anyone else?

Well, in K-12 this sort of thing is usually done with ease, as the curriculum process is top-down, but in Higher Ed, with the specter of “academic freedom/independence”, it is much harder to do, especially in a larger institution. So while this might be completely the wrong way to do it, here is my plan. Many of you might have done this already; others are thinking you might have to do it as well, especially if there is a new focus that needs to find its place in the teaching and learning space of your institution.

The process should start with figuring out what to map and at what level. These “whats” are, or should be, objectives/skills and concepts. The higher the level, the more general the detail. So at the Faculty level there might be a skill of “understand scientific process”, while at the course level it might be “refine experimental processes based on the application of research findings”. Each of these items must have some level at which it is assessed. Usually, the Faculty does not and cannot do the assessment, so this falls to courses. Courses that are specific to certain streams or are requisites for graduation should be identified, along with their assessments of the higher-level skills/objectives (e.g. from the Faculty or department). Assessments should define the type (MC exam, paper, project) and identify their various levels of achievement – think rubrics here. Once the objectives and assessments are all identified, they can be mapped in a single unified chart that shows who, what, when and how objectives are being met and what overlap or gaps, if any, exist. The higher-level process should most likely be done through meetings with Deans and Chairs, and the course level should be done with program coordinators for larger courses or, in smaller courses, with the instructor.

Working with the instructor, the process should not really involve as much talking as working through a spreadsheet/worksheet. Each course should be gone over to identify how it meets what is defined at the Faculty and department levels. Each course should then add its particular goals and objectives, and also identify the pre-requisites and “post-abilities” related to it, to see how, in addition to the defined objectives, courses are connected. The course-specific additions might reveal objectives that were not identified at the higher levels (for better or worse). This is all with the understanding that, with the exception of a small percentage, students really only ever remember stuff long enough to answer questions on the exam.

To get this process done, there needs to be a timeline and some manner of reward for participation. Ideally, the reward would be that the students are better served, but as many of us know, teaching is often secondary to research or other duties for many instructors, and in the end, the thing that talks loudest is money. So if the participation can be shown on an annual review, so much the better. If instructors are keen on teaching but claim that they should be able to enjoy “academic freedom” while teaching, they can be told that if they participate, and get others to do so as well, they will better know what students are coming into the course with. This saves time for everyone and allows personal interests to be explored with the knowledge that students might actually be able to deal with what is being covered – and that the instructor isn’t just producing noise that has to be tuned out with iPods.

In the end, the grid would be a table with the courses and their respective assessments in a single column down the left side, and the skills and objectives identified for each level across the top.
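
To make that concrete, here is a minimal sketch of such a grid written out as a spreadsheet; the courses, assessments and objectives are invented examples of mine, not from the session.

```python
# A hypothetical sketch of the curriculum mapping grid: course/assessment
# rows down the left, skills/objectives across the top, with an "X" where
# an assessment addresses an objective. All names are invented examples.
import csv

objectives = [
    "understand scientific process",    # e.g. a Faculty-level skill
    "refine experimental processes",    # e.g. a course-level skill
]

# (course, assessment) -> set of objectives that assessment addresses
mapping = {
    ("SCI 101", "MC exam"): {"understand scientific process"},
    ("SCI 301", "project"): {"understand scientific process",
                             "refine experimental processes"},
    ("SCI 499", "paper"):   {"refine experimental processes"},
}

with open("curriculum_map.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Course / Assessment"] + objectives)
    for (course, assessment), hits in sorted(mapping.items()):
        writer.writerow([f"{course} - {assessment}"]
                        + ["X" if o in hits else "" for o in objectives])

# Gaps show up as near-empty columns; overlap as columns full of X's.
```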
