Better yet, what am I finding out as I jump through the necessary hoops to get my research approved by the Research and Accountability department? I have the survey created for the participants and the screencast tools I need. Creating the appropriate questions to gauge effectiveness has proved to be the hardest part. It’s also difficult to pin down exactly what I’m looking to clarify from the information I’m assuming I’ll receive.

First off, I don’t want to make this a comparison to traditional in-service professional development. I won’t be delivering a whole concept or introducing a video manual that covers a wide range of technical steps. The screencasts would only replace those situations where a teacher or staff member makes an appointment for me to come in and clarify part of an application, or a smartboard, projector, or document camera issue. Making an appointment works fine, but it may take a few days before I’m available. So I am going to be explicit about what the screencasts are for. They take some time to make, and if I replaced every appointment with a screencast, I might never leave the computer. I’m also becoming aware that not all screencasts are created equal. Some are going to be better than others, and it can’t be assumed that everyone in my position will feel comfortable creating one.

So this week, I’m hoping to have at least five participants who have some tech issues they need clarified, and I’ll start getting data from my surveys. Until then, I’m making sure my delivery system works. I’m going to make a screencast for my mom, who has a myriad of tech issues to choose from. I’ll have her complete the surveys, and then I’ll hear about how the whole process went. I might have to modify the process from there.