You: When are you finished?
Me: I am never finished (smiles wide)
You: Why don’t you yet have your degree?
Me: Ah…because (smiling wider)
You: Why because?
Me: Because… so, how are YOU doing?
I’m not sure if this is a trend or something I just noticed. There is some great sharing going on out there–not just a lesson plan here or there, but whole courses’ worth of stuff.
I first noticed it when I stumbled upon this research methods course (which emphasizes critical thinking–my favorite kind of thinking): https://online225.psych.wisc.edu/
And today, I found this little gem: https://davidlabaree.com/2019/03/04/a-class-in-academic-writing-for-clarity-and-grace/ It is a course in academic writing that uses some of my favorite books as texts (hello, They Say / I Say: The Moves That Matter in Academic Writing) and excerpts from other faves (Bird by Bird, Politics and the English Language, etc.).
The world is full of good stuff, y’all.
I started collecting active learning activities from online resources for a train-the-trainer curriculum, which is part of a state-level grant in PA. The students will be youth and adults. I am using Designing Courses for Significant Learning (Fink, 2013) as a development framework. I have chosen Accelerated Learning as the instructional strategy, though I have modified it somewhat. I prefer active learning activities, but I don’t plan to be rigid in that preference. The curriculum will be in draft form, and I expect to receive extensive feedback from potential students. With Accelerated Learning, I need these types of learning activities:
These are the activities I have selected thus far:
From U Waterloo Center for Teaching Excellence, which also has small group activity ideas here:
And activities I sometimes use in my own classroom (but cannot attribute to an online source):
I’m still building my activity arsenal. The resources from Vanderbilt, University of MN, and DePaul look interesting, so I’ll continue my search for unique learning activities.
In this post, I want to document two challenges that may be lessened by using an argumentation graphic organizer (such as the one in the link) when setting up a common culminating assignment in the tech comm classroom: the technical presentation.
I am still struggling to get students fully on board through the semester-long portfolio assignment. The portfolio is a virtual location in which to document and store pre-writing assignments–both stand-alone personal discovery assignments and scaffolded assignments that culminate in the technical presentation. I believe issues with its layout and setup could be contributing to its less-than-seamless feel, but perhaps graphic organizers (such as the one in the link above) and other structured exercises stored in the portfolio could help students see it more as a virtual workspace.
This challenge is specific to technical communication pedagogy: students often struggle to understand technical and scientific writing as inherently persuasive (or humanist–Miller, 1979). When defining audience and purpose for the technical presentation, I sometimes give in and relax the all-rhetoric-all-the-time conversation to allow students to choose between “to persuade” and “to inform” as the purpose for their technical presentation. Then I try to nudge them in the direction of discovering informing as a means of persuading. My ultimate goal, however, is for students to understand the rhetorical nature of all communication.
I follow some of the pedagogy in The Writing Revolution, which advocates embedding writing into the content part of a curriculum. It may be a bit odd to embed writing into, well, writing–especially since the learning outcomes of a tech comm course often specify some version of “use writing as a tool to do work”. In other words, students are expected to use the skills they learned in K-12 schooling and Composition 101 to solve real-world problems. However, writing is a skill best practiced continually; as such, writing practice is an inherent part of tech comm pedagogy. Embedding writing practice into tech comm pedagogy supports what I have long felt is necessary in teaching this subject–no throw-away assignments. In other words, all the assignments in the course should (ideally) either benefit the student directly through the development of meta-cognition (writing is thinking) or apply to a scaffold that culminates in a final product (an analogous embedding strategy can also be seen in the Critical Thinking Initiative, which I highly recommend).
Is a graphic organizer age-appropriate for adult students? The example in the link above is, indeed, presented at the ninth-grade level, and graphic organizers are often associated with younger students. On the other hand, much of the note-taking research done with graphic organizers and adult students does suggest significant improvement in recall outcomes associated with their use (cognitive load theory and other neuro-related theories are often cited as possible explanations). Since I have taught primarily online, I have not used graphic organizers. Still, I believe a graphic organizer need not be pedantic or ninth-gradeish, so one might be helpful for developing argumentative writing.
Hi all,
I will be presenting with Dr. Joan Kester at the DCDT Conference in Cedar Rapids, IA about digital tools for students. Below, please find links to some digital tools that we find interesting and promising for students with disabilities of transition age.
This is a new and promising area of digital tool management.
These apps may be helpful for students who struggle with frustration management: many teens have their phones with them at all times, and they can use these apps discreetly because it simply looks like they are checking their phones.
The purpose of this study was to explore digital agility and digital decision-making for students with disabilities in the context of a specific project—LEXDIS. Participants were 31 higher-education students (17 female, 14 male) in the United Kingdom, all younger than 20 years old. The study used a participatory framework in which individuals with disabilities were involved as consultants or designers (or chose to disseminate the final work).
The research design consisted of three phases — online survey, interview, and focus group. The intervention was learning about, using, and participating in the design of LEXDIS. Outcomes were coded and mapped against a framework of digital inclusion: resources and digital decisions, which were categorized as technology, personal, and context (social). The framework was designed to capture digital inclusion beyond accessibility and knowing how to do things with digital tools. In other words, it was designed to capture the complex, multi-layered nature of digital inclusion.
Technological = physical and material resources
Personal = human or mental resources
Contextual = temporal, social, or cultural resources
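The coding framework above is essentially a mapping from observed digital decisions to resource categories. As a purely illustrative sketch (the three category names come from the study; the example decisions, counts, and function are hypothetical), the mapping might look like this:

```python
# Illustrative sketch of the LEXDIS-style coding framework:
# each observed digital decision is tagged with one of three
# resource categories. Category names come from the study;
# the example decisions below are hypothetical.

CATEGORIES = {"technological", "personal", "contextual"}

# Hypothetical coded observations: (decision, category)
coded_decisions = [
    ("customised desktop icons and colors", "technological"),
    ("avoided social networking due to extra time cost", "personal"),
    ("asked peers which forum to use for coursework", "contextual"),
    ("used text-to-speech for reading PDFs", "technological"),
]

def tally_by_category(observations):
    """Count coded decisions per resource category."""
    counts = {c: 0 for c in sorted(CATEGORIES)}
    for decision, category in observations:
        if category not in CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        counts[category] += 1
    return counts

print(tally_by_category(coded_decisions))
# prints {'contextual': 1, 'personal': 1, 'technological': 2}
```

The point of the sketch is only that the framework is a closed set of categories: every decision gets coded somewhere, which is what lets the researchers map inclusion as something richer than accessibility alone.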
The study revealed that all students customised their digital devices (icons, colors, etc.). Most students owned a phone and a laptop. Most used instant messaging, discussion forums, and social networking sites, and uploaded videos or photographs to the Internet. All students used Google or another search engine to access information and had used online learning materials. They also used word processing programs (Google Docs), spreadsheets, and email.
Students described many strategies in using digital tools. They expressed a high level of confidence in their usage.
Factors that influenced students’ use of technologies included technological factors (affordances) and personal factors (feeling stigmatized when using assistive technologies in public). Some students reported that they did not use social networking because it takes them “twice as long as everyone else to do it” (which speaks to perceived value).
The study identified digital agility in these students. The researchers encourage educators to avoid seeing students with disabilities as victims of exclusion; they support an empowerment model.
Pasteur’s quadrant refers to a section of the Quadrant Model of Scientific Research introduced by Donald E. Stokes (1997). The model features the work and work habits of three inventor-researchers: Bohr, Pasteur, and Edison.
Just to give you some background information, we are all aware of a tension between teachers and researchers. Even if you do not notice it in your own work, you might see it in other classrooms or with other researchers.
In this paper, the researchers explore the tension between teachers and researchers through the lens of discourse theory. Discourse theory is studied widely in communication because it recognizes that “language alone cannot account for meaning” in communication. Discourse theory takes into account the discourse community, a group of folks who use similar language to communicate. The researchers describe a discourse community as an “anointed guardian of the truth”; thus, the language its members use to describe and discuss knowledge and knowledge production–words, emphases, syntax, etc.–defines them as part of that discourse community. We use discourse communities to understand how people belong together and whom to include; however, when we include some people, we also exclude others.
Between the discourse communities of teachers and researchers, there is a sense that teacher knowledge/wisdom is a separate and lesser category of knowledge. This can create indignation toward researchers for excluding teachers from their discourse community. It may even give rise to a “resistance culture” wherein knowledge production is limited or even halted by the actions of members of both discourse communities. In these situations, it is important to recognize that it’s not that teachers don’t like research or evidence-based practices (EBPs); they simply value a different approach. Teachers tend to value practice-based evidence (PBE), which is relevant and externally valid, while researchers value EBPs, which are rigorous and internally valid. Because discourse communities are closely linked to identity and trust, it is unlikely that logical appeals for everyone to “just get along” will be enough to integrate EBP and PBE.
Smith et al. say we don’t have to choose. They present the Stokes (1997) model, which frames research as synergistic, taking a both/and approach rather than an either/or approach. In the model, Bohr (who developed the modern model of the atom) represents the quest for knowledge without consideration of use. Edison (who invented lots of things) represents the view that deeper scientific knowledge is secondary to the development and application of useful products. Pasteur brings these together, combining scientific efficacy with real-world effectiveness. In the model, practice and research are complementary. This is a use-inspired basic research model.
So, how is this done? How can we implement this model? The researchers suggest we use Education Design Research (EDR) and implement it through Communities of Practice. The authors do not suggest this as an actual methodology, but as a “framework for a range of methodologies.”
External validity: the validity of generalized (causal) inferences in scientific research, usually based on experiments. In other words, it is the extent to which the results of a study can be generalized to other situations and to other people.
Internal validity: how well an experiment is done, especially whether it avoids confounding (more than one possible independent variable [cause] acting at the same time). The less chance for confounding in a study, the higher its internal validity.
Education Design Research: developed by Brown (1992) and Collins (1992). Suggests that “rigorous education research should take place in complex educational environments.”
Communities of Practice: groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly (Wenger-Trayner, n.d.).
Characteristics of EDR/CoP
Stages of EDR
Implement, validate, and scale up
Starting large research projects can be cumbersome. A more agile approach might be to become more sensitive and react to contextual factors.
Scaling up is a process by which interventions are implemented small-scale, validated, and then implemented on a larger scale. IES defined a need for understanding the organizational conditions needed to support an intervention and determine the effects of selected moderators of the intervention. Even schools that implement successfully struggle with sustainability due to competing priorities, changing demands, and teacher/staff turnover.
Dunlap (2009) and Coburn (2003), taken together, suggest four stages: emergence, demonstration of capacity, elaboration, and system adoption and sustainability.
Persistence after funding ends is low, even when the results are good. Cultural changes are required, and local contextual features of an educational system must be acknowledged and managed. Other struggles occurred when scaling up was the domain of education policymakers, not researchers/teachers/administrators (but of course, right?). Funding agencies have also been challenging because they did not account for local complexities. IES funding structures typically wanted a “standard” implementation; they did not want to implement in an ad-hoc manner. From my experience in software development and implementation, I’d completely agree with that approach. Ad-hoc implementations lose the power of economies of scale and become troublesome/expensive to maintain. In software, we required business units to adapt to us, while education is a completely different deal.
Implementation science may be the missing link between standardized practices and successful implementations. Implementation science addresses adoption decisions, capacity building, training, technical assistance, consumer participation and satisfaction.
*Under what conditions and with whom does an EBP work?
*Why is it necessary to support teacher implementation of an EBP?
*What is necessary to increase the capacity of districts across different ecological and population differences?
*What is necessary to support deep, broad, sustainable implementation of the EBP?
Examples
*Classwide Peer Tutoring (CWPT)
*Peer Assisted Learning Strategies (PALS)
*School Wide Positive Behavior Supports (SWPBS)
Success factors
*Maximizing contextual fit between the EBP and the educational environment
*Promoting the EBP as a priority
*Ensuring fidelity of implementation
*Increasing efficiency by integrating the EBP into daily school operations
*Using data to make ongoing decisions about the EBP
It is important to strike a balance between implementation fidelity and teacher flexibility.
Factors that can support this balance: PD and district leadership.
When we fail to scale-up interventions, the result is a gap between research and practice.
*Lack of trust
*Misunderstandings
*Not valuing the input of all stakeholders equally
*Different beliefs and philosophies
*Tendency to dismiss evidence that does not support our pre-existing views
*Overstatement of a practice’s effectiveness or generalizability
PD on scale-up is highly recommended
Evidence-Based Practices and Implementation Science in Special Education
The value of EBPs is limited by the quality, reach, and sustainability of implementation practices. The gap between research and practice in special education is persistent and confounding to all who care about children’s futures. Some combination of EBPs and Implementation Science may help bridge the gap at some point. The challenge to discover the “secret sauce” is ongoing.
The crux of this article (in the context of 8304) is its recommendation to use an Implementation Science Framework to increase inclusion of pre-school students with disabilities in classrooms with typically-developing children. Implementation Science explores how a particular evidence-based policy can be successfully implemented in an educational system. It suggests there are particular leadership and organization supports that increase the chances of facilitating lasting change in educational systems. For example, the following practices were recommended: creating work groups to focus on identifying local policy barriers to inclusion, appointing community leaders to address attitude and belief challenges in the local population, and enlisting state directors of special education in establishing short- and long-term goals related to inclusion.
The implementation science angle reminds me of the work I did in a course on UDL. The final project in the course was developing a system-wide plan for change and support of implementation of UDL principles. It seemed widely recognized that one educator, one principal, even one school board member could not make the change to using UDL as an educational foundation on their own. Instead, the entire system must be revamped to reflect the iterative, change-oriented atmosphere needed to support its implementation. In addition, the system of implementation recommended in the course was to use UDL to implement UDL. How meta, yes? This was an interesting feature, which perhaps does not apply to all implementations of change. Even so, the most important part of the final project was engaging all parts of a system in the change (using UDL to implement UDL). Additionally, continuous PD was emphasized as crucial to sustaining interest and enthusiasm. I wonder if this method worked because of the “using UDL” part or the “systemic engagement” part (as an instantiation of a successful Implementation Science approach).