Looking Past the Other in Digital Communication

*

When reading articles on the internet, it’s probably a good idea to just steer clear of the comments section. This is especially true when reading something related to issues of equity or accessibility for all. Trolling in the form of racist, sexist, and other fear-fueled rants can seem like the dominant mode of communication for many participants in this space. It can seem as if individuals are talking over and past one another, and communication is not founded on true dialogue.

Dialogue, Paulo Freire asserts, is an “existential necessity” that is inherently a vital part of learning (1968, p. 89). The act of participating in dialogue is an indicator of humility and the willingness to learn. It can provide participants the opportunity to recognize one another, an essential element of identity development and respect. The willingness to critically examine our biases is at the heart of learning about the other and ourselves. It is our responsibility to one another (Buber, 1937). This may seem frightening, because it essentially places us in the unknown, the uncomfortable place of not being able to easily categorize and identify others. That easy identification makes life very simple. However, being uncomfortable is the only way we truly grow. The old saying reminds us that moss does not grow on a rolling stone. Stasis equates to a stillness that is not unlike death.

St. Johns River, Jacksonville, 2017



Margaret Wheatley expresses the significance of being uncomfortable: “We can’t be creative if we refuse to be confused. Change always starts with confusion; cherished interpretations must dissolve to make way for the new. Of course it’s scary to give up what we know, but the abyss is where newness lives. Great ideas and inventions miraculously appear in the space of not knowing. If we can move through the fear and enter the abyss, we are rewarded greatly. We rediscover we’re creative” (2002, p. 37). This discovery can help give meaning to our lives and enrich those with whom we interact. Basically, we learn more about ourselves through others. It sounds simple…and it is…if we are ready to be uncomfortable.


MIT Press, 2017



Byung-Chul Han’s most recent English translations, The Agony of Eros (2017a) and In the Swarm (2017b), both discuss the absolute need for our encounter with the other. He warns that the digital medium of expression “is taking us farther and farther away from the other” (2017b, p. 24). Our ability or inability to articulate ourselves is exacerbated in the digital medium, where “nonverbal forms of expression such as gestures, facial expressions, and body language” are lost almost completely (2017b, p. 21). Without planning for this loss, we are left with no “other” through whom we might view new perspectives and understandings of the world. The visual images are constructed for us to see ourselves (or our closest analogue), thereby making everything the same. This massive normalization ends the need for an other, and it destroys the possibility for imagination or fantasy (2017a). We must be able to perceive through another viewpoint, one that is truly the opposite of the one we hold, so that we may engage in thinking that is infinitely more complex.

Without confrontation with the other, we are doomed to live empty lives, lonely and incomplete. A small piece of a recent poem by Joshua Marie Wilkinson (part of The Easements, a series of poems that each begin with a line from Osip Mandelstam) reads:

“as I’ve found in the stars

no friend, the lake

no brother, the current

no story to live with.”


I don’t know why my thinking takes this path, but it reminds me of the other as the source of desire, the source of a true narrative. Without the necessary encounter with the other that produces co-constructed knowledge for the benefit of both participants, our individual life stories are stillborn (Han, 2017b).

Perhaps comments and social media posts are not really avenues for actual discussion. If that’s the case, though, I don’t understand the point of providing a vehicle for reader voice at all, if not to inspire dialogue.

Embrasures at Fort Clinch, Fernandina Beach, Florida (2017)


Note- The article that prompted this brief line of thought is located here. The comments section yielded some replies that were blatantly racist and sadly myopic.



Brief Reflective Notes on the Leadership of E-Learning, Technology and Creative Services

*

Although this department (ETC) has been a fixture of UF’s College of Education for a number of years, this past year has been one of optimizing services. Throughout the past year, our department has coalesced into a very agile and forward-thinking group composed of five distinct sub-teams. These teams, usually not found clustered in one department, all work intimately to help our faculty reinvent online education practice and implement new ways of teaching and learning; build engagement and support for alumni, current, and future students; create web designs that leverage learning, usability, and aesthetic design; and support the building of collective efficacy and collaboration through internal marketing and awareness. The main pursuit of this office is to become a leader in instructional design for the university and the field of higher education.

Instructional Design for Online Learning in Higher Education

It is generally acknowledged that online educational experiences offered by most institutions of higher education do not reflect identified high-yield learning strategies (e.g., Hattie, 2009; Marzano, 2009), specific strategies for the online environment, including frequent and specific feedback (Mandernach & Garrett, 2014; Mayer, 2015), or the teacher presence (Ragan, 2012) found in their face-to-face counterparts (Berrett, 2016). A recent national survey conducted by the Association of American Colleges and Universities (2016) suggests the problem may rest in multiple areas, including the preparation of faculty and staff to create pedagogically sound digital learning opportunities. The report suggests that “high-impact educational practices are offered by many institutions, but rarely required.” Additionally, the findings indicate that approximately 36% of Chief Academic Officers report that “most of their current faculty members are using digital learning tools effectively in their courses.” This seems to ring true. Incidentally, the UF College of Education (CoE) has earned its first #1 ranking from U.S. News and World Report for our online graduate programs during my tenure. However, “faculty credentials and training” was still cited as an area of need in the scores that make up this ranking.

Our College of Education and its faculty have the greatest experience on campus in planning and implementing innovative pedagogical change in any context, including but not limited to online learning. This wellspring of expertise must inform future University of Florida endeavors in online education. Recent work in the area of multidisciplinary approaches to teacher preparation offered online includes the newly formed Center for Elementary Excellence in Teacher Preparation, the cross-department institution of teacher observation and mentoring through synchronous and annotated video solutions, the cutting-edge research agendas (including the exploration of cognitive and social neuroscience methodologies and technologies) of Educational Technology faculty, video-based research conducted in SESPECS, and the digital outreach efforts to communities of learners led by our CoE-based centers. It is imperative that the teaching and learning research ecosystem fostered here at the College of Education be leveraged in support of the growing need for expanded online degree offerings and highly individualized learning environments.

Brief Notes re: Strategies in Redesigning ETC in 2015-2016

Communication and Collaboration

One of ETC’s main goals for this past year has been investing in relationships, connecting departments doing similar or complementary work, and supporting the improvement of all online activities. The first collaborations included the analysis and restructuring of hardware (servers) and a gap analysis of current websites. This massive undertaking (three months) was a change that could happen through collaboration with IT and wouldn’t necessarily impact the ETC staff directly. In effect, this change, and the rebuilding of the relationship between the two offices, allowed the instructional and cultural changes to happen more gradually. This direction allowed us to plan for the slower change of the “behaviors of people” in our department over time (Deutschman, 2005).

Relationships with key faculty stakeholders, specifically department chairs, were revisited with renewed vigor and transparency. I led this charge, supported by our administration and our instructional design team. Additionally, the web design team leader assisted with the “soft sell” of our services, creating digital “profiles” for key department areas.

Employing Research-based Attributes of Highly Effective Online Learning

Our team has led the way in implementing attributes associated with effective online learning. This work is backed by the understanding that designing online educational experiences founded in learner motivation and interest relies on shared contextual learning activities that promote the use of technology in service of authentic online collaboration and interaction (Sawyer, 2016) while supporting a personalized learning approach (U.S. Office of Educational Technology, 2016).

Specifically, we have worked to offer online learning opportunities that promote explicit articulation of student outcomes, the integration of assessments (formative and summative), learning designs promoting self-directed and collaborative learning, and professional development strategies that assist faculty in embracing and utilizing technology effectively for teaching and learning (U.S. Office of Educational Technology, 2016).

Learning Asset Production and Digital Asset Management

Early in the transition, it was agreed that investment in high-quality videography and other learning material design was a priority in enhancing and/or redesigning the existing online courses, so we created a mobile video unit and a small studio. Furthermore, the investment in these resources would help other areas of the College of Education, including the Office for Alumni Affairs and News and Communications. The video and photography digital learning assets produced support three main areas of work:

  1. Research-based video observation for learning (e.g., teacher video self-reflection or leader preparation in observation practice to inform instructional improvement). This focus is supported by recent research in video-based teacher observation for reflection on practice (e.g., Gates Foundation, 2010; Stigler et al., 1999), in teacher preparation and professional development support (Guaden & Chalies, 2015), and evaluation (e.g., Kane, Wooten, Taylor & Tyler, 2011).
  2. Classroom video examples, lectures, and expert interviews as digital pedagogical support. The literature informing this work includes the measurement of student engagement in video-rich MOOCs (e.g., Guo, Kim, & Rubin, 2014), the examination of the impact of case-based video assets for instructional design (e.g., Goeze, Zottman, Schrader, & Fischer, 2010), and the review of the impact of teaching video used in professional development courses (Borko, Koellner, Jacobs, & Seago, 2011).
  3. Marketing and awareness video for external stakeholders of the College of Education (including alumni, partners, and legislators).

In addition to video, which has increasingly become vital to our work, our department has applied instructional, user-centric design principles to everything from websites to paper-based marketing material for programs. The team has built and maintains a digital asset management (DAM) system with photography and video archives that may be accessed by media and communications personnel throughout the college. Illustration, animation, graphic design, and branding were all employed to assist in redesigning the aesthetic look of courses, websites, ideas (e.g., STEM Hub and logic models for grant applications), and physical space (e.g., banners, posters).

Cultural Change

In the effort to improve the culture of the College of Education’s Office for E-Learning, Technology, and Creative Services (ETC), we have explicitly engaged in an initiative that has motivated the internal stakeholders of our office to revisit our commitment to improving and supporting online and hybrid instruction for all degree and certification programs. I have worked closely with each of our internal teams (instructional design, web design, creative media production, systems administration, and student support services) to establish attainable but rigorous goals and have provided opportunities to build processes to achieve their goals. We planned a retreat to revisit and explore our identity and better understand our mission, to interrogate our shared beliefs and values as a group, and to plan strategies to build and strengthen relationships across our college and the university (Wheatley, 2005).

Mark Dinsmore (Associate Director for Enterprise Systems) and I targeted staff to take on informal and unofficial but recognized leadership roles, mentoring and reinforcing goals and objectives daily within small groups. We instituted a weekly department huddle with a focus on shared “project-based” discussion. We also created an “onboarding” series of strategic meetings for all new programs and those being redesigned, involving every facet of the department. This continuous project/program-based improvement model of group meetings and individual mentoring allowed all teams to engage in discussions.

Implementing Uniformity in Processes of Support and Production

All sub-departments of ETC have been assisted in documenting and codifying processes for production of digital learning assets, courses, websites, reports, etc. This work has been difficult but has provided uniformity to the stages of design and delivery of learning experiences for all courses and programs. Our team has worked to become cohesive and build on strengths associated with assisting faculty, students, and the College of Education.

Some Foci of the Department in 2015-2016

  • Creating innovative CoE course content production that includes video, photography, graphics, animation, software, etc.
  • Designing or optimizing online pedagogy, grounded in research-based best practices.
  • Refining data analysis for strategic support of all departments.
  • Supporting faculty innovations, research, and outreach/communications.
  • Supporting student recruitment, alumni and student engagement, and success through effective web strategy (social media, web redesigns, graphic design, illustration, etc.) and student services.
  • Hosting and supporting infrastructure for products ranging from web applications to large databases used in research or in testing.


References

Association of American Colleges and Universities. (2016). Recent trends in general education design, learning outcomes, and teaching approaches. Retrieved on April 1, 2016 from http://www.aacu.org.

Bill & Melinda Gates Foundation. (2010). Measures of effective teaching (MET) project: Working with teachers to develop fair and reliable measures of effective teaching. Retrieved on December 7, 2012 from http://metproject.org/downloads/met-framing-paper.pdf

Berrett, D. (2016). Instructional design: Demand grows for a new breed of academic. The Chronicle of Higher Education, March 4, 2016.

Borko, H., Koellner, K., Jacobs, J., & Seago, N. (2011). Using video representations of teaching in practice-based professional development programs. ZDM Mathematics Education, 43, 175-187.

Derry, S., Pea, R., Barron, B., Engle, R., Erickson, F., Goldman, R., Hall, R., Koschmann, T., Lemke, J., Sherin, M., & Sherin, B. (2010). Conducting video research in the learning sciences: Guidance on selection, analysis, technology, and ethics. Journal of the Learning Sciences, 19, 1–51.

Derry, S., Sherin, M., & Sherin, B. (2014). Multimedia learning with video. In R. Mayer (Ed.), Cambridge handbook of multimedia learning (pp. 785–812). New York: Cambridge University Press.

Deutschman, A. (2005). Change or die. Fast Company, 94, 53-57.

Fullan, M. (2009). Turnaround leadership for higher education. San Francisco, CA: Jossey-Bass.

Goeze, A., Zottman, J., Schrader, J., & Fischer, F. (2010). Instructional support for case-based learning with digital videos: Fostering pre-service teachers’ acquisition of the competency to diagnose pedagogical situations. In D. Gibson & B. Dodge (Eds.), Proceedings of the Society for Information Technology and Teacher Education (SITE) International Conference 2010 (pp. 1098-1104). Chesapeake, VA: AACE.

Hattie, J. (2009). Visible-learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.

Kane, T. J., Wooten, A. L., Taylor, E. S., & Tyler, J. H. (2011). Evaluating teacher effectiveness in Cincinnati public schools. EducationNext, 11(3).

Mandernach, B. J., & Garrett, J. (2014). Efficient and effective feedback in the online classroom. Magna Publications White Paper. Retrieved on March 28, 2016 from http://www.magnapubs.com/white-papers

Marzano, R. J. (2009). Setting the record straight on “high-yield” strategies. Phi Delta Kappan, 91(1), 30-37.

Mayer, R. E. (2015). The Cambridge handbook of multimedia learning (2nd ed.). New York, NY: Cambridge University Press.

Ragan, L. (2012). Creating a sense of instructor presence in the online classroom. Online Classroom, 12(10), 1-3.

Sawyer, K. (2016). The Cambridge handbook of the learning sciences (2nd ed.). New York, NY: Cambridge University Press.

U. S. Office of Educational Technology. (2016). Characteristics of future ready leadership: A research synthesis. Retrieved on April 2, 2016 from http://tech.ed.gov/leaders/research/

Wheatley, M. (2005). Finding our way: Leadership for an uncertain time. San Francisco, CA: Berrett-Koehler.

Q Methodology: A Brief Background and Sample Pilot Study with School Principals (Student Paper Draft, 2007)

*

No copy-editing has occurred to give this post more clarity. APA style is mostly ignored. But the general curiosity remains. I have always been interested in identity and how we define ourselves throughout our lives and careers. The following was a brief paper and pilot “study” completed with a group of principals in 2006.

Very Brief Background

Q-factor analysis originated soon after Charles Spearman invented factor analysis at the start of the twentieth century. Factor analysis, according to Steven R. Brown (1980), has historically been “used as a procedure for studying traits”. In this role, factor analysis has been popularized by the social and political sciences. However, Brown explains that factor analysis can also be used to factor persons, thereby creating what William Stephenson (1953) terms “person-prototypes”. This, Brown asserts, requires a separate methodology. This methodology, entitled Q, is described by Hair (1998) as “a method of combining or condensing large numbers of people into distinctly different groups within a larger population.”

Although G.H. Thompson was the first researcher to work with Q-factor analysis, he was not positive about its future (Brown, 1980). He believed that it had serious deficiencies, which I will discuss further in a moment. However, William Stephenson was more excited about the possibilities of Q. Since its discovery, Q-factor analysis has been used widely in the social and behavioral sciences.

The main structural difference between Q and R analysis is summed up by Raymond Cattell’s description of the “data box” (1988). Cattell names three main components of a factor analysis: persons or cases, items, and occasions. He said that how we organize these components structurally changes the procedure. For example, in R-factor analysis the items signify columns on a matrix, while the persons completing the items represent rows. In this arrangement, the items are grouped to create fewer factors, thereby creating types of items. Inversely, in Q-factor analysis, one can place the persons in the columns and the items in the rows. This process creates the person-prototypes previously mentioned.
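The transposition at the heart of the data box can be shown with a toy matrix. This is only a hedged sketch: the data values and dimensions below are invented for illustration and come from nowhere in the study. Correlating the columns of a persons-by-items matrix gives the R orientation; correlating the columns of its transpose gives the Q orientation.

```python
import numpy as np

# Invented data: 5 persons (rows) x 4 items (columns) -- the usual
# R-factor layout described above. Values mimic Likert-style responses.
data = np.array([
    [7, 2, 5, 1],
    [6, 3, 5, 2],
    [1, 7, 2, 6],
    [2, 6, 3, 7],
    [4, 4, 4, 5],
], dtype=float)

# R orientation: correlate the ITEMS (columns) -> types of items.
r_corr = np.corrcoef(data, rowvar=False)    # 4 x 4 matrix

# Q orientation: transpose so PERSONS become the columns/variables,
# then correlate -> the raw material for person-prototypes.
q_corr = np.corrcoef(data.T, rowvar=False)  # 5 x 5 matrix

print(r_corr.shape, q_corr.shape)  # (4, 4) (5, 5)
```

Note how the first two (invented) persons, whose response patterns are similar, correlate highly in the Q orientation; that is exactly the similarity a Q analysis condenses into person-prototypes.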

The person-prototype idea is one that has revolutionized the social sciences. Researchers are able to make a case for a certain person type linked to various areas of behavioral disorders. One such study, conducted by Porcerelli, Cogan and Hibbard (2004), was designed to better understand the personality traits of men who were violent towards their partners. The Q-sort was very large, 200 items long, and was completed by several psychologists and social workers very familiar with the many cases of domestic abuse. The end result supported the notion that these men were “antisocial and emotionally dysregulated.” Thus, it may be argued that these violent men have some things in common that make them stand out from others as a person-prototype.

Q-methodology has been employed by other fields of inquiry recently as well. In Woosley, Hyman and Graunke’s work with student affairs problems on college campuses, using a population of only three, the researchers wanted to explore whether Q would be a promising evaluation tool for the student experience (2004). They found, when asking these participants to sort ideas concerning their jobs on campus, that the students were excited about the process. During a post-sort interview, they all expressed enthusiasm for the activity and the results.

Controversy

Even with positive stories of Q like these, there are a few reasons why some researchers refuse to use this methodology or see any potential for its use. For example, one could easily discern from the discussion of the data box that a researcher could simply take a set of data gathered for an R-factor analysis and apply it to the Q-structure, thereby completing another full analysis of the same information. Cyril Burt championed this form of usage in the 1930s (Stephenson, 1953). This was one point of contention for Stephenson, who explained that the procedure for collecting the data was part of the methodology. The Q-sort, the activity of participants physically sorting items in a prescribed pattern under certain conditions, was part of the overall methodology; one could not collect data for the specific purpose of running an R-factor analysis and simply rearrange it in a way appropriate for Q-analysis.

Many researchers disregard Q-factor analysis due to its lack of generalizability. They may claim that such a small sample could never be applied to a much larger population. In this respect, they may be correct. A Q-analysis is really meant to be something like a case study. It may be applied in some fashion to another situation, but the data collection is of a moment in time, or an occasion.

One of the main reservations I have with Q-methodology is the focus on researcher-designated language. The language of the items selected for the sort is chosen by the researcher, not the participants. Thus, there may be some error in communication.

Sample Q-Sort Methodology

The particular focus of my sample Q-sort was a group of principals currently participating in the North East Florida Educational Consortium Principal Leadership Academy (PLA). The academy is only a year old, and the current version is a pilot run of a program designed for principals who are undergoing preliminary training to facilitate a school-wide action research project. The academy is composed of twenty-four participants, ranging from early-career principals with little experience to seated principals with a great deal of experience. Because the leadership experience was quite varied among the group members, my hypothesis was that these principals could be arranged into groups by experience and/or leadership style.

The items I decided to use in the Q-sort were the behaviors that the state recommended to districts as being associated with the ten newly adopted Florida Principal Leadership Standards (April 2005). Of course, these behaviors were all optimal based on the standards. Thus, in a sort using these written behaviors, there would be no “wrong” answers. This was important in establishing trust between the participants and me. This was not a competition or an evaluation of how they relate to and sort these behaviors. If they were aware at the outset that there was no “correct” way to sort these items and that there was no evaluative component to the sort, they might be more honest in the sorting process.

Another dimension that could be added to the sort, and that might yield richer results, would be grouping the leadership behaviors into two categories: transactional leadership behaviors and transformational leadership behaviors. Because these behaviors were never verified to actually represent either form of leadership, the Q-sort would have to be labeled an unstructured sort (Appendix A).

After deciding which behaviors I would use as my items (16 sentence strips), I turned my attention to the actual Q-sort process.

Consulting Fred Kerlinger’s Foundations of Behavioral Research (1973), I was able to formulate a methodological plan. Kerlinger clearly maps out the process of setting up a practice Q-sort activity, or what he calls a “miniature Q-sort”. He writes that the participants may sort only a few items, as few as ten. This would not be optimal, he goes on to explain; Kerlinger insists that the more items one has available for the participants to sort, the better the results. Another piece of useful information was the discussion of the sort design. Kerlinger describes the physical act of sorting the items. He sets up a wonderful method of manipulating the items into a quasi-normal distribution: a Likert-type scale (with seven points) where the participant may choose whether an item is most like them or least like them, and where participants are limited in the number of items they can place at a given point. With this method of distribution, the sort resembles a normal curve. I used this example to help plan the sort activity with the principals in the PLA.

With the example below, the top line is the number of items that may be placed at each point on the Likert-type scale, and the bottom line is the scale itself. In this example, 7= items most like me, and 1= items least like me.

1 2 3 4 3 2 1
_________________

7 6 5 4 3 2 1
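The forced distribution above can be captured and validated in a few lines of code. This is a hedged sketch: the slot limits mirror the template shown, but the sample completed sort is invented for illustration.

```python
from collections import Counter

# Slots per scale point, mirroring the template above:
# 7 = "most like me" ... 1 = "least like me".
slots = {7: 1, 6: 2, 5: 3, 4: 4, 3: 3, 2: 2, 1: 1}
n_items = sum(slots.values())
print(n_items)  # 16 -- matches the sixteen behavior strips

# An invented completed sort: item number -> chosen scale point.
placements = dict(enumerate(
    [7, 6, 6, 5, 5, 5, 4, 4, 4, 4, 3, 3, 3, 2, 2, 1], start=1))

# Validate that the sort respects the quasi-normal template.
counts = Counter(placements.values())
ok = all(counts[point] == limit for point, limit in slots.items())
print(ok)  # True
```

A check like this plays the same role the monitoring described below played in person: it confirms each participant placed exactly the permitted number of items at each point.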

The sixteen items could be sorted in this quasi-distribution very easily by the participants. Each behavior strip contained a number, so that the participants could easily record the placement on the data sheet provided (Appendix B). I created ten identical envelopes containing the sixteen principal behavior strips. Then, I created a large poster displaying the procedures of the sort and the limitations for each placement. I would let the principals sort the behaviors after an already scheduled PLA meeting. They would be separated, mostly for the purpose of providing space for each participant.

On November 2, 2005, the participants completed the sort and carefully filled out the corresponding data sheet as I monitored. The purpose of the monitoring was to ensure the successful completion of the stated procedures, and this was explained to the participants. Once I collected all of the data, I began entering it into SPSS 13 to start the analysis. The SPSS software is really set up with R-factor analysis in mind; the columns are used mostly for organizing items (refer back to the discussion of Cattell’s idea of the data box). However, it is important to note that one may enter the participants in the columns as nominal data and the numbers of the sorted behaviors as the rows. From that point on, the factor analysis procedure is the same.

Analysis and Interpretation

In this discussion of the results of this particular practice Q-analysis, I will also be addressing the interpretation of results yielded by Q-analysis in general. I will refer to Tables 1-5 and the scree plot in Figure 1, produced by the SPSS software during this practice analysis.

Table 1 displays the correlations between the individuals based on how they sorted the leadership behaviors. We started with ten factors (individuals), and we are given ten separate factors in this table. This matrix enables the researcher to make some general statements about how each participant correlated with another. Remember, a 1.0 is a perfect correlation, so the 1.0s along the diagonal are each person correlated with themselves. If one consults Hair’s opinion on cut-off points, any correlation under .450 may be disregarded. This makes sense, because the researcher is really looking for correlations that are nearer to 1.0, as stated above. For example, one can see that there exists a strong correlation between LF and LB (.625). Thus, we could state that they may have sorted somewhat similarly. Inversely, M is barely correlated with LB at all (.050), leaving us to assume that these two individuals may have sorted the behaviors very differently. However, this is all that we can ascertain at this point.

Table 1.

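Hair’s cut-off can be applied mechanically. A hedged sketch follows: the LF-LB and M-LB values echo the ones discussed in the text, while the third pair is invented for illustration.

```python
# Pairwise correlations between participants (by initials). The first
# two values echo the text; the M-LF value is invented.
pairs = {
    ("LF", "LB"): 0.625,
    ("M", "LB"): 0.050,
    ("M", "LF"): 0.310,
}

# Disregard anything under Hair's suggested cut-off of .450.
CUTOFF = 0.450
strong = {p: r for p, r in pairs.items() if abs(r) >= CUTOFF}
print(strong)  # {('LF', 'LB'): 0.625}
```

Only the LF-LB pair survives the threshold, matching the reading of the table above.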

In Table 2, the researcher is focused on communalities, or how much of each original participant/factor was extracted and recreated in the analysis. The glaring observation is that N (.557) shows the least in common with the group as a whole. At this point, it would help the reader to know that N was the only non-principal participant in the sort. I offered her the chance to take part in the sort to have an even number of ten participants in the activity. N has never worked in an educational administration position.

Table 2.


Extraction Method: Principal Component Analysis.

The next step in the analysis is to look at the eigenvalues and the percent of variance that may be explained by the factor analysis (Table 3). When the data was entered, I wanted to isolate the factors that had eigenvalues of 1.0 or greater. This yielded four factors or, in the terms of the SPSS output, components. These factors/components are the “person-prototypes” discussed earlier. With this table, one can discern that almost 81% of the variance is explained in the analysis. This is very positive for two reasons: first, I can now see that four factors or person-prototypes were a good number to represent most of the variance; and secondly, the low number of four factors is a good reduction from the original ten.

Table 3.

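The extraction step itself, eigenvalues of the person-correlation matrix with the retain-at-1.0 rule, can be sketched outside of SPSS. This is a hedged illustration: the correlation matrix below is entirely invented and uses five persons rather than the study’s ten.

```python
import numpy as np

# Invented 5-person correlation matrix standing in for Table 1
# (the actual study had ten participants).
corr = np.array([
    [1.00, 0.62, 0.10, 0.05, 0.30],
    [0.62, 1.00, 0.15, 0.08, 0.25],
    [0.10, 0.15, 1.00, 0.70, 0.20],
    [0.05, 0.08, 0.70, 1.00, 0.22],
    [0.30, 0.25, 0.20, 0.22, 1.00],
])

# Principal component extraction: eigenvalues of the correlation
# matrix, sorted in descending order.
eigvals = np.linalg.eigvalsh(corr)[::-1]

# Retain components with eigenvalues of 1.0 or greater, as in the text.
retained = eigvals[eigvals >= 1.0]

# Percent of total variance explained by the retained components
# (the total equals the trace, i.e., the number of persons).
pct_explained = retained.sum() / eigvals.sum() * 100
print(len(retained), round(pct_explained, 1))
```

The same two quantities read off Table 3 (number of retained components and cumulative percent of variance) fall out of this computation.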

A scree plot of the factors will confirm that four factors/components is a good representation of the whole. To read the scree plot in Figure 1, one must look for the area at which the downward motion of the line comes to a plateau, or a leveling off. It is apparent to me that the original interpretation of the number of factors was a wise decision. The plateau of the scree appears after the fourth factor. One may argue that this point actually does not level off so much as it juts upward slightly. However, knowing that this point represents the odd man out, N (the participant with no experience in an educational leadership role), I believe that four factors truly does represent the whole in the best way. After reviewing this output originally, I ran the analysis again isolating only three factors. However, a few of the original participants/factors ended up being left out of the whole. Thus, I opted for the four-factor analysis model.

Figure 1.

[Image: SPSS scree plot]
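
Reading a scree plot is ultimately a judgment call, but the plateau logic can be made explicit. The eigenvalues below are invented, shaped to mimic the described pattern (a steep fall, then a shelf after the fourth component); the elbow is where the drop between successive eigenvalues becomes small relative to the largest drop:

```python
import numpy as np

# Invented eigenvalues shaped like the scree described in Figure 1.
eigvals = np.array([3.4, 2.1, 1.5, 1.1, 0.5, 0.45, 0.35, 0.30, 0.20, 0.10])

# Size of the fall from each eigenvalue to the next.
drops = -np.diff(eigvals)

# The plateau begins at the first drop smaller than a fraction (here
# 25%) of the largest drop; every component before it is retained.
retained = int(np.argmax(drops < 0.25 * drops.max()))
print(retained)  # → 4
```

The 25% threshold is an arbitrary illustration, which is exactly why the visual plateau, read alongside knowledge of the participants, remains the better guide.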

In the next step of the analysis, the researcher examines the extent to which each original factor is represented by the four composite factors, or prototypes. The first matrix (Table 4) shows how well each of the original components is represented by the four factors before rotation; rotation then distributes the variance more evenly amongst the factors. In other words, we can begin to see which person prototype fits each individual best. For example, M is clearly most associated with the first extracted factor. With this matrix, however, we can only begin to see how the participants relate to the person prototypes; to gain a clearer picture of the relationship between the participants and the composite factors, one needs to consult the rotated component matrix.

Table 4.

[Image: SPSS component matrix output]
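
The "best fit" reading of a component matrix is simply the largest absolute loading in each participant's row. The loadings below are invented, with the first row mimicking the pattern described for M:

```python
import numpy as np

participants = ["M", "F", "K"]
# Invented unrotated loadings: 3 participants (rows) x 4 components.
loadings = np.array([
    [0.85,  0.20, 0.10, 0.05],  # M: dominated by the first component
    [0.10, -0.80, 0.25, 0.15],
    [0.15,  0.75, 0.20, 0.10],
])

# The absolute value matters: a large negative loading still ties a
# participant to a factor, as a mirror-image sort.
best_fit = (np.abs(loadings).argmax(axis=1) + 1).tolist()  # 1-based index
print(dict(zip(participants, best_fit)))  # → {'M': 1, 'F': 2, 'K': 2}
```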

In Table 5, the output from the rotation (using the varimax criterion) describes more accurately how well each component is represented by each of the factors. With background knowledge of all the participants, I could easily see justification for each participant's placement in the matrix. I set the analysis in SPSS to arrange the output by size, so, looking at the matrix, the researcher can see the participants who share the most in common grouped together. In the first column, the first three participants are strongly correlated at .917, .868, and .659 respectively. It is interesting to note that these three principals, represented by the first factor, are the three most experienced of the participants. Additionally, these three administrators started a statewide reading reform together, meeting monthly for the last five years to discuss and share ideas about the reform.

Table 5.

[Image: SPSS rotated component matrix output]
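
The varimax rotation SPSS applies here can be reproduced in a few lines; the function below is the standard SVD-based varimax algorithm, run on invented loadings rather than the study's. The instructive check is that rotation only redistributes variance across factors: each row's communality is unchanged.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=50, tol=1e-6):
    """Rotate a loadings matrix so each row loads strongly on as few
    factors as possible (the varimax criterion)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag(np.diag(L.T @ L)))
        )
        rotation = u @ vt
        d_old, d = d, s.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
    return loadings @ rotation

# Invented unrotated loadings for 5 participants on 2 components.
unrotated = np.array([
    [0.7,  0.5],
    [0.6,  0.6],
    [0.8,  0.4],
    [0.5, -0.6],
    [0.4, -0.7],
])
rotated = varimax(unrotated)

# An orthogonal rotation preserves every communality exactly.
assert np.allclose((rotated ** 2).sum(axis=1), (unrotated ** 2).sum(axis=1))
```

This is why Table 5 can reshuffle which factor each participant loads on while the communalities of Table 2 stay fixed: rotation changes the view, not the fit.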

The next factor is represented by F (-.915) and K (.904). F, it appears, is strongly negatively correlated with K. K has been a principal for one year and worked as an assistant principal to one of the participants represented in the first factor. F was the principal of a failing school last year and is now a new principal at a K-22 special needs school. The two appear to have sorted the behavior strips almost exactly opposite of each other.
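
That near-opposite relationship is exactly what a loading pair like -.915 and .904 on the same factor implies. As a toy check with an invented sort, a Q-sort and its perfect mirror image correlate at exactly -1:

```python
import numpy as np

# Invented forced-distribution sort for K over 16 behavior strips.
k_sort = np.array([3, 2, 2, 1, 1, 1, 0, 0, 0, 0, -1, -1, -1, -2, -2, -3],
                  dtype=float)
f_sort = -k_sort  # F as a perfect mirror image of K

r = np.corrcoef(k_sort, f_sort)[0, 1]  # r is -1 (to floating-point precision)
```

F and K's real loadings fall just short of this perfect mirror, which matches the observation that their sorts were almost, but not exactly, opposite.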

The third factor has LB (.928), LF (.716), and N (.714) correlated with each other. LB and LF are both first-year principals, and N is the non-principal in the group, as stated previously. It makes sense to me that they may have sorted the behaviors similarly.

The final factor includes BA (.790) and R (.560), though R does not correlate highly with any of the four factors. This is the only grouping that contains two individuals with very little in common in their backgrounds. When I forced the analysis to create only three factors/components, BA was left out of the final grouping of factors.

Before leaving the analysis, it is important to address one last piece: the variables, or behavior strips, and their value to each of the factors (Table 6). These values are in the form of Z-scores, making it easy to see how each person prototype sorted each behavior (interpreted by columns) and how each behavior strip was comparatively sorted across person prototypes (interpreted by rows). Daniel (1990) explains that these values, or standardized regression factor scores, are "utilized to determine which items contributed to the emergence of each of the person factors." Remembering that the first eight strips were designated as transformational leadership behaviors and the second eight as transactional leadership behaviors, one can now find some patterns in how the prototypes sorted. It could be argued that the first group ranked the transformational leadership behaviors as more like them than the transactional behaviors; for example, behavior strips #1, #2, and #4 all score very highly in the matrix for the first group. In contrast, the third group scored behavior #1 negatively, less like them, although it also scored transformational strips #2 and #4 highly.

Table 6.

[Image: SPSS factor scores (Z-scores) output]
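
The Z-scores in Table 6 come from SPSS's regression-method factor scores; a simplified stand-in (invented sorts and invented group memberships) just averages each prototype's member sorts and standardizes the result, which captures the idea:

```python
import numpy as np

rng = np.random.default_rng(1)
# Invented sorts: 16 behavior strips (rows) x 10 participants (columns).
sorts = rng.integers(-3, 4, size=(16, 10)).astype(float)

# Invented factor membership: which participants define each prototype.
groups = {1: [0, 1, 2], 2: [3, 4], 3: [5, 6, 7], 4: [8, 9]}

# Each prototype's model sort is its members' mean sort, standardized
# to Z-scores so rows (one strip across prototypes) and columns (one
# prototype across all strips) can be read on the same scale.
z_scores = {}
for factor, members in groups.items():
    mean_sort = sorts[:, members].mean(axis=1)
    z_scores[factor] = (mean_sort - mean_sort.mean()) / mean_sort.std()
```

Because every column is standardized, a strongly positive Z-score always means "more like this prototype" and a strongly negative one "less like it," regardless of how many participants define the factor.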

Given that the scores did not follow a trend establishing any of the groups as definitively transformational or transactional, it is probably safe to say that the participants were grouped according to some other criterion. What we can say is that the participants were grouped with others who sorted a set of behaviors similarly on that day, at that time.

There are two facets of the Q-sort and analysis that I would change if I were to conduct a similar study in the future. To begin, it would be more structurally sound to use many more behaviors in the sort; the added information may yield different results in the analysis. Additionally, I would not group the Florida Principal Leadership Behaviors into the two leadership styles, transformational and transactional. Doing so created an unstructured sort, one based on items that had not been used in this manner before, and no research has linked these particular items with the labels transformational and transactional.

This practice Q-sort and analysis is narrow in scope, and judgments concerning the principals' sorts are not generalizable. The purpose of this study was simply to find out whether the principals could be placed into groups, or factors, that seemed to make sense. Knowing the backgrounds of the participants gave me a different lens through which to look at the analysis, one that many researchers may not have when conducting a Q-sort; it allowed me to understand why I think the participants grouped the way they did.

Citations

Brown, S. R. (n.d.). The history and principles of Q methodology in psychology and the social sciences. Retrieved November 2, 2005, from http://facstaff.uww.edu/cottlec/QArchive/B.

Brown, S. R. (1980). Political subjectivity: Applications of Q methodology in political science. New Haven, CT: Yale University Press.

Daniel, L. G. (1990). Operationalization of a frame of reference for studying organizational culture in middle schools (Doctoral dissertation, University of New Orleans, 1989). Dissertation Abstracts International, 50, 2320A. (UMI No. 9002883)

Hair, J., Tatham, R., Anderson, R., & Black, W. (1998). Multivariate data analysis (5th ed.). New York: Prentice Hall.

Kerlinger, F. (1973). Foundations of behavioral research (2nd ed.). New York: Holt, Rinehart and Winston.

Nesselroade, J., & Cattell, R. (Eds.). (1988). Handbook of multivariate experimental psychology (2nd ed.). New York: Plenum Press.

Porcerelli, J. H., Cogan, R., & Hibbard, S. (2004). Personality characteristics of partner violent men: A Q-sort approach. Journal of Personality Disorders, 18(2), 151-162.

Stephenson, W. (1953). The study of behavior: Q-technique and its methodology. Chicago: The University of Chicago Press.

Woosley, S. A., Hyman, R. E., & Graunke, S. S. (2004). Q sort and student affairs: A viable partnership? Journal of College Student Development, 45(2), 231-242.

Always Asking Questions & Always Learning

*

Working on the University of Florida’s College of Education Online M.Ed. in Educational Leadership has provided me a golden opportunity to learn more about Florida’s educational leaders. The last few years of my career have led me into very divergent, but exceptional, learning opportunities. From leading the development of curriculum for online courses to setting up methods for large-scale registration and submissions for district-based inquiry, I have not been able to rest much on what I have learned previously in my career. I am constantly in challenging (but insanely exciting) situations.

With the Online M.Ed., I have been given the chance to seek out and interview principals at all levels of career experience to be included in the courses. I believe this "real-world" perspective from leaders in widely varying school contexts gives the students an extraordinary advantage. It has provided me something extraordinary as well: next to finishing my dissertation and teaching my elementary and high school students, learning from these wonderful leaders has been the best part of my career in education.

Image

The leaders pictured include (L-R): Hudson Thomas of Pompano Beach High School (Broward County), Roxana Herrera of Palm Springs Elementary School in Hialeah (Miami-Dade County), Dr. Joseph Joyner, Superintendent of St. Johns County Public Schools, Lynette Shott of Flagler-Palm Coast High School (Flagler County), Scott Schneider of Terry Parker High School (Duval County), and Lawson Brown of Charles Duval Elementary School (Alachua County). These are only a few of the leaders we have interviewed.

The cover stars of the flier below are two exceptional leaders: Christy Gabbard and Stella Arduser of P. K. Yonge Developmental Research School at the University of Florida. They are also featured on our website now (https://education.ufl.edu/edleadership-med/).

Image