The Rules of Adjunctification – or What I learned I had in common with Oliver Twist

This post has been a long time in the making.  A very long time – some 15 years, to be precise.  I started “adjuncting” while still in graduate school and have worked steadily at 14 different universities in Ontario and British Columbia since then.  Heck, I’ve even worked in the UK.   My friends and colleagues joke that I am the quintessential “itinerant professor,” and they are right – but I have had to become one in order to survive.  So when I say that my experience is wide-ranging, and that I can make some general observations based on what I have seen while running between institutions to teach everything from Western Civilization classes of 425 students to fourth-year seminars of 10, believe me, I am not exaggerating.


During this past week, I had a long-overdue epiphany about the nature of precarious work in academia and the (largely unspoken) rules of Adjunctification that apply to those of us who man the trenches of undergraduate education.  I have always been the optimist, arguing that by piecing together courses at a number of universities in any one academic year, I could keep doing what I love while waiting for the chance to stay still: waiting for that elusive permanent, full-time job.  This week, however, I came to realize that if that job is ever posted, even in a department where I already work, it will never, ever be for me.


So for the last few days I have, in some sense, been working through a particularly academic version of the stages of grief that accompany a personal loss – in my case, a personal revelation (how appropriate for Easter weekend, yes?).  I am not sure how many stages there should be when one mourns the loss of a career one has given a good deal of time, effort and money to prepare for, but I am sure there are at least a couple.  During the past week, I came to realize that these stages correspond to, or at least result in part from, what I am calling the Rules of Adjunctification. For the record, I have never really been in denial about the realities of teaching as an adjunct or, as we call them in Canada, a sessional lecturer.  You are contracted course by course, term by term.  There is never any promise of future work (Collective Agreements and contracts tell you this in blunt legalese), and your place on the seniority list is no guarantee of future assignments (in fact, many institutions put a “shelf life” of a year or two on your seniority if you don’t continue to teach at that institution).  I have always cast my net widely to make up for such problems.  No courses at one university? No problem. Work for another.  I bargained with myself: at least I was doing something I love, and while I was not being paid anywhere near what my full-time colleagues were getting, at least I was making contacts at all these places – and that was a good thing, right?


Well, I have accepted that it might have been a good thing to be employed in a field that I love – and, for the record, one that I am very, VERY good at – but it is not a good thing to allow myself to be treated in a manner that, at its core, reduces me to a second-class academic citizen.  I am not as angry as perhaps others will say I should be, but neither am I content to sit back and say nothing.


Before I continue, it must be noted that, at least in Canada, the number of part-time sessional lecturers in universities has grown to mind-boggling proportions.  While some universities try to hold the institution-wide share of adjuncts/sessionals to an AVERAGE of 30%, at others it is clear that over half the faculty is part-time.  The process of “adjunctification” is more a tidal wave than a slow creep, and it is in need of more serious consideration throughout Canada and the United States.


So, based on many years of hands-on research, here are my Rules of Adjunctification.  They have never been officially written down and likely never will be by any institution – unless you count the “we are not obligated to hire you again” language you find in Collective Agreements and on the very contracts we are asked to sign, which outline the terms of our labour.  Many of these rules are unspoken, but they are there – and we in academia live our lives by them.  They are the Big Pink Elephant in the department.  In essence, they make every adjunct or sessional lecturer an “Oliver Twist”: we are given what the institution thinks we deserve, we are obliged to take it with a smile and be thankful (regardless of the rotten conditions or amount of grueling work), and we must say “please, Chair, can I have some more work?” in the meekest, mildest way when we would rather demand our rights.


I must also state here that all of my teaching experience has been in unionized environments, so my experiences may differ in parts from those of others who have not had that opportunity.  Sadly, being in those environments has not really done me much good.  In fact, that was the main reason I became involved (two years ago now) in union work advocating on behalf of adjunct/sessional staff.  I note this only in the spirit of full disclosure, not to suggest that what is presented below is in any way a “union call to action.”  Trust me, some of the unions I have belonged to in the past were quick to throw part-time staff “under the bus” in labour negotiations, especially if something could be gained for full-time faculty.


My list is not exhaustive.  You will likely have your own rules to add, and I would be pleased to hear from readers about the ones I have missed.


  1. You are a warm body, contracted only for a short time. We owe you no loyalty at all. – This is the one rule that governs all others.  You matter to your department only so long as your contract is valid.  While they will reiterate that they owe you nothing past your contract’s end date, remember that this works both ways.  You are not beholden to them, either – and there may be some solace in that.


  2. Related to the above rule is this one: We will hire you when we please – whether that is 3 months or 3 days before the start of term. And we will expect the same preparation and performance from you, regardless of the timeline you are given. We will not be entirely pleased when you turn us down citing “other work” after we hire you at the last minute. We have a long memory: we will remember your refusal, but not the good you have done and will do for the department.


  3. We expect you to act and to produce/teach in the same manner as our full time faculty. You will be given minimal or no help in doing so. – Whether you are hired well in advance of term (something that happens rarely in academia) or with merely days to go before the start of classes, the department in question will likely expect you to spend your own non-contracted (and thus, non-paid) time putting the course together.  You will not, as your full time colleagues do, get time worked into your terms of employment for curriculum development.  They will expect you to work before your contract starts (developing the course or coming to “orientations”) and, if there are any deferred exams or papers, they will expect you to work after your contract ends – often without payment.  I have worked in departments where this is not the case, and I am grateful to those few institutions that realize my time is valuable. But they were few in number; more prevalent is the idea that “as you work for us as faculty, we’ll expect exactly the same of you as our full timers.” The idea is, in itself, an interesting one.  The reality is far different.  
If I am to be treated as your full time faculty, I need to see that reflected in the terms of my employment: give me a paycheck that reflects my training, experience and expertise (which is often equal to, and at times greater than, that of my full time colleagues); give me paid time to prep my courses; provide me with some form of office space so that I can see students and discuss their concerns in a professional way; give me library access ahead of my course so that I can do that prep and see what the university has in its holdings in order to prepare thoughtful and workable assignments for my students; pay me enough so that you have my full and undivided attention; or, if you can’t compensate me adequately, please realize that you will not be the only institution I will be working for, and do not get upset when I cannot do what you want as quickly or as efficiently as you would like.


  4. You will carry the same student numbers per class as regular faculty. You will have the same duties and obligations.  We will pay you a fraction of regular faculty salary without access to benefits and/or professional development funds.  You may complain, but we reserve the right to argue that “you knew what you were getting into” in response. – I think this one needs no explanation.  I have experienced it a number of times.  My favourite was at a university in Ontario where, over the course of the first week of term, my teaching load grew from 1.5 courses to 3.5 and was essentially one tutorial group short of a full time load.  (A tutorial group would equal an hour’s worth of in-class time, plus marking for around 20 students.) When I asked the chair to consider giving me one more group and thus make me a full-time contract employee (if only for one term), I was told that it was not possible (even though I knew there was one more tutorial to be assigned – and that they were desperately looking for someone to teach it).  He added that if he had known I would be making such a “ridiculous request,” he wouldn’t have assigned me the extra work in the first place.


  5. We will praise you only when you do something worthy of being put on the department’s web site and/or in a departmental report to administration. But we will do little to help you get the funding and/or opportunities you need to be able to do something worthy of our attention. So don’t hold your breath. – Given the culture in academia, it is not surprising there is an increasing emphasis on things deemed “more important” than teaching: giving interviews in the media on “trending topics,” winning a fellowship or a book prize, publishing a monograph with a “significant national” publisher.  If you are a part timer, chances are you are not eligible for funding from national granting organizations because you are not full time.  And I have worked at only a handful of institutions that gave anything in the way of professional development or research grants to part timers. Many of those institutional grants were welcome, but they were a fraction of what was given to full time staff and came with restrictions on dates and usage (want to go to a conference that happens AFTER term ends? No way. And many conferences happen after term ends.) Moreover, part timers who win teaching awards (usually given out by the institution or by the student associations) will likely never have that recognized by their department, no matter how loudly the same department would crow about that same award had it been given to full time faculty.  Adjuncts will never be trotted out as the “poster child” for a department’s success, for its ability to connect with the community – despite the fact that many adjuncts are doing great things with little, if any, resources. I have come to believe that departments don’t celebrate their part time faculty’s achievements because doing so would likely raise the question they do not want to hear: “So why don’t you give them a full time job?”


  6. If you apply for a full time job in the department we will consider your application, but not with any view to hiring you. We may decide to interview you out of politeness, but we will never consider you a serious candidate. – Rest assured that no matter how high your student teaching evaluations are, your application will be judged by those very things your employer department and institution have come increasingly to value: publication, research and the dissemination of research. I once had an interesting discussion with the chair of one of the departments I worked for on this very issue.  It was, in many ways, a wonderful department to work for, but the chair admitted to me that while they valued my teaching, it would be my “scholarship” that was the deciding factor in any full time job search in that department.  When I pointed out that the department did not provide any support for me to do that work, and that the pay at their institution obliged me to find additional work and thus limited the time and funds available for research/writing, he had the decency to admit that it was a failing on the department and institution’s part.  He did add, however, that they couldn’t just hire excellent teachers as that “wasn’t industry standard.” I have heard from colleagues who were given interviews for full time jobs in departments in which they had worked as part timers, and their experience was eerily similar to mine in the same circumstances: a lecture/job talk so sparsely attended by faculty as to suggest an obvious lack of interest, no “free meal” (unlike out-of-towners, who were given the full treatment of expensive lunch and dinner) and a discussion with the Academic Dean which focused on “what the university needs in the future that wouldn’t really be served by your being hired” rather than on the individual’s qualifications and what they could bring to the university/department.


  7. We will publicly, and even in conversation with you, express the conviction that the plight of adjuncts/sessionals is an important issue that the Administration needs to address. We will, however, take no action on our own to address any of the issues you raise.  After all, we’re not the administration and it is their responsibility, not ours, to sort this out. – To some degree, this is true.  Departments do not make policy; that is the institution’s purview.  But that should not give them a free pass to carry on policies that put an increasingly significant part of their faculty at a decided disadvantage.  This approach always reminds me of a particular scene in the fabulous classic movie Mrs. Miniver.  A conversation about how to bring about change in the world turns to the question of action vs. words, and one character suggests that talking alone won’t do anything – “a little bit of action is required now and then.”  I think departments should take that to heart – because it is very likely that if they and the institutions they are part of don’t take action, their part time lecturing staff will.


5 Things Humans can learn from Zombies

Because it has been one of those days/weeks. . . . and because, underneath it all (or underneath what they have left of it all), zombies are people too.


Five Things Humans Can Learn from Zombies

There are many people in this world who think that zombies are a menace, pure and simple. These doom-and-gloom types are quick to point out the destructive and murderous tendencies of the undead, and this leads them to suggest that we humans must meet murderous force with murderous force. They claim that only such an approach will ensure the survival of the human race.

This advice may, at first glance, seem sound. I know that if surrounded by a horde of the undead, I would likely not try to engage them in a debate about Shakespeare or politics – in all probability, my survival instincts would take over and I would run as far and as fast as I could from said horde. But just because I do not want to be a zombie, does it mean that I cannot appreciate zombies and the contribution they could make to our world if we allowed them to do so? I will suggest that, somewhat ironically, those who think the only suitable responses to the undead are violence and mayhem are ignoring the example the undead can set for humankind, as well as some pretty darned important life lessons. Before the nay-sayers among you conclude I have lost my mind, let me explain what I mean.

As a human individual’s social and philosophical worth is relative, so too is that of zombie-kind. One argument would suggest that all we can learn from zombies is that there is a living hell. I will concede that being a zombie is not all fun and games – unless you consider drooling, decomposing and eating brains and viscera fun. (And some of you may, I cannot tell. So I will leave that point for a later debate.) So it can be conceded that, given the present evidence, being a zombie is both physically and socially limiting. People shun you, run from you and, without any thought as to how you may feel about it, try to bash in your head. Society considers you to be no better than vermin. Forced into such a situation by chance (as very few people become zombies by choice), it is not hard to see why some people suggest that zombies should be hunted down and disposed of as quickly as possible.

But in a world where we often give lip-service to equality and inclusion, society’s aggressive stance towards zombie-kind can appear a bit ungracious, at best. Instead of limiting ourselves to merely thinking of ever better ways to kill a zombie, we non-zombies should be trying to see the potential in our relationship with the undead to enrich our existence and even, perhaps, that of the undead.

So, in order to foster a more open and zombie-positive dialogue on this very issue, I have provided a list of what I feel are the five most important things which humans can learn from our undead cousins. Of course, the dialogue may take place only among us non-zombie types, as the undead’s ability to reason, speak and argue is severely restricted, limited as it is to grunting, groaning and wailing. (For some people in some regions of the world, particularly those directly involved in politics, this situation may not seem too different from what they already experience.) But I would urge us non-zombies not to let that stop us from at least trying. Despite what we may think, zombies have much to add to the consideration of this question, and I am sure that, if they could express their desires in a clearer, more lucid way, they would want to participate. Of course, until such time as the art of zombie grunt-and-moan translation develops a bit more fully, we must take on a greater share of the burden if zombie-kind is to be better appreciated by humanity.

Please remember that the points listed below are meant to be jumping-off points for serious discussion among both those who deal directly with the undead and those who wish to contemplate the more philosophical and esoteric aspects of human and zombie existence. Remember, there is a rather symbiotic link between humans and zombies; they could not exist without us. Their origins are human and we must respect that. And for those who still cannot see the benefits of this approach, think of it this way: if there ever is an all-out war between humanity and the undead, wouldn’t it be nice if we humans could come to the table with more than just guns, bazookas, mortar shells and other deadly artillery? Nothing says “We really want to work this out” like trying to understand the other guy, even if he is a walking corpse bent on eating the contents of your cranium.

Five things Humans can learn from zombies (in no particular order):

1. It is a small world, after all.
It is always important to remember that, whether you want to admit it or not, the zombie trying to eat your brains was once just like you – human. That zombie worked for a living, wanted better things for his or her children and likely, had your places been reversed, would not be as quick to run from you. (Well, maybe not as quick.) In fact, while you are planning your escape route from the approaching zombie horde, take a minute to consider whether or not there might be one or two of your old friends, relatives, neighbours or drinking buddies in that horde. Make your mother proud – even if she is one of the horde – and empathize with the undead. Try to see the world from their side. Remember: we are all essentially the same under our human skin – even if that skin is fetid, rotting, falling off and gangrenous.

2. Acceptance will set you free!
As you are able to read this, I will assume that you are not yet a full-fledged zombie – that you have either managed to remove yourself from an infected area, have so far been successful in escaping the horde, or have been bitten but have not yet transformed fully. (Indeed, if the last scenario applies to you, I applaud your efforts to use your last moments as a more rational being to enlighten your social and cultural zombie awareness!) But, should the unthinkable happen, should you find yourself “converted,” so to speak, I advise you not to fight the transition. Accept your new lot. Try to adjust your mindset to a more positive way of thinking about what will come. Yes, you will now crave brains, but you cannot avoid it – so embrace it! See it not as the end of your human life, but the beginning of your zombie life! Nothing will get you through the drudgery of decomposition and brain obsession like being one with the program!

It might help to remember that the zombie world is a simpler and much less stressful one. Their needs and their approach to filling those needs are simple: find brains, eat brains, repeat. Some of you might belittle such linear thinking as unimaginative; you would be wrong. Is it really better to spend countless hours obsessed with ideas, worries, conjectures and fears that serve only to complicate rather than simplify your life? Do like the zombies do – zone out, simplify and let your life flow as it should. Think of it this way: have you ever seen a zombie that is as stressed out as you used to be at work? Perhaps there is something to be said for your new altered existence. Accept and relax!

3. Being able to adapt to strange and unique situations really isn’t so hard if you put your mind (or what is left of it) to work!

We humans underestimate the determination and drive of the undead. Those who tell you to be afraid of the zombie ability to keep going no matter what (and there will be those who refuse to be enlightened) are missing the point. Zombies are so determined that they let nothing stand in their way. They are, simply put, nature’s most goal-oriented creatures! To acquire the brains they so crave, they will adapt to whatever obstacles are placed in their path – both physical and metaphorical. A zombie never says die! (Ironic, isn’t it?) While we may not be able to appreciate their dedication and zeal as fully as we would like (as prey oftentimes has difficulty feeling for the thing hunting it), we must admit that we humans are a lazy lot and many of us take the easiest route to acquiring what we want. We may aspire to do better, but often we do not. Zombies, then, can be a true role model when it comes to focus, dedication and the ability to adapt to whatever obstacles we face. Be as determined as a zombie, and whatever you want can be yours – even if what you want is brains.


4. Community is more important than the self.
We speak of the “zombie horde” as if it were a bad thing. Of course, if you are facing one, you are unlikely to feel the experience is positive, but I am talking in terms of philosophical debate here, not in terms of the prey and the hunter. Zombies are pack animals and their very survival depends on their integration with the pack. Thus they live in a symbiotic relationship with their community: they work with it and respect its needs, desires and goals.

Present-day human society places a high premium on individualism, even as its media culture tries to convince us all to think the same way and to buy the same brand names. Yet, deep down, we non-zombies are pack animals as well. We try hard to hide the fact that we need to be connected to other humans. I do not speak here of the “connections” that have come with the advent of social media (as “liking” someone on a web site or carrying on a conversation purely by text is not, to my mind, truly a type of human connectivity) but of person-to-person contact, the sort that can only be had by going out, socializing and actually speaking, face to face, with other humans.

Ask yourself this question: if the time comes and you undergo the “transition” to zombie life, will you know how to interact with others? Will you know how to be a team player? If so, then you will survive to hunt brains and infect others! If not, you will be the weakest link and will slow down the horde – and may become one of its victims.

Caring about what happens to those who walk beside you, even if they are walking in a search for brains, makes the zombie’s existence just a bit more humane. Can we say that our modern society measures up?


5. Brains really are the most important thing in life!
This is the most important lesson that humans can draw from the example of their undead cousins. We in modern society marginalise those who celebrate and enrich their brains (unless they make a fortune in video games or computer software). Zombies, though, have long understood what present-day humankind appears to have forgotten – we are nothing without our brains! For zombies, as for we non-zombies, brains are nourishment. For them, they are a finger food; for those of us not yet transformed, they allow us to grow and feed our thoughts and intellect. Our brains are key to our survival, in both the present and in our potential zombie future. We who have yet to join the horde forget this at our peril. Ironic, isn’t it, that if we do become one of the undead, the importance of that mass of grey matter we have neglected and mocked for so long will, once more, become central to our lives? Perhaps it is the last piece of poetic justice – and perhaps it is the one aspect of zombie existence on which we can find common ground. Brains really are what counts!

I hope that these points will engender useful bi-partisan, human-zombie discussion and bring us towards a more common understanding of our humanity. Just remember: inside every civilized non-infected human is a zombie just waiting to get out and hunt for brains!


Students say the darndest things, Part II

Last week, I posted some of the more unusual and interesting interpretations that students in my classes have submitted in their essays and exams.  Last week’s post dealt with Antiquity to the Early Modern era — this week’s covers the Age of Revolutions to the Twentieth Century.

Remember, these are taken verbatim from student submissions.  Enjoy!



The French Revolution and its aftermath

At the beginning of the 18th century, France was a wealthy yet slightly disorganized country. The beginnings of the Revolution began a small problem for the monarchs. The Napoleonic period and the French Revolution happened in 1787.

For 10 years leading up [to] the 19th century, France had no future but had entered the modern age nevertheless. The events prior to the French Revolution had led up to the French Revolution and after this period a lot of stabilizing had occurred. Although the poor come across as guided dogs in many areas of history, the peasants of the French Revolution displayed the greatest capabilities of the unintelligent. By 1789, the masses were psychologically prepared to burst into the area of politics. Peasants now had a new-found confidence as the rioting was beginning to cause awareness all across France. Many peasants were in worse condition after the French Revolution, thousands were dead.

The Nineteenth Century

The Industrial Revolution sprung forth many changes to the world. The [industrial] revolution overturned the status quo and catapulted the growing masses of the middle class. The middle class emerged with the parliament.

The development of industry and technology in the 18th and 19th centuries allowed states to develop powers which would allow them to effect political situations and eventually lead into a new age of political thought as no longer were enough to allow for the dignified survival of people. Up until the Crimean War, technology was not taken very seriously, but it was during this period that the steam ship and shell fire was recognized. This clearly sparked the rate at which technology was integrated into war.

Hence, difference being the main element that held nationalism together, can only lead to or inevitably become a form of racism if only because all nationalistic groups consist of prerequisites that are not the same.

Evolution had been floating around but Darwin was the first to explain how it occurred. The theories Charles Darwin introduced were extremely stunning.

Throughout history naval warfare has come in waves, but during the late 19th and early 20th century naval power would have many shifts between the British and Germany. In building such a large navy [in the late nineteenth century], Germany becomes a threat to England, for England is seen as invisible on the sea. German forces would come together in the late 19th and early 20th [centuries] to battle with the British for superiority of the water.

The Twentieth Century

Europeans showed enthusiasm for World War I because they didn’t know any better. War was no longer between men. It was between weapons of destruction and machines. Due to the fact that Germany forcibly accepted full responsibility for causing the war, they had to pay repercussion to the exhausted but vicious ally forces. The total amount was outrageously out of the Germans price range.

Left at home while men went off to fight, women became independent and self-sufficient. This, coupled with the fact that the suffragist movement had finally secured the vote, encouraged the birth of the flapper. Women and men became more pernicious and were in theory more sexually liberated. Cross dressing was in fashion and homosexuality more accepted. Indeed, culture was more receptive to new ideas and alternative to democracy which they understood to have failed. This mentality set the stage for both the emergence of Nazism and Fascism.

Nazism is a religion, well kind of like a religion. The Triumph of the Will also showed how the Nazis loved a parade and to be involved in a good time. The energy and sudden neurosis that emitted from the civilian and military crowd when they saw Hitler was a better proof on how much the people of Germany had attached themselves to Hitler and the Nazi party. World War II was a massive, high-scale killing machine and Hitler was driving it.

England associated Churchill with a lower standard of living and conflict. Thus [after World War II], many people looked to the Labour party for salvation as it was thought that although Germany was the root of all evil, it was run well economically. Thus, the labour party headed by Truman came to power as a majority.

Stalin was well aware that the US had a nuclear bomb, and, thanks to their previous tract record, was confident that they were willing to use it.


Next week: The 5 Things Humans Can Learn from Zombies. Seriously.

Long time no blog….

Like many, I have made New Year’s resolutions.  Like many, I will undoubtedly not keep all of them . . . but the “post more on your blog” resolution I will try to keep.  Here goes.

I thought we could start the New Year by celebrating how students of History can, sometimes, put a unique “spin” on their perception of the past.  What I post below I cannot take full credit for — these wonderful and sometimes amusing ideas were taken from essays, exams and other submissions students have made in my various courses.  It is a list I have been compiling for over a decade. Some of the observations are quite profound in their own way (I mean, Charlemagne really was the “scaffolding of the Middle Ages” if you think about it) while some . . . well, head shaking may be experienced.  (I know I still wonder, at times, what I must have said in lectures to spawn some of this.)

I give you a sampling of the pre-modern observations . . . from the Romans to the dawn of the 18th century.  As written by students. All text is reproduced as it was found in essays and exams.

They are divided by period or “historical development,” where possible.


Ancient Rome

After naming himself dictator, Caesar had a long road ahead of him.

The sacking of Rome put a lot of stress on the relationship between the Romans and the Goths.

The Middle Ages

The Empire of Charlemagne was the scaffolding of the Middle Ages. It was easier for the public to believe that Charlemagne had the right to the throne because he was a very tall leader.

William’s conquest completely transformed the fabric of English fabric forever. The conquest streamlined the common people. Slaves were elevated in class while surfs lost ground in the social ladder. More and more free-holders were gobbled up by the manor lords.

The eleventh century was also the period in which Christians fighting under the Pope was common.  The First Crusade set a lot of standards for the Crusades that were used by subsequent Crusades. All in all, the powerful impact and success of the First Crusade was so tremendous that the following crusades were not able to follow in the same footsteps. The crusades ignited many knights to take up the cause of the Lord and be as holy men. Many religious orders were made by this religiously pumped men.

One of Innocent III’s most popular personality traits was his ability to constantly interfere.

The Black Death ravished around Europe.

The Renaissance

Independent thinkers [in the Renaissance] were not considered as crazy as they were in previous eras.

Early Modern Europe

In the beginning, or rather 1500, a single monarch was placed in control of a country. Most monarchies are hereditary and not publicly appointed. It is believed that there was great absorption of knowledge in the 15th century.

The discovery of the New World was essential in the history of European expansion and in the development of the world. Christopher Columbus was a real trooper, dealing with the conditions that he and his men faced on his journey, with the sea sickness, the lack of variety of food and heating. When Hernan Cortes showed up [in South America], the Aztecs at the time were waiting for one of their Gods to show up, so they let him in. Further North of Cortes was New France (“Canada”) which showed a different type of colonial warfare as it was taken over by the British after the Plaines of Aberham [sic].

The explanation that has been most often used and held account to the decline of Spain is a series of faults made by the Spanish.

The Reformation

The Protestant Reformation took off by a man by the name of Martin Luther King. Erasmus gave Luther the umph he needed to get things going. Martin Luther’s initial rejection of the ways of the Catholic Church influenced him to make suggestions by nailing his problems to a Catholic Church door. [John Calvin] . . . was educated more than Luther and believed in “Pre-Destiny” which means everything that happens in already decided by God, in essence don’t worry about it. Calvinism as the Protestant movement became to be called spread a lot farther than Lutheranism because Calvin was a lot neater and more organized than Luther so Protestantism was easier understood

The Catholic Reformation grew out of the Protestant Reformers.

The Reformation was important in creating nationalist conditions. Religion was no longer standard.



Early Modern England

Henry [VIII]’s marriage was on a downward path from its conception.

Perhaps if Henry was raised during the time Elizabeth was raised he may have been a Protestant. Protestantism wasn’t very widespread yet.

In 1685 James II succeeded. James campaigned for Catholics. He started off slow but he eventually had Catholics appointed in the Privy, and high offices in the state.

The Glorious Revolution happened in England during King James rule.  As we know England was an Anglican country and they did not appreciate the fact that their King was turning Christian.

James may be seen as insane. After all, with all the anti-catholic sentiments still running strong in the nation, who would in their right mind push for Catholicism? Using religion as a battle ground to establish was a big mistake committed by James. He gave parliament a reason to legitimately erode him.

Early Modern France

Charles VII was the ruler who kicked chivalry in the butt.

The Edict of Nantes gave political rights to the Huguenots that were not compatible with the road that France was going down.


Next post: from the 18th Century to the present!

Welcome to Professor History Geek’s blog!

That’s me, “profhistorygeek.”  It’s not just who I am, but it’s what I am as well.

I am an historian (I have a PhD from the University of Toronto, if that makes a difference) and I am rabid for all things history.  Right now, I teach as a sessional or adjunct (part-time) professor at a number of universities in and around Vancouver and the Fraser Valley. I spent the decade prior to that teaching at just about every university that you could possibly commute to from the Toronto area via Greyhound bus, VIA train and/or Air Canada flight. Like many who have run the gauntlet of graduate school and survived/graduated, I am still in search of the elusive tenure-track job.  But in the meantime, I keep busy with teaching, mentoring students, writing historical zombie fiction (yes, historical zombie fiction) and doing union work advocating for part-time professors’ rights.

I will be using this blog to post about history matters, teaching experiences and ideas (and frustrations), and about various other history and geek-like things that catch my eye. I will also, from time to time, be blogging about more general stuff: you know, the sort of day-to-day stuff that gets us interested, angry and ready to say something public.  So don’t be surprised if you see a blog post about politics, politics, or politics!  I may even trot out some of the zombie historical fiction!

I will be posting my first “real” post later this week (when I find time in between teaching, preparing for next semester and normal life chores).  But here are a few things you need to know about me before you start reading anything I might post.  Call them “trigger warnings” if you will:

  1. I love a good debate.  And by “good,” I mean one in which all ideas and opinions are welcome, so long as they are expressed in a respectful manner.  Being able to voice an opinion carries with it a reciprocal responsibility to listen thoughtfully and attentively to the other side of the issue.  Debate is lively and mutually respectful.  Trolling is not a form of debate.
  2. My main area of research is war, politics and society in France and England (and their colonies) prior to 1815, so there may be some posts that talk about these subjects.  This does not mean I am a war-monger, neo-Conservative or some sort of historical trog who just doesn’t get that “history has moved on” and worships the military complex.  I am none of those things. I am simply someone trying to understand the workings of power through social, political and military capital in an age where war was a constant.  I find it fascinating but also realize that others may not.  And if I can help to dispel the idea that all military history is simply tactical history, so much the better.  There is a socio-cultural history of war that badly needs to be written and I hope to help bring that about.
  3. I am not sure I believe that people of the 15th century (as an example) should be chastised for not holding to, or acting in accordance with, the values of the 21st-century people and historians who study and discuss them. I am not sure it is entirely sound historical methodology. But I am interested in hearing differing views on that . . . (see point #1).
  4. I love teaching and helping my students to learn and grow as people and scholars.  There will be a great deal said about my work with students here.  I will speak only in the general, however, to preserve their privacy. No students will be named or harmed by this blog.
  5. I may say some things — pointed, blunt and/or downright rude things — about the treatment of adjunct/sessional/part-time professors in higher education.  Be prepared.
  6. I am a vintage toy collector, so there may be some of that mentioned.
  7. I am a fan of the Oxford comma. Be forewarned.

Thanks for reading. More to come!