Approved by Faculty Senate December 2, 2002




Department ________Psychology______________________                 Date _____10/17/02__________  


        ______PSY308______        ___Experimental Psychology_________________________         ____3______

        Course No.             Course Name                                                             Credits


        This proposal is for a(n) ___X__ Undergraduate Course


        Applies to:         ___x__ Major                          ____x_ Minor

____x Required                                                 _____ Required

_____ Elective                                                 __x__ Elective


        University Studies (A course may be approved to satisfy only one set of outcomes.):

Course Requirements:

                    Basic Skills:                           Arts & Science Core:            Unity and Diversity:

                    _____ 1. College Reading and Writing           _____ 1. Humanities                   ___x_ 1. Critical Analysis

                    _____ 2. Oral Communication               _____ 2. Natural Science            _____ 2. Science and Social Policy

                    _____ 3. Mathematics                _____ 3. Social Science                  _____ 3. a. Global Perspectives

                    _____ 4. Physical Development & Wellness       _____ 4. Fine & Performing Arts           _____     b. Multicultural Perspectives

                                                                                    _____ 4. a. Contemporary Citizenship

                                                                                    _____     b. Democratic Institutions

        Flagged Courses:      _____ 1. Writing

                                    _____ 2. Oral Communication

                                    _____ 3. a. Mathematics/Statistics

                                    _____     b. Critical Analysis


Prerequisites             ___________ PSY 210, PSY 231, concurrent enrollment in PSY 309________________________________



Provide the following information (attach materials to this proposal):


        Please see “Directions for the Department” on previous page for material to be submitted.



Attach a University Studies Approval Form.


Department Contact Person for this Proposal:




________Carrie Fried_________                       _457-5483__             _______cfried@winona.edu__

Name (please print)                               Phone                    e-mail address






Course Description and Syllabus for P308  Experimental Psychology


Instructor: Dr. Carrie Fried

                Office: 231 F Phelps Hall, Phone: 457-5483, email: cfried@winona.edu

Office hours:   M & W 11-12 & 1-2, T & Th 10:30-11 & 2-3, F 11-1


Course goals and objectives: 

1) Learn the basics of experimental research in psychology

2) Learn to communicate research findings

3) Learn the logic behind well designed and conducted research

4) Learn the technical issues of designing and conducting research

5) Learn to critique research and spot flaws in research designs & conclusions

6) Learn to interpret statistics and research findings


This 3-credit class is designed to be accompanied by P309 Experimental Psychology Lab.  You should be enrolled in both P308 and P309.


Class/Lecture: Class will be spent primarily in lectures, going over material and answering questions.  If you miss lecture, you are responsible for getting lecture notes from a fellow classmate.  Keep in mind that material in this course builds on itself, so if you are confused by topics covered in lecture, be sure to ask questions. 

Homework and reading assignments will also be part of the “class” portion of the course.  The Assigned Readings (r-1 – r-8 on the course calendar) are research papers chosen because they highlight aspects of research we are focusing on.  There will be specific questions to answer regarding the readings (questions will be typed on the cover sheet).  Answers to these questions should be typed (they will be at least a page long) and brought to class.  Often we will discuss these in class.  These readings will be available at the library reserve desk at least a week prior to the due date.  If you want them further in advance, see me. 


Grading and Evaluation:


Class points will primarily come from exams and assignments designed to assess how well you understand the course material.  Points will be broken down as follows:


Exams (2)                               116 pts. (each)

Reading reports (8)                      48 pts. (total)

Homework                                 20 pts.

                                Total   300 pts.



This course is a University Studies Program Unity and Diversity: Critical Analysis course.

Such courses are required to meet the following outcomes:

A: Promote students' ability to evaluate the validity and reliability of information:

B: Promote students' ability to analyze modes of thought, works, arguments, explanations, or theories:

C: Promote students' ability to recognize possible inadequacies or biases in the evidence given to support arguments or conclusions:

D: Promote students' ability to advance and support claims:

These letters are used in the course calendar, course objectives, & throughout the syllabus to indicate places in the class where these outcomes are met.


Course Calendar (may change slightly as semester progresses)

  Week                       Readings (in text)                           Assignments Due

                                        (& Univ. Studies outcome)

      1: Sept. 3-6    1-11, Ch. 2

      Class: Intro to class, research           (a,b)


      2: Sept. 9-13      80-90, 174-185, 222-228, 233-238     r-1 Friday

      Class:  Intro to experimental methods                      (a,b,c,d)


      3: Sept. 16-20           74-80, Ch. 5, 238-242   Append. E      r-2 Wed.

      Class: Intro Experimental, Data &  Stats              (c,d)


      4:  Sept. 23-27    Append. B               r-3 Wed.

      Class: Stats & Data  APA /Format                         (b,c,d)


      5:  Sept. 30-Oct. 4                        Ch. 11                     

      Class: between vs. within subject designs                         (a,c,d)


      6: Oct. 7-11 (No Class Friday)  Ch. 11                     

      Class: Within Subject designs         (a,c)


      7:  Oct. 14-18      64-68                        r-4 Monday

      Class: Ethics, MIDTERM                  EXAM  Friday


      8:  Oct. 21-25       Ch. 12                      EXAM  Monday

      Class:  MIDTERM, Factorials                     (a,c,d)


      9:  Oct. 28-Nov. 1                       Ch. 12

      Class:  Factorial designs       (a,b,c)


      10: Nov. 4-8     pp. 194-196, 202-207

      Class: Demand Characteristic/Exp. Effects                       (a,b,c)


      11:  Nov. 11-15 (No Class Monday)  186-194  Ch. 9        r-5 Wed.

      Class: Validity issues                            (a,c)


      12:  Nov. 18-22      Ch. 13, Ch. 7           r-6 Wed.

      Class: quasi-experimental & correlational designs                    (a,b,c)


      13: Nov. 25-29 (No Class W or F)  Ch. 6 & 7                  r-7 Monday

      Class:  non-experimental methods                       (b,c)


      14: Dec. 2-6             Ch. 6 & 7                 r-8 Wed.

      Class:  non-experimental methods


      15: Dec. 9-13                                  

      Class: Ethics #2, Review


      Final Exam: Wed., Dec. 18, 8 a.m.



Psychology 308, Experimental Psychology, 3 cr.

Unity & Diversity: Critical Analysis course

Proposal and Rationale


INTRODUCTORY NOTE:  This course was previously approved as a Unity & Diversity: Critical Analysis course.  The department is reducing the credits associated with this course from 5 to 3 (and moving the lab component into a separate course), and thus is required to put P308 through as a new course and resubmit it to the University Studies sub-committee to be re-approved as a Unity & Diversity: Critical Analysis course.


Catalog Description: Introduction to the scientific methods and research techniques in psychology.


General Course Information:

Psychology 308 (Experimental Psychology) is being proposed as a Unity and Diversity Critical Analysis Course within the University Studies Program.  General Psychology and Statistics are prerequisites for the course.  The intent of P308 is to teach students the different aspects of conducting research in psychology.  These aspects include: 1) understanding the logic behind experimental research, 2) learning the basic methodological issues of different types of research, 3) learning to critique research and spot flaws in research design, 4) reviewing and applying skills in data and statistical analysis, and 5) learning how to write up an empirical research paper.  Class time typically consists of lectures supplemented by discussions, case studies, and examples that help students apply course material. 


Core topics covered in the course include: 

1) Scientific Thinking 

2) Theory and Hypothesis Testing

3) Experimental Methodology

4) Issues of Control and Confounding Variables in Experimental Methodology

5) Types of Validity (e.g., Internal, External, Construct) in Experimental Research

6) Measurement Issues such as Validity and Reliability 

7) Within Subject, Between Subject, and Mixed Experimental Designs  

8) Factorial Designs

9) Statistical Analysis (Descriptive & Inferential Statistics) 

10) Writing in APA Format

11) Quasi Experimental Designs

12) Introduction to Non-Experimental Methodology (e.g., Surveys, Observations, Quasi-Experiments)

13) Non-Experimental Methodology Issues (e.g., Survey Sampling, Sample Bias, and Sample Error)

14) Drawing Appropriate Conclusions from Different Forms of Research

15) Psychology Research in Applied Settings. 


Specific Outcomes for USP Unity and Diversity Critical Analysis courses:


A: Promote students' ability to evaluate the validity and reliability of information:

One of the primary goals of this course is teaching students how to evaluate whether the information provided in psychological research is reliable and the conclusions drawn from the research are valid.  This outcome is addressed at several different levels and from many different angles.  For example, at a micro-level, these issues are addressed by teaching students to evaluate the validity and reliability of measures.  In measurement, these terms have specific definitions involving what the measure is actually measuring (validity) and how well – or consistently and objectively – it is measuring it (reliability).  This area has a long history in the field of psychology, where the validity of important measures (e.g., intelligence) is often hotly debated and other measures (e.g., projective personality tests) are notoriously unreliable.  Students are exposed to these ideas through lecture and discussions of the validity and reliability of actual measures. 

At a more macro-level, replication in research (as a way to evaluate reliability of findings) is a central feature in basic scientific thinking and theory building.  Students are introduced to these ideas through lectures on the “scientific method”.  They also explore the issue by discussing examples of where replications (or failures to replicate) have altered theories or psychological principles.

Ideas of validity permeate this course.  A central theme in psychological research is dealing with the sometimes conflicting concepts of internal and external validity.  Briefly, internal validity has to do with whether the independent variable in an experiment (cause) is solely responsible for differences in the dependent measure (effect).  For example, does exposing an experimental group to violent video games cause those participants to behave more aggressively than a comparison group not exposed to the games?  Or can the differences between the groups be attributed to other factors (e.g., individual differences, effects of simply playing any video game)?  Students learn the fundamentals of internal validity by learning the basics of experimental research, studying previously conducted experiments, and designing their own experiments.  Understanding internal validity is key to understanding experimental research and research design issues.  Students learn to evaluate internal validity by looking for the factors that violate the underlying assumptions of an experimental design.  This is often done through analyzing and critiquing designs as explained in Outcome C on the following page. 

External validity, on the other hand, refers to whether research findings extend to other populations (aside from those in the study) or are applicable outside the laboratory.  External validity is an important issue in psychological research because research findings often have real-world applications, such as in the areas of mental health or organizational psychology.  Conceptually and logically, internal and external validity are independent issues; practically, they often conflict in that most types of research stress external validity at the expense of internal validity, or the other way around.  The course stresses that students understand the differences between types of validity and know how to evaluate the different forms of validity in experiments.  Students often learn about external validity, and the potential conflict between the two forms of validity, by examining the process of doing research in applied settings and/or applying laboratory research to solving real-world problems. 



B: Promote students' ability to analyze modes of thought, expressive works, arguments, explanations, or theories:

Because teaching students how to conduct psychological research inherently involves getting students to analyze research at many levels, this objective is addressed in many ways throughout the course.  The following are examples of the activities that address this objective.  1) In examining where research ideas come from, students learn that sometimes a psychological phenomenon is explained differently by various theories.  Students examine how competing theories may result in competing hypotheses for identical situations, and how a well-designed study may be able to distinguish among two or more theories.  2) Students read previously published research studies and report on the theories, methods, and results.  3) As is extensively documented in this proposal, students learn why and how specific aspects of research designs can rule out types of alternative explanations.  For example, students learn how certain control conditions must be used to adequately test certain hypotheses.  4) Students are exposed to various forms of research and examine how those forms of research are used to test different theories, provide different levels of explanation, and are used in different contexts.  For example, the level of explanation provided by a case study or survey will be very different from the level of explanation provided by a tightly controlled laboratory experiment, even if both are designed to study the same underlying phenomenon.  5) Finally, students are exposed to issues of research methodology through several different modes of thinking.  These include mathematical thinking about statistical analysis of data, logical thinking about research design, and conceptual thinking about theories and hypotheses.


C: Promote students' ability to recognize possible inadequacies or biases in the evidence given to support arguments or conclusions:

The primary goal of this course is to teach students how to do research correctly so that they can draw appropriate conclusions and support their claims.  Central to learning this lesson is learning how and why research done incorrectly may lead to faulty conclusions.  This is often done through critiquing research designs (often weak or flawed designs).  The primary goal in these critiques is to teach students both to recognize flaws and weaknesses in methodology and to understand how and why these flaws might call into question the results and conclusions.  Several potential areas of weakness are stressed.  For example, when conducting experiments, the lack of true random assignment of participants to conditions or the lack of adequate control of possible confounding variables (unwanted systematic differences between conditions) makes it impossible to draw causal inferences.  Students are taught how to spot these flaws but also to understand why they call into question cause-effect conclusions.  Other potential flaws include incorrect balancing of the order in which trials are presented in within-subject designs (experiments where subjects participate in more than one condition), incorrect controls of demand characteristics and experimenter effects (e.g., placebo effects or expectancy biases), using inappropriate control or comparison groups, or using invalid or unreliable measures of outcomes.  Again, the goal of these critiques is to teach students both to recognize flaws and to understand the ways in which the flaws may undermine the conclusions.  These activities also reinforce students' understanding of the methodological basics involved in psychological research.  This outcome is typically achieved through discussing published research and assigning homework that requires students to identify flaws in research designs.

Skills learned through these activities extend beyond the course-emphasized academic research to help students critically evaluate questionable research findings they are likely to encounter in their lives.  This includes so-called “pop-psychology” or “pop-science” that is often reported in the media or becomes the center of fads or social movements.  Examples sometimes discussed in class include claims of “healing touch” and other forms of alternative medicine that have recently become centers of debate.
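The order-balancing problem mentioned above (ensuring that in a within-subject design every condition appears in every ordinal position equally often across participants) is commonly solved with a balanced Latin square.  As a supplementary sketch, the construction can be written in a few lines of Python; the function name and output format here are illustrative only, not part of the course materials:

```python
def balanced_latin_square(n):
    """Return n condition orderings for n conditions, one per participant.

    Each condition appears exactly once in every ordinal position
    across the n orderings (a balanced Latin square).
    """
    # First ordering follows the pattern 0, 1, n-1, 2, n-2, ...
    first = [0]
    lo, hi, take_lo = 1, n - 1, True
    while len(first) < n:
        first.append(lo if take_lo else hi)
        if take_lo:
            lo += 1
        else:
            hi -= 1
        take_lo = not take_lo
    # Each subsequent participant's ordering shifts every condition by 1 (mod n)
    return [[(c + i) % n for c in first] for i in range(n)]

# Example: orderings of 4 conditions for 4 participants
for participant, order in enumerate(balanced_latin_square(4)):
    print(f"participant {participant}: condition order {order}")
```

For an even number of conditions, this construction additionally balances immediate carry-over: each condition directly precedes every other condition equally often across participants.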


D: Promote students' ability to advance and support claims:

There are several places where this outcome is met -- such as teaching students to design experiments that will test specific hypotheses and rule out alternative hypotheses, and teaching students how to correctly interpret and report the results of statistical analyses. 

As discussed elsewhere in this proposal, students are taught the mechanics of experimental design and psychological research.  These skills also fit into this category.  Students learn how to design an experiment that can support claims of causal relationships between variables.  At the same time, they learn how to identify and articulate flaws in designs that may nullify such claims.  As the class discusses quasi-experimental and non-experimental designs, students learn how and why claims of causal relationships become progressively harder to support as alternative hypotheses become harder to rule out. 

Another area where this particular outcome is addressed is in the statistical analysis and interpretation techniques taught to students and used by students in the class.  Students learn which statistical tests to use to test significance depending on the type of data (frequencies, continuous scales, etc.) and the type of relationship being examined (correlations between variables, differences between groups, etc.).  In essence, students learn how to support claims by using, reporting, and interpreting statistics appropriately.  They may learn, for example, how to support a claim that two experimental groups truly differ on the outcome variable rather than merely reflecting chance variation.  This is often done through homework assignments in which students receive statistical output and must write a short report explaining what the results mean and backing up those claims by reporting the appropriate statistics.  These activities equip students to advance and support claims through the appropriate use of statistics.
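As a supplementary sketch of the kind of analysis described here, the pooled-variance independent-samples t statistic can be computed with only the Python standard library.  The data and function name below are invented for illustration and are not taken from the course materials:

```python
import statistics as st

def independent_t(group_a, group_b):
    """Pooled-variance independent-samples t statistic and degrees of freedom."""
    na, nb = len(group_a), len(group_b)
    # Pool the two sample variances, weighted by their degrees of freedom
    pooled_var = ((na - 1) * st.variance(group_a) +
                  (nb - 1) * st.variance(group_b)) / (na + nb - 2)
    # Standard error of the difference between the two group means
    se = (pooled_var * (1 / na + 1 / nb)) ** 0.5
    t = (st.mean(group_a) - st.mean(group_b)) / se
    return t, na + nb - 2

# Hypothetical aggression scores (numbers invented for illustration)
experimental = [7, 9, 8, 10, 9]
control = [6, 7, 5, 8, 6]
t, df = independent_t(experimental, control)
print(f"t({df}) = {t:.2f}")  # prints t(8) = 3.05
```

The computed t is then compared against the critical value from a t table (about 2.31 for df = 8, two-tailed, alpha = .05) to decide whether the group difference exceeds what chance variation would produce.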