Psych M.D. - Variable-Ratio Schedule




Term: Variable-ratio schedule
Definition (Myers): In operant conditioning, a schedule of reinforcement that reinforces a response after an unpredictable number of responses (p. 327).
Definition (alternative): In operant conditioning, a variable-ratio schedule is a schedule of reinforcement in which a response is reinforced after an unpredictable number of responses. This schedule produces a high, steady rate of responding. Gambling and lottery games are good examples of rewards based on a variable-ratio schedule.
Contextual explanation:
There are four types of reinforcement schedules one needs to understand in operant conditioning: the fixed-ratio schedule, the variable-ratio schedule, the fixed-interval schedule, and the variable-interval schedule. The variable-ratio schedule is simple: reinforcement is delivered after an unpredictable number of responses. For instance, if I were a scientist with a pigeon that could press a button to get food, I would not give it food after every press or after any set number of presses; instead, the number of presses required for food would vary unpredictably. The pigeon might press the button once and get food, but the next time it might take several presses before food comes out.
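To make the "unpredictable number of responses" idea concrete, here is a minimal simulation sketch (not from Myers or the other sources; the VR-5 value, function name, and pigeon scenario are chosen only for illustration). It shows how, on a variable-ratio schedule, food arrives after a randomly varying number of presses that only averages out to a fixed ratio.

import random

def variable_ratio_schedule(average_ratio, num_presses, seed=None):
    """Simulate a pigeon pressing a button under a variable-ratio schedule.

    Food is delivered after an unpredictable number of presses that
    averages `average_ratio` presses per reward (e.g., a VR-5 schedule
    would use average_ratio=5). Returns the total number of rewards.
    """
    rng = random.Random(seed)
    rewards = 0
    # Draw the first unpredictable requirement: presses needed until food.
    presses_needed = rng.randint(1, 2 * average_ratio - 1)
    presses_since_reward = 0

    for _press in range(num_presses):
        presses_since_reward += 1
        if presses_since_reward >= presses_needed:
            rewards += 1
            presses_since_reward = 0
            # Pick a new, unpredictable requirement for the next reward.
            presses_needed = rng.randint(1, 2 * average_ratio - 1)
    return rewards

# Example: 200 presses on a VR-5 schedule yields roughly 40 rewards,
# but the pigeon can never predict which individual press will pay off.
print(variable_ratio_schedule(average_ratio=5, num_presses=200, seed=1))

Because any given press might be the one that pays off, responding stays high and steady, which is the pattern described in the definitions above.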
Related websites:
  • http://psychology.about.com/od/vindex/g/def_variablerat.htm
  • http://www.betabunny.com/behaviorism/VRS.htm
  • http://chiron.valdosta.edu/whuitt/col/behsys/operant.html
Sources:
  1. Myers, D. G. (2004). Psychology. New York: Worth Publishers.
  2. Huitt, W., & Hummel, J. (1997). An introduction to operant (instrumental) conditioning. Educational Psychology Interactive. Valdosta, GA: Valdosta State University. Retrieved November 11 from http://chiron.valdosta.edu/whuitt/col/behsys/operant.html
Edited by: Susan #2
Date of last edit: 11/26/09


