The most difficult intermittent schedule to extinguish is:


Multiple Choice

The most difficult intermittent schedule to extinguish is:

Answer: a variable ratio schedule

Explanation:

Intermittent reinforcement schedules shape how stubborn a learned behavior becomes when rewards stop. A variable ratio schedule reinforces after an unpredictable number of responses; the number varies around a set average but is never known in advance. That unpredictability makes the behavior unusually persistent: every response has a chance of reward, so the learner keeps responding in hopes of hitting the next reinforcement. Because the reward pattern isn’t predictable, there is no clear signal that reinforcement has ended, so extinction unfolds very slowly. A common real‑world illustration is gambling machines, which pay out after a random number of plays; people keep pulling the lever through long runs of losses, because the next win could be just around the corner.
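The mechanics of a variable ratio schedule can be sketched in a few lines of code. This is a minimal illustration, not drawn from the original text: the function name and the choice of a uniform draw between 1 and 2×mean−1 (which averages to the target ratio) are assumptions for the sake of the example.

```python
import random

def variable_ratio(mean_ratio=5, responses=1000, seed=0):
    """Simulate a VR schedule: reinforcement arrives after an
    unpredictable number of responses averaging mean_ratio."""
    rng = random.Random(seed)
    reinforcements = 0
    # Responses required for the next reward, redrawn after every reward,
    # so the learner can never predict when the next one is due.
    required = rng.randint(1, 2 * mean_ratio - 1)
    for _ in range(responses):
        required -= 1
        if required == 0:
            reinforcements += 1
            required = rng.randint(1, 2 * mean_ratio - 1)
    return reinforcements

# Over many responses, rewards average out to about responses / mean_ratio,
# but any individual response may or may not be the winning one.
print(variable_ratio(mean_ratio=5, responses=1000))
```

Because each draw is independent, no single unrewarded response tells the learner anything definitive, which is exactly why responding persists when rewards quietly stop.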

Other intermittent schedules lack that same level of unpredictability. With a fixed ratio, reinforcement comes after a set number of responses, so the pattern is predictable; when reinforcement stops, the learner quickly notices the change and responding drops off. With fixed or variable intervals, reinforcement depends on the passage of time rather than on a precise count of responses, producing response patterns that are less resistant to extinction than the high, steady rate a variable ratio schedule sustains. Because the unpredictable, response-driven reward of a variable ratio schedule keeps the behavior going vigorously, it is the most difficult to extinguish.
