Introduction to Learning and Behavior, 3rd Edition (Russell A. Powell, Diane G. Symbaluk, P. Lynne Honey) - Solutions
Define the goal gradient effect and give an example.
What type of reinforcer serves to maintain behavior throughout the early links in a chain? What is the best way to establish responding on a chained schedule in animals?
What is a chained schedule? Diagram and label an example of a chained schedule.
What is a conjunctive schedule? What is an adjusting schedule?
Name and define the two types of noncontingent schedules.
What are three types of response-rate schedules?
Name and define two types of duration schedules.
Define variable interval schedule. Describe the typical pattern of responding produced by this schedule.
Define fixed interval schedule. Describe the typical pattern of responding produced by this schedule.
Define variable ratio schedule. Describe the typical pattern of responding produced by this schedule.
Define fixed ratio schedule. Describe the typical pattern of responding produced by this schedule.
Distinguish between continuous and intermittent schedules of reinforcement.
What is a schedule of reinforcement?
Given this state of affairs, how is the organism likely to distribute its activities?
Contingencies of reinforcement often (disrupt/enhance) _____________ the distribution of behavior such that it is (easy/impossible) _____________ to obtain the optimal amount of reinforcement.
According to the behavioral _____________ _____________ approach, an organism that (is forced to/can freely) _____________ engage in alternative activities will distribute its behavior in such a way as to (optimize/balance) _____________ the available reinforcement.
Kaily typically watches television for 4 hours per day and reads comic books for 1 hour per day. You then set up a contingency whereby Kaily must watch 4.5 hours of television each day in order to have access to her comic books. According to the Premack principle, this will likely be an
The response deprivation hypothesis differs from the Premack principle in that we need only know the baseline frequency of the (reinforced/reinforcing) ________________ behavior.
If a child normally watches 4 hours of television per night, we can make television watching a reinforcer if we restrict free access to the television to (more/less) _____________ than 4 hours per night.
According to the response deprivation hypothesis, a response can serve as a reinforcer if free access to the response is (provided/restricted) _____________ and its frequency then falls (above/below) ___________ its baseline level of occurrence.
What is Grandma’s rule, and how does it relate to the Premack principle?
If Chew bubble gum → Play video games is a diagram of a reinforcement procedure based on the Premack principle, then chewing bubble gum must be a (lower/higher) _____________ probability behavior than playing video games.
If you drink five soda pops each day and only one glass of orange juice, then the opportunity to drink ___________ can likely be used as a reinforcer for drinking ___________.
According to the Premack principle, if you crack your knuckles 3 times per hour and burp 20 times per hour, then the opportunity to _____________ can probably be used as a reinforcer for _____________.
The Premack principle states that a _____________ _____________ behavior can be used as a reinforcer for a _____________ _____________ behavior.
The Premack principle holds that reinforcers can often be viewed as _____________ rather than stimuli. For example, rather than saying that the rat’s lever pressing was reinforced with food, we could say that it was reinforced with _____________ food.
Research has shown that hungry rats will perform more effectively in a T-maze when the reinforcer for a correct response (right turn versus left turn) consists of several small pellets as opposed to one large pellet (Capaldi, Miller, & Alptekin, 1989). Chickens will also run faster down a runway to
The motivation that is derived from some property of the reinforcer is called _____________ motivation.
A major problem with drive reduction theory is that _________________________ ________________________________________________________________.
According to this theory, a s_______________ reinforcer is one that has been associated with a p_________________ reinforcer.
According to drive reduction theory, an event is reinforcing if it is associated with a reduction in some type of p_______________ drive.
One suggestion for enhancing our behavior in the early part of a long response chain is to make the completion of each link more s_______________, thereby enhancing its value as a s_____________ reinforcer.
An efficient way to train a complex chain, especially in animals, is through b___________ chaining, in which the (first/last) _________ link of the chain is trained first. However, this type of procedure usually is not required with verbally proficient humans, with whom behavior chains can be
Responding tends to be weaker in the (earlier/later) _____________ links of a chain. This is an example of the g____________ g_____________ effect in which the strength and/or efficiency of responding (increases/decreases) ____________ as the organism approaches the goal.
Within a chain, completion of each of the early links ends in a(n) s______________ reinforcer, which also functions as the _________________ for the next link of the chain.
A chained schedule consists of a sequence of two or more simple schedules, each of which has its own _____________ and the last of which results in a t____________ r___________________.
To the extent that a gymnast is trying to improve his performance, he is likely on a(n) _____________ schedule of reinforcement; to the extent that his performance is judged according to both the form and quickness of his moves, he is on a(n) _____________ schedule.
In a(n) _____________ schedule, the response requirement changes as a function of the organism’s performance while responding for the previous reinforcer, while in a(n) _____________ schedule, the requirements of two or more simple schedules must be met before the reinforcer is delivered.
A complex schedule is one that consists of _______________________________.
A child who is often hugged during the course of the day, regardless of what he is doing, is in humanistic terms receiving unconditional positive regard. In behavioral terms, he is receiving a form of non______________ social reinforcement. As a result, this child may be (more/less) ___________
In many mixed martial arts matches, each fighter typically receives a guaranteed purse, regardless of the outcome. In the Ultimate Fighter series, the winner of the final match is awarded a major contract in the UFC while the loser receives nothing. As a result, Karo is not surprised when he
During the time that a rat is responding for food on a VR 100 schedule, we begin delivering additional food on a VT 60-second schedule. As a result, the rate of response on the VR schedule is likely to (increase/decrease/remain unchanged) _____________.
As shown by the kinds of situations in which superstitious behaviors develop in humans, such behaviors seem most likely to develop on a(n) (VT/FT) _____ schedule of reinforcement.
Herrnstein (1966) noted that superstitious behaviors can sometimes develop as a by-product of c_______________ reinforcement for some other behavior.
When noncontingent reinforcement happens to follow a particular behavior, that behavior may (increase/decrease) _____________ in strength. Such behavior is referred to as s_____________ behavior.
For farmers, rainfall is an example of a noncontingent reinforcer that is typically delivered on a ______________________ __________________________ schedule (abbreviated _________________).
Every morning at 7:00 A.M. a robin perches outside Marilyn’s bedroom window and begins singing. Given that Marilyn very much enjoys the robin’s song, this is an example of a ______________ ___________ 24-hour schedule of reinforcement (abbreviated __________).
On a non_____________ schedule of reinforcement, a response is not required to obtain a reinforcer. Such a schedule is also called a response i____________ schedule of reinforcement.
Frank discovers that his golf shots are much more accurate when he swings the club with a nice, even rhythm that is neither too fast nor too slow. This is an example of _____________ reinforcement of _____________ behavior (abbreviated ________).
On a video game, the faster you destroy all the targets, the more bonus points you obtain. This is an example of _____________ reinforcement of _______________ ___________ behavior (abbreviated ________).
In practicing the slow-motion form of exercise known as tai chi, Yang noticed that the more slowly he moved, the more thoroughly his muscles relaxed. This is an example of d______________ reinforcement of _____________ _____________ behavior (abbreviated ________).
As Tessa sits quietly, her mother occasionally gives her a hug as a reward. This is an example of a ______________ _____________ schedule.
On a (VD/VI) ___________ schedule, reinforcement is contingent upon responding continuously for a varying period of time; on an (FI/FD) ____________ schedule, reinforcement is contingent upon the first response after a fixed period of time.
In general, ______________ schedules produce postreinforcement pauses because obtaining one reinforcer means that the next reinforcer is necessarily quite (distant/close) ________________.
In general, (variable/fixed) ________________ schedules produce little or no postreinforcement pausing because such schedules provide the possibility of relatively i_____________ reinforcement, even if one has just obtained a reinforcer.
On ________________ schedules, the reinforcer is largely time contingent, meaning that the rapidity with which responses are emitted has (little/considerable)_______________ effect on how quickly the reinforcer is obtained.
In general, (ratio/interval) __________________ schedules tend to produce a high rate of response. This is because the reinforcer in such schedules is entirely r_____________ contingent, meaning that the rapidity with which responses are emitted (does/does not) _____________ greatly affect how soon
In general, variable interval schedules produce a (low/moderate/high) _____________, (steady/fluctuating) ___________________________ rate of response with little or no ___________________________________________________.
You find that by frequently switching stations on your radio, you are able to hear your favorite song an average of once every 20 minutes. Your behavior of switching stations is thus being reinforced on a _____________ schedule.
On a variable interval schedule, reinforcement is contingent upon the _____________ response following a _____________, un__________ period of _____________.
On a pure FI schedule, any response that occurs (during/following) _____________ the interval is irrelevant.
Responding on an FI schedule is often characterized by a sc_____________ pattern of responding consisting of a p__________________ p________ followed by a gradually (increasing/decreasing) ____________ rate of behavior as the interval draws to a close.
In the example in question 2, I will probably engage in (few/frequent) _____________ glances at the start of the interval, followed by a gradually (increasing/decreasing)_____________ rate of glancing as time passes.
If I have just missed the bus when I get to the bus stop, I know that I have to wait 15 minutes for the next one to come along. Given that it is absolutely freezing out, I snuggle into my parka as best I can and grimly wait out the interval. Every once in a while, though, I emerge from my cocoon to
On a fixed interval schedule, reinforcement is contingent upon the _____________ response following a _____________, pr____________ period of _____________.
As with an FR schedule, an extremely lean VR schedule can result in r___________ s___________.
An average of 1 in 10 people approached by a panhandler actually gives him money. His behavior of panhandling is on a _______ schedule of reinforcement.
A variable ratio schedule typically produces a (high/low) ____________ rate of behavior (with/without) _____________ a postreinforcement pause.
On a variable ratio schedule, reinforcement is contingent upon a _____________ un_____________ _____________ of responses.
Graduate students often have to complete an enormous amount of work in the initial year of their program. For some students, the workload involved is far beyond anything they have previously encountered. As a result, their study behavior may become increasingly (erratic/stereotyped) _____________
Over a period of a few months, Aaron changed from complying with each of his mother’s requests to complying with every other request, then with every third request, and so on. The mother’s behavior of making requests has been subjected to a procedure known as “s_____________ the
A very dense schedule of reinforcement can also be referred to as a very r_________ schedule.
An FR 12 schedule of reinforcement is (denser/leaner) _____________ than an FR 100 schedule.
The typical FR pattern is sometimes called a b_________-and-r________ pattern, with a ____________ pause that is followed immediately by a (high/low) ________ rate of response.
An FR 200 schedule of reinforcement will result in a (longer/shorter) ____________ pause than an FR 50 schedule.
A fixed ratio schedule tends to produce a (high/low) _________ rate of response, along with a p_________________ p________.
An FR 1 schedule of reinforcement can also be called a ____________ schedule.
A mother finds that she always has to make the same request three times before her child complies. The mother’s behavior of making requests is on an ________ _____ schedule of reinforcement.
A schedule in which 15 responses are required for each reinforcer is abbreviated _____________.
On a(n) _____________ _____________ schedule, reinforcement is contingent upon a fixed number of responses.
S_____________ e_____________ are the different effects on behavior produced by different response requirements. These are the stable patterns of behavior that emerge once the organism has had sufficient exposure to the schedule. Such stable patterns are known as st_____________-st_____________
When the weather is very cold, you are sometimes unable to start your car. The behavior of starting your car in very cold weather is on a(n) _____________ schedule of reinforcement.
Each time you flick the light switch, the light comes on. The behavior of flicking the light switch is on a(n) _____________ schedule of reinforcement.
On a c_____________ reinforcement schedule (abbreviated ______), each response is reinforced, whereas on an i_____________ reinforcement schedule, only some responses are reinforced. The latter is also called a p_____________ reinforcement schedule.
A s_____________ of reinforcement is the r_____________ requirement that must be met to obtain reinforcement.
A discriminative stimulus is a stimulus that signals that a ________________ is available. It is said to “___________________” for the behavior.
Major advantages of using the sound of a click for shaping are that the click can be delivered ____________ and the animal is unlikely to ____________ upon it.
An event is a reinforcer if it ________________ a behavior and the future probability of that behavior ________________.
Referring to this chapter’s opening vignette, among the four types of contingencies described in this chapter, Sally’s actions toward Joe probably best illustrate the process of ________________. In other words, Joe’s abusive behavior will likely (increase/decrease) ________________ in the
Classically conditioned behaviors are said to be __________________ by stimuli; operant behaviors are said to be ___________________ by the organism.
Steven has fond memories of his mother reading fairy tales to him when he was a child, and as a result he now enjoys reading fairy tales as an adult. For Steven, the act of reading fairy tales is functioning as what type of reinforcer? (A) primary, (B) secondary, (C) intrinsic, (D) extrinsic,
A reinforcer is usually given the symbol ________________, while a punisher is usually given the symbol ________________. The operant response is given the symbol ________________, while a discriminative stimulus is given the symbol ________________.
Events that are innately reinforcing are called ________________ reinforcers; events that become reinforcers through experience are called _____________ reinforcers.
A stimulus that signals that a response will be punished is called a ___________________ for punishment.
Operant behaviors are usually defined as a ________________ of responses, all of which are capable of producing a certain ________________.
Achieving a record number of strikeouts in a game would be a(n)(natural/contrived) _______________ reinforcer for pitching well; receiving a bonus for throwing that many strikeouts would be a(n) ________________ reinforcer.
When Beth tried to pull the tail of her dog, he bared his teeth and growled threateningly. Beth quickly pulled her hand back. The dog growled even more threateningly the next time Beth reached for his tail, and she again pulled her hand away. Eventually Beth gave up, and no longer tries to pull the
A generalized secondary reinforcer is one that has become a reinforcer because it has been associated with (A) a primary reinforcer, (B) a secondary reinforcer, (C) several secondary reinforcers, (D) several primary reinforcers, or (E) several reinforcers (either primary or secondary). ____________
According to Thorndike’s ______________________, behaviors that lead to a ________________ state of affairs are strengthened, whereas behaviors that lead to an _______________ state of affairs are weakened.
Harpreet very much enjoys hard work and often volunteers for projects that are quite demanding. According to ___________________ theory, it is likely the case that, for Harpreet, the act of expending a lot of effort has often been ________________.