Introduction to Learning and Behavior, 4th Edition, Russell A. Powell, P. Lynne Honey, Diane G. Symbaluk - Solutions
2. According to Mowrer, avoidance is the result of two distinct processes: (1) __________ conditioning of a __________ response, and (2) __________ conditioning in which an avoidance response is n__________ r__________ by a reduction in __________.
1. It is relatively easy to understand the process underlying (escape/avoidance) conditioning because the organism moves from an __________ situation to a non__________ situation. By contrast, it is more difficult to understand __________ conditioning because the organism moves from a(n) __________ situation to another __________ situation.
4. In the shuttle avoidance procedure described previously, the rat first learns to __________ from the shock, with the __________ acting as the SD for the __________ behavior. The rat later learns to __________ the shock, with the __________ acting as the SD for the __________ behavior.
3. Julio initially takes vitamin C whenever he has a cold, in the hope that it will shorten the duration of his symptoms. Feeling that this is effective, he begins taking it daily in the hope that it will keep him from getting a cold. Julio initially took the vitamin C to (avoid/escape) the
2. Typically, one first learns to __________ from an aversive stimulus, and then to __________ it.
1. Behavior that terminates an aversive stimulus is called __________ behavior, whereas behavior that prevents an aversive stimulus from occurring is called __________ behavior.
18. How might a bird owner use stimulus control to eliminate a parrot’s tendency to squawk for long periods of time? How might a novelist use stimulus control to facilitate the act of writing?
17. Describe errorless discrimination training and the two basic aspects of this procedure. What is a major drawback of such training?
16. Define anticipatory contrast and give an example.
15. Define positive and negative contrast effects, and give an example of each.
14. Define a multiple schedule. Diagram an experimental example involving the response of lever pressing for food on an FR 20 and VI 30-sec schedule, and the stimuli of tone and light. Be sure to include the appropriate label for each component (SD, etc.).
13. Define the peak shift effect. Illustrate your answer with a graph of a generalization gradient.
12. What is a generalization gradient? How does the shape of the gradient reflect the degree of generalization?
11. What is an SΔ? Diagram an example of a discrimination training procedure (be sure to include the appropriate abbreviations for each component).
10. Define stimulus generalization and stimulus discrimination as they occur in operant conditioning.
9. Define stimulus control. What would be an example of stimulus control of behavior at a hockey game and at a church service?
8. Define a DRO procedure. To eliminate a behavior, why is a DRO procedure more effective than a straight extinction procedure?
7. What is spontaneous recovery, and how is it affected by successive sessions of extinction?
6. How is resistance to extinction affected by history of reinforcement, magnitude of reinforcement, degree of deprivation, and previous experience with extinction?
5. Define the partial reinforcement effect. Of the four basic intermittent schedules, which produces particularly strong resistance to extinction?
4. What is resistance to extinction? Be sure to distinguish between low resistance and high resistance to extinction.
3. What are four side effects of extinction, other than extinction burst and resurgence?
2. What is an extinction burst? What is resurgence?
1. Define extinction as it applies to operant conditioning. Be sure to distinguish between the process of extinction and the procedure of extinction.
3. Briefly put, six rules for overcoming sleep-onset insomnia through the use of stimulus control are (chances are that you will have to check back to fill these out): __________
2. Jaclyn’s cat has a terrible habit of jumping up on the kitchen counter whenever Jaclyn is preparing food. How might Jaclyn use a stimulus control procedure to eliminate this behavior?
1. Training a rhinoceros to touch the end of a stick with its nose is an example of a useful behavior management technique called t__________.
4. Gradually altering the intensity of a stimulus is called f__________.
3. This type of discrimination training is also likely to produce behavior patterns that are (easy/difficult) to modify at a later point in time.
2. This type of discrimination training is likely to produce (more/less) emotional behavior compared to the standard form of discrimination training.
1. In e__________ discrimination training, the SΔ is presented (early/later) in the training procedure, and at very (weak/strong) intensity to begin with.
3. Vronsky (another character in Tolstoy’s Anna Karenina) falls deeply in love with Anna, who is the wife of another man. For several months, they carry on a passionate affair. When Anna, however, finally leaves her husband to be with him, Vronsky finds that he soon becomes bored with their
2. If Jackie hears her mother say that it is getting close to her bedtime, she is likely to become (more/less) involved in the computer game she is playing.
1. An increase in the rate of responding for an available reinforcer when faced with the possibility of losing it in the near future is known as __________ contrast.
4. When Levin (a lonely bachelor in Tolstoy’s novel Anna Karenina) proposed to the beautiful young Kitty, she rejected him. Levin was devastated and decided to devote the rest of his life to his work. Kitty, in turn, was subsequently rejected by the handsome young military officer, Vronsky, whom
3. A pigeon that experiences a shift from a multiple FR 10 VI 60-sec schedule to a multiple FR 100 VI 60-sec schedule will likely (increase/decrease) its rate of response on the VI 60-sec component.
2. In behavioral contrast, a decrease in reinforcement on one alternative results in a(n) __________ in __________ on the other alternative.
1. In behavioral contrast, an increase in reinforcement on one alternative results in a(n) (increase/decrease) in (responding/reinforcement) on the other alternative.
3. On a multiple VI 50-sec VR 50 schedule, we are likely to find a high rate of response on the (VI/VR/both) component(s).
2. This type of schedule differs from a chained schedule in that a __________ is provided after each component schedule is completed.
1. On a m__________ schedule, two or more schedules are presented (sequentially/simultaneously), with each resulting in a r__________ and having its own distinctive __________.
3. If a pigeon undergoes discrimination training in which a yellow key light is explicitly established as an SΔ and an orange key light is explicitly established as the SD, the strongest response in the generalization gradient will likely be to a (yellowish-orange/orange/orange-reddish) key light.
2. If an orange key light is trained as an SD in a key pecking task with pigeons, and the pigeons are then exposed to other key colors ranging from yellow on one end of the continuum to red on the other (with orange in the middle), then the peak of the generalization gradient will likely be to a
1. In the peak shift effect, the peak of a generalization gradient, following d__________ t__________, shifts away from the __________ to a stimulus that is further removed from the __________.
2. An “Open for Business” sign is an __________ (use the abbreviation) for entering the store and making a purchase, while a “Closed for Business” sign is an __________ for attempting such behavior.
1. In a discrimination training procedure, responses that occur in the presence of the __________ (use the symbols) are reinforced, while those that occur in the presence of the __________ are not reinforced. This latter stimulus is called a d__________ s__________ for e__________.
7. Jonathan always goes for lunch around 12:30, with the range being somewhere between 12:25 and 12:35 p.m. The generalization gradient for this behavior across various points in time would therefore be much (steeper/flatter) than if the range was between 12:00 and 1:00. This indicates a pattern of
6. When Jonathan looked at his watch and noticed that it was 12:30 p.m., he decided that it was time for lunch. Jonathan’s eating behavior appears to be under strong s__________ c__________.
5. In a graph that depicts a g__________ g__________, a relatively flat line indicates more __________ and less __________. A relatively steep line indicates more __________ and less __________.
4. A g__________ g__________ indicates the strength of responding to stimuli that vary along a continuum.
3. In general, stimuli that are (more/less) similar produce stronger generalization.
2. In operant conditioning, the term s__________ g__________ refers to the tendency for a response to be emitted in the presence of stimuli that are similar to the original __________. The opposite process, called s__________ d__________, refers to the tendency for the response to be emitted more in the presence of one stimulus than another.
1. A behavior is said to be under s__________ c__________ when it is highly likely to occur in the presence of a certain stimulus.
3. A DRO procedure is useful in that it tends to reduce many of the side effects of extinction, such as ex__________ b__________ and fr__________.
2. Giving a dog a treat whenever it does something other than jump up on visitors as they enter the house is an example of a __________ (use the abbreviation) procedure.
1. The procedure of reinforcing all behaviors except the particular target behavior that you wish to extinguish is known as d__________ r__________ of o__________ behavior (abbreviated __________).
3. Skinner believed that this phenomenon is a function of __________ that are uniquely associated with the start of the session.
2. In general, each time this occurs, the behavior is (weaker/stronger) than before and extinguishes (more/less) readily.
1. S__________ r__________ is the reappearance of an extinguished response at a later point in time.
7. Previous experience with extinction, as well as a distinctive signal for extinction, tends to produce a(n) (increase/decrease) in resistance to extinction.
6. In general, there is a(n) (direct/inverse) relationship between resistance to extinction and the organism’s level of deprivation for the reinforcer.
5. Resistance to extinction is generally greater when the behavior that is being extinguished has been reinforced with a (high/low)-magnitude reinforcer, though the opposite effect has also been found.
4. In general, a behavior that has been reinforced many times is likely to be (much easier / more difficult) to extinguish.
3. Among the four basic intermittent schedules, the __________ (use the abbreviation) schedule is particularly likely to produce strong resistance to extinction.
2. According to the p__________ r__________ effect, responses that have been maintained on an intermittent schedule will show (more/less) resistance to extinction than responses that have been reinforced on a continuous schedule.
1. R__________ to __________ is the extent to which responding persists after an extinction procedure is implemented.
5. On the trip home, Krissy, who never did get a toy, sat silently and stared out the window. This is not surprising, because extinction is sometimes followed by a temporary period of d__________.
4. When her father still refuses to buy her a toy, Krissy suddenly asks her dad to pick her up and carry her, something she has not asked for since she was much smaller. This could be an example of r__________ or what psychoanalysts call r__________.
3. Krissy might also begin showing a lot of e__________ behavior, including a__________.
2. Krissy is also likely to ask for the toy in many different ways because extinction often results in an increase in the v__________ of a behavior.
1. Krissy asked her father to buy her a toy, as he usually did, when they were out shopping. Unfortunately, Krissy’s father had spent all of his money on building supplies and told her that he had nothing left for a toy. The first thing that might happen is that Krissy will (increase/decrease)
3. In carrying out an extinction procedure, an important first step is to ensure that the consequence being withdrawn is in fact the __________.
2. Whenever Jana’s friend Karla phoned late in the evening, she would invariably begin complaining about her coworkers. In the beginning, Jana listened attentively and provided emotional support. Unfortunately, Karla started phoning more and more often, with each call lasting longer and longer.
1. Extinction is the __________ of a previously __________ response, the result of which is a(n) __________ in the strength of that response.
17. Outline the response deprivation hypothesis. Describe how the response deprivation hypothesis differs from the Premack principle.
16. Outline the Premack principle. Give an example of the Premack principle as applied to dealing with a classroom situation in which students are chatting to each other rather than focusing on their work.
15. Describe the drive reduction theory of reinforcement. What is a major difficulty with this theory? What is incentive motivation?
14. Define the goal gradient effect and give an example.
13. What type of reinforcer serves to maintain behavior throughout the early links in a chain? What is the best way to establish responding on a chained schedule in animals?
12. What is a chained schedule? Diagram and label an example of a chained schedule.
11. What is an adjusting schedule? In what way does shaping involve the use of an adjusting schedule?
10. What is a conjunctive schedule? How does a conjunctive schedule differ from a chained schedule?
9. Name and define the two types of noncontingent schedules.
8. What are three types of response-rate schedules?
7. Name and define two types of duration schedules.
6. Define variable interval schedule. Describe the typical pattern of responding produced by this schedule.
5. Define fixed interval schedule. Describe the typical pattern of responding produced by this schedule.
4. Define variable ratio schedule. Describe the typical pattern of responding produced by this schedule.
3. Define fixed ratio schedule. Describe the typical pattern of responding produced by this schedule.
2. Distinguish between continuous and intermittent schedules of reinforcement.
1. What is a schedule of reinforcement?
3. Given this state of affairs, how is the organism likely to distribute its activities?
2. Contingencies of reinforcement often (disrupt/enhance) the distribution of behavior such that it is (easy/impossible) to obtain the optimal amount of reinforcement.
1. According to the behavioral approach, an organism that (is forced to / can freely) engage in alternative activities will distribute its behavior in such a way as to (optimize/balance) the available reinforcement.
4. Kaily typically watches television for 4 hours per day and reads comic books for 1 hour per day. You then set up a contingency whereby Kaily must watch 4.5 hours of television each day in order to have access to her comic books. According to the Premack principle, this will likely be an
3. The response deprivation hypothesis differs from the Premack principle in that we need only know the baseline frequency of the (reinforced/reinforcing) behavior.
2. If a child normally watches 4 hours of television per night, we can make television watching a reinforcer if we restrict free access to the television to (more/less) than 4 hours per night.
1. According to the response deprivation hypothesis, a response can serve as a reinforcer if free access to the response is (provided/restricted) and its frequency then falls (above/below) its baseline level of occurrence.
6. What is Grandma’s rule, and how does it relate to the Premack principle?
5. If “Chew bubble gum → Play video games” is a diagram of a reinforcement procedure based on the Premack principle, then chewing bubble gum must be a (lower/higher) probability behavior than playing video games.