Introduction to Learning and Behavior, 4th Edition (Russell A. Powell, P. Lynne Honey, Diane G. Symbaluk) - Solutions
4. When Alex held the car door open for Stephanie, she made a big fuss over what a gentleman he was becoming. Alex no longer holds the car door open for her. The consequence for holding open the door was the _______ of a stimulus, and the behavior of holding open the door subsequently _______ in frequency; therefore, this is an example of _______.
3. When Alex burped in public during his date with Stephanie, she got angry with him. Alex now burps quite often when he is out on a date with Stephanie. The consequence for burping was the _______ of a stimulus, and the behavior of belching subsequently _______ in frequency; therefore, this is an example of _______.
2. Whenever Sasha pulled the dog's tail, the dog left and went into another room. As a result, Sasha now pulls the dog's tail less often when it is around. The consequence for pulling the dog's tail was the (presentation/removal) of a stimulus, and the behavior of pulling the dog's tail subsequently (increased/decreased) in frequency; therefore, this is an example of _______.
1. When Sasha was teasing the dog, it bit her. As a result, she no longer teases the dog. The consequence for Sasha's behavior of teasing the dog was the (presentation/removal) of a stimulus, and the teasing behavior subsequently (increased/decreased) in frequency; therefore, this is an example of _______.
5. Turning down the heat because you are too hot is an example of an (escape/avoidance) response; turning it down before you become too hot is an example of an (escape/avoidance) response.
4. With respect to escape and avoidance, an _______ response is one that terminates an aversive stimulus, while an _______ response is one that prevents an aversive stimulus from occurring. Escape and avoidance responses are two classes of behavior that are maintained by (positive/negative) reinforcement.
3. Karen cries while saying to her boyfriend, “John, I don’t feel as though you love me.” John gives Karen a big hug saying, “That’s not true, dear, I love you very much.” If John’s hug is a reinforcer, Karen is (more/less) likely to cry the next time she feels insecure about her relationship.
2. When the dog sat at your feet and whined during breakfast one morning, you fed him. As a result, he sat at your feet and whined during breakfast the next morning. The consequence for the dog's whining consisted of the (presentation/removal) of a stimulus, and his behavior of whining subsequently (increased/decreased) in frequency; therefore, this is an example of _______.
1. When you reached toward the dog, he nipped at your hand. You quickly pulled your hand back. As a result, he now nips at your hand whenever you reach toward him. The consequence for the dog's behavior of nipping consisted of the (presentation/removal) of a stimulus (namely, your hand), and his behavior of nipping subsequently (increased/decreased) in frequency; therefore, this is an example of _______.
4. Reinforcement is related to a(n) (increase/decrease) in behavior, whereas punishment is related to a(n) (increase/decrease) in behavior.
3. Within the context of reinforcement and punishment, positive refers to the (addition/subtraction) of something, and negative refers to the (addition/subtraction) of something.
2. The word positive, when combined with the words reinforcement or punishment, (does/does not) mean that the consequence is good or pleasant. Similarly, the term negative, when combined with the words reinforcement or punishment, (does/does not) mean that the consequence is bad or unpleasant.
1. The word positive, when combined with the words reinforcement or punishment, means only that the behavior is followed by the _______ of something. The word negative, when combined with the words reinforcement or punishment, means only that the behavior is followed by the _______ of something.
9. A bell that signals the start of a round and therefore serves as an SD for the operant response of beginning to box may also serve as a(n) (SD/CS) for a fear response. This is an example of how the two processes of _______ conditioning and _______ conditioning often overlap.
8. A stimulus in the presence of which a response is punished is called a _______ for _______. It can be given the symbol _______.
7. Another way of thinking about the three-term contingency is that you _______ something, _______ something, and _______ something.
6. The three-term contingency can also be thought of as an ABC sequence, where A stands for _______ event, B stands for _______, and C stands for _______.
5. Using the appropriate symbols, label each component in the following three-term contingency (assume that the behavior will be strengthened): Phone rings: Answer phone → Conversation with friend
4. A discriminative stimulus (does / does not) elicit behavior in the same manner as a CS.
3. A discriminative stimulus is said to “_______ for the behavior,” meaning that its presence makes the response (more/less) likely to occur.
2. A discriminative stimulus is usually indicated by the symbol _______.
1. The operant conditioning procedure usually consists of three components: (1) a d__________ s__________, (2) an o__________ response, and (3) a c__________.
13. Clayton stopped plugging in the toaster after he received an electric shock while doing so. This is an example of (punishment/extinction). Manzar stopped using the toaster after it no longer made good toast. This is an example of _______.
12. Weakening a behavior through the withdrawal of reinforcement for that behavior is known as _______.
11. When we chastise a child for being rude, are we attempting to punish: (a) the child who was rude or (b) the child’s rude behavior?
10. When we give a dog a treat for fetching a toy, are we attempting to reinforce: (a) the behavior of fetching the toy or (b) the dog that fetched the toy?
9. When labeling an operant conditioning procedure, punishing consequences (punishers) are given the symbol _______ (which stands for _______), while reinforcing consequences (reinforcers) are given the symbol _______ (which stands for _______). The operant response is given the symbol _______.
8. Each time Edna talked out in class, her teacher immediately came over and gave her a hug. As a result, Edna no longer talks out in class. By definition, the hug is a(n) _______ because the behavior it follows has (increased/decreased) in frequency.
7. When Moe stuck his finger in a light socket, he received an electric shock. As a result, he now sticks his finger in the light socket as often as possible. By definition, the electric shock was a _______ because the behavior it followed has (increased/decreased) in frequency.
6. Reinforcers and punishers are defined entirely by their _______ on behavior. For this reason, the term reinforcer is often preferred to the term _______ because the latter is too closely associated with events that are commonly regarded as pleasant or desirable.
5. Eliminating a dog's tendency to jump up on visitors by scolding her when she does so is an example of _______, while the scolding itself is a _______.
4. Strengthening a roommate's tendency toward cleanliness by thanking her when she cleans the bathroom is an example of _______, while the thanks itself is a _______.
3. The terms reinforcement and punishment refer to the pr__________ or pr__________ whereby a behavior is strengthened or weakened by its consequences.
2. More specifically, a reinforcer is a consequence that (precedes/follows) a behavior and (increases/decreases) the probability of that behavior. A punisher is a consequence that (precedes/follows) a behavior and (increases/decreases) the probability of that behavior.
1. Simply put, reinforcers are those consequences that s__________ a behavior, while punishers are those consequences that w__________ a behavior.
6. Operant behavior is usually defined as a(n) _______ of responses rather than a specific response.
5. Operant responses are also simply called _______.
4. Classically conditioned behaviors are said to be e__________ by the stimulus, while operant behaviors are said to be e__________ by the organism.
3. The process of operant conditioning involves the following three components: (1) a r__________ that produces a certain _______, (2) a c__________ that serves to either increase or decrease the likelihood of the _______ that preceded it, and (3) a d__________ stimulus that precedes the _______ and signals that a certain _______ is now available.
2. Operant conditioning is similar to the principle of natural selection in that an individual's behaviors that are (adaptive/nonadaptive) tend to increase in frequency, while behaviors that are _______ tend to decrease in frequency.
1. Skinner's definition of operant conditioning differs from Thorndike's law of effect in that it views consequences in terms of their effect upon the strength of behavior rather than whether they are s__________ing or a__________ing.
7. Skinner originally thought all behavior could be explained in terms of _______, but he eventually decided that this type of behavior could be distinguished from another, seemingly more voluntary type of behavior known as _______ behavior.
6. Skinner's procedures are also known as fr__________ o__________ procedures in that the animal controls the rate at which it earns food.
5. In the original version of the Skinner box, rats earn food by p__________ a l__________; in a later version, pigeons earn a few seconds of access to food by p__________ at an illuminated plastic disc known as a _______.
4. The Skinner box evolved out of Skinner’s quest for a procedure that would, among other things, yield (regular/irregular) patterns of behavior.
3. According to Thorndike, behaviors that worked were st__________ i_____, while behaviors that did not work were st__________ o_____.
2. Based on his research with cats, Thorndike formulated his famous _______ of _______, which states that behaviors that lead to a(n) _______ state of affairs are strengthened, while behaviors that lead to a(n) _______ state of affairs are weakened.
1. Thorndike's cats learned to solve the puzzle box problem (gradually/suddenly).
3. Another name for operant conditioning is _______ conditioning.
2. Elicited behavior is a function of what (precedes/follows) it; operant behavior is a function of what (precedes/follows) it.
1. Operant behaviors are influenced by their _______.
16. Diagram an example of a classical conditioning procedure that results in an alteration (strengthening or weakening) of immune system functioning. Diagram an example of a classical conditioning process involved in the creation of a placebo effect. Be sure to label each component with the appropriate abbreviations.
15. Define aversion therapy. What is covert sensitization?
14. Define flooding. Be sure to mention the underlying process by which it is believed to operate. Also, what is the distinction between imaginal and in vivo versions of flooding?
13. Outline the three components of systematic desensitization.
12. What is counterconditioning? Name and define the underlying process.
11. What would be the likelihood of a child who had very little control over important events in her life later acquiring a phobia (compared to a child who had more control over important events)? Also, describe how US revaluation can affect the acquisition of a phobia and give an example.
10. Describe how selective sensitization and incubation can affect the acquisition of a phobia.
9. Describe how temperament and preparedness can affect the acquisition of a phobia. Be sure that your answer clearly indicates the difference between them.
8. Assuming that the look of fear in others can act as a US, diagram an example of observational learning in the acquisition of a phobia. Be sure to include the appropriate abbreviations (NS, US, etc.).
7. Briefly describe the Watson and Rayner experiment with Little Albert and the results obtained.
6. Describe the overexpectation effect and how the Rescorla-Wagner theory accounts for it.
5. Describe the Rescorla-Wagner theory. Describe how the Rescorla-Wagner theory accounts for overshadowing and blocking.
4. Describe the compensatory-response model of conditioning. How does the compensatory-response model account for drug overdoses that occur when an addict seems to have injected only a normal amount of the drug?
3. Describe the preparatory-response theory of conditioning.
2. Describe stimulus-substitution theory. What is the major weakness of this theory?
1. Distinguish between S-R and S-S models of conditioning.
3. Supporting the possibility that placebo effects are classically conditioned responses, such effects are more likely to occur (following/preceding) a period of treatment with the real drug. As well, repeated presentations of the placebo by itself tend to (reduce/increase) its effectiveness, which
2. Diagram the classical conditioning process in Ader and Cohen’s (1975) study of immunosuppression. Label each component using the appropriate abbreviations.
1. When Christopher entered his friend’s house, he noticed a dog dish beside the door. He soon began experiencing symptoms of asthma and assumed that the house was filled with dog dander (particles of fur or skin), to which he is allergic. Only later did he discover that his friend’s children
5. Aversion therapy is sometimes carried out using _______ stimuli rather than real stimuli. This type of treatment procedure is known as _______ sensitization.
4. In general, aversion therapy is (more/less) effective when the unpleasant response that is elicited is biologically relevant to the problematic behavior.
3. A highly effective procedure for reducing cigarette consumption, at least temporarily, is r____________
2. A standard treatment for alcoholism is to associate the taste of alcohol with feelings of n__________ that have been induced by consumption of an e__________.
1. In _______ therapy, one attempts to reduce the attractiveness of an event by associating that event with an unpleasant stimulus.
5. Öst's single-session procedure combines the gradualness of s__________ d__________ with the prolonged exposure time of f__________. This procedure also makes use of p__________ m__________, in which the therapist demonstrates how to interact with the feared object.
4. Modern-day therapies for phobias are often given the general name of e__________-b__________ treatments.
3. For flooding therapy to be effective, the exposure period must be of relatively (long/short) duration.
2. Two types of flooding therapy are _______ flooding, in which one visualizes the feared stimulus, and _______ flooding, in which one encounters a real example of the feared stimulus.
1. In flooding therapy, the avoidance response is (blocked/facilitated), thereby providing maximal opportunity for the conditioned fear to _______.
6. One bit of evidence against the counterconditioning explanation for this type of treatment is that relaxation (is / is not) always necessary for the treatment to be effective. On the other hand, in keeping with the counterconditioning explanation, relaxation does seem to facilitate treatment
5. Wolpe's procedure is very effective with people who have (few/many) phobias that are highly (general/specific). Thus, this procedure (does/does not) work well with people who have a social phobia.
4. A version of Wolpe's procedure that uses real-life rather than imaginary stimuli is called _______. A major advantage of this procedure is that there is less worry about whether the treatment effect will g__________ to the real world.
3. The three basic components of Wolpe’s procedure are:
2. Mary Cover Jones used the stimulus of _______ to counter Peter's feelings of anxiety, while Wolpe, in his s__________ d__________ procedure, used _______.
1. Associating a stimulus that already elicits one type of response with an event that elicits an incompatible response is called c__________. Wolpe believed that the underlying process is r__________ i__________, in which certain types of responses are (compatible/incompatible) with each other, and the occurrence of one type
4. The process of s__________ s__________ refers to an increase in one's reactivity to a potentially fearful stimulus following exposure to a stressful event, even though the stressful event is (related/unrelated) to the feared stimulus.
3. According to the concept of _______ revaluation, phobic behavior might sometimes develop when the person encounters a (more/less) intense version of the (CS/US) than was used in the original conditioning. This process can also occur through o__________ l__________ or through v__________ transmitted information.
2. Brief exposures to a feared CS in the absence of the US may result in a phenomenon known as _______, in which the conditioned fear response grows (stronger/weaker). This runs counter to the general principle that presentation of the CS without the US usually results in e__________.
1. We will probably be (more/less) susceptible to acquiring a conditioned fear response if we grow up in a world in which we experience little or no control over the available rewards.
5. The fact that many people are more petrified of encountering snakes than they are of being run over by cars, even though the latter is a far more relevant danger in the world in which they live, reflects differences in _______ for acquiring certain kinds of fears.
4. Travis rolled his pickup truck, yet he had no qualms about driving home afterwards; Cam was in a minor fender bender last week and remained petrified of driving for several days afterward. These different outcomes may reflect inherited differences in t__________ between the two individuals.
3. The concept of p__________ holds that we are genetically programmed to acquire certain kinds of fears, such as fear of snakes and spiders, more readily than other kinds.
2. The term _______ refers to an individual's genetically determined level of emotionality and reactivity to stimulation. It (does/does not) seem to affect the extent to which responses can be classically conditioned.
1. From a conditioning perspective, viewing a display of fear in others can be conceptualized as a(n) _______ stimulus that elicits a(n) _______ response of fear in oneself. The event the other person is reacting to might then become a(n) _______ stimulus that elicits a(n) _______ response of fear in oneself.
6. Albert's fear response was (present/absent) whenever he was sucking his thumb, which suggests that the fear conditioning was actually relatively (strong/weak).
5. Unlike real-life phobias, Albert’s fear of the rat seemed to grow (stronger/weaker) following a 30-day break.
4. One difference between Albert's fear conditioning and conditioning of real-life phobias is that the latter often require (only one/more than one) conditioning trial.
3. Albert's startle response to the noise was a(n) _______ response, while his crying in response to the rat was a(n) _______ response.
2. In the Little Albert experiment, the rat was originally a(n) _______ stimulus, while the loud noise was a(n) _______ stimulus.