A recent post to the SClistserv (which you should join if you haven’t already) asks how the frequency multiplier works in the case of a 0 or no-count frequency. The answer reinforces the four reasons I have advanced in the discussion of which convention Precision Teachers should adopt to represent 0 on the chart. The convention I advocate: place acceleration (dot) or deceleration (X) data at a x2 distance below the time bar (Graf & Lindsley, 2002). The reasons for adopting the convention follow:

1. Clear graphical communication: 0 observed or detected instances of the pinpoint are represented with the same symbols (dot or X) as the rest of the acceleration and deceleration data.

2. Allows measurement of “change measures” such as celeration, bounce, frequency multipliers, celeration multipliers, and Accuracy Improvement Measure (AIM).

3. Facilitates consistent, accurate measurement of those same change measures.

4. Going from or to 0 (zero) is a big deal.

Let’s look closely at each reason.

1. Better graphical communication with the “Zero at a x2 distance below time bar.”

Look at the figure below. Precision Teaching has conventions for acceleration and deceleration data that everyone agrees on: a dot for acceleration data and an X for deceleration data. Looking at the two charted data sets, one set has congruence and equality: all the symbols represent acceleration data (the four dots going from 0, 1, 2, to 3). But with the ? symbol we have incongruence: three symbols represent acceleration data, and a fourth symbol requires explanation.

Try the following exercise with the basic lesson we learned from Sesame Street:

• • • ?

One of these things is not like the others,

One of these things just doesn’t belong,

Can you tell which thing is not like the others,

By the time I finish my song?

How can the wisdom of Sesame Street possibly lead us astray?

2. Allows measurement of Precision Teaching change measures.

In the measurement world of Precision Teaching, change measures (e.g., celeration, bounce, frequency multiplier) are the proverbial yardsticks with which progress is ascertained. Furthermore, the significance of the measured data also falls in the domain of the quantified change measure. Many people immediately abandon typical equal-interval graphs once they embrace PT change measures – now they have a quantifiable value with which to understand the world. Take celeration, a line that states how much a range of frequencies grew across a time period. Suppose a series of acceleration data grew at x2.0 per week. That means the measured quantity changed from 10 to 20 by the end of the week. A significant amount of growth!
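To make the celeration arithmetic concrete, here is a minimal sketch. The function name `project_frequency` and the example values are my own, purely illustrative:

```python
# Minimal sketch (illustrative values, not from the post): a x2.0
# weekly celeration means the charted frequency doubles each week.
def project_frequency(start, celeration, weeks):
    """Project a frequency forward given a weekly celeration multiplier."""
    return start * celeration ** weeks

print(project_frequency(10, 2.0, 1))  # 20.0 -- doubled after one week
print(project_frequency(10, 2.0, 2))  # 40.0 -- doubled again
```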

Some change measures like the frequency multiplier require measuring from one data point to the next. Without a proper 0 convention some people will ignore the change measure or measure it incorrectly (using a second data point that is a value other than 0).

3. Consistent, accurate measurement of Precision Teaching change measures.

As previously mentioned, when people use the ? convention, we have trouble with measures like frequency multipliers. Going from dot to dot or X to X makes sense, but not from ? to dot or X. Therefore, some people who use the ? symbol do not calculate frequency multipliers correctly.

Look at the figure below. If we use a Finder, we can quickly work out the multiplier. As an example, a 1 minute counting time has 1 correct. The next frequency has 2 corrects. Moving from 1 to 2 we see the distance at x2.0. The frequency multiplier says the second frequency jumped up x2.0, or doubled from the first frequency.

Now consider a first frequency of 0 corrects. Placing the 0 at a x2.0 distance below the time bar, the Finder shows the dot on the .5 line. Measuring from .5 to the 2 yields a x4.0 frequency multiplier. The behavior has quadrupled, or jumped up 4 times. The math also works out if you plug it into the formula (.5 x 4 = 2).

At this point you might ask: how does a behavior going from 0 to 2 equal a x4.0 change? After all, 0 multiplied by any number equals 0, not 2. The math doesn’t lie, but on the Standard Celeration Chart 0 does not exist. Therefore, we use a special convention to handle a zero count frequency. By adopting the “Zero at a x2 distance below time bar” convention, the 0 assumes a value that we can then calculate. So zero for the 1 minute time bar means 0 takes on the value of .5. If we had a time bar at 30 seconds, the time bar would rest on the 2 frequency line, and zero would then be placed on the 1 frequency line (i.e., zero at a x2 distance below the 30 second time bar means 2 ÷ 2 = 1).
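The zero-convention arithmetic above can be sketched in code. This is a minimal sketch under the “zero at a x2 distance below the time bar” convention; the names `chart_value` and `frequency_multiplier` are my own, not from any PT software:

```python
def chart_value(count, counting_time_min):
    """Return the value plotted on the Standard Celeration Chart.

    The record floor (time bar) sits at 1 / counting time.
    A zero count is charted at a x2 distance below the time bar,
    i.e., half the floor value.
    """
    floor = 1 / counting_time_min
    if count == 0:
        return floor / 2          # zero convention: floor divided by 2
    return count / counting_time_min

def frequency_multiplier(first, second, counting_time_min=1):
    """Ratio between two charted frequencies: x for a jump up, / for a jump down."""
    a = chart_value(first, counting_time_min)
    b = chart_value(second, counting_time_min)
    ratio = b / a
    return f"x{ratio:g}" if ratio >= 1 else f"/{1 / ratio:g}"

# 1 minute counting time: zero charts at .5, so 0 -> 2 is a x4 jump up
print(chart_value(0, 1))            # 0.5
print(frequency_multiplier(0, 2))   # x4
# 30 second counting time: floor at 2, so zero charts on the 1 line
print(chart_value(0, 0.5))          # 1.0
# Deceleration: 2 -> 0 talk-outs is a /4 jump down
print(frequency_multiplier(2, 0))   # /4
```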

Why a x2.0 distance below the time bar? The answer brings us to the next point:

4. Going from or to 0 (zero) is a big deal.

Consider a child who can say 1 letter sound correctly. Going from 1 to 2 letter sounds demonstrates a frequency multiplier of x2.0, a big deal! But from 0 to 2 we have a frequency multiplier of x4.0. Why the difference, curious people want to know? Going from 1 to 2 means the behavior just doubled, a large feat of behavior change. However, going from 0 to 2 means we went from the absence of behavior to the presence of behavior, an even bigger deal! A child who couldn’t say any letter sounds now says 2 letter sounds? Whoa!

The genesis of behavior, or going from nothing to something, should garner our appreciation and awe. In fact, Ogden Lindsley (the founder of Precision Teaching) mused that we might place zero at a x3.0 distance below the time bar because he felt it was such an astonishing change.

What about the other direction? We can go from nothing to something, but what about going from something to nothing? A student who calls out in class has a behavior that represents a deceleration target. A teacher who sees a student go from 2 to 1 talk-outs has witnessed a ÷2.0 jump down in frequency. But what about the student who goes from 2 to a no count frequency of 0? A jump down of ÷4.0! In Og’s words: “Performance lives in the multiply world – grows and decays by multiplying and dividing. When performance drops from 1 to zero it drops out of the multiply world where it can be reinforced and accelerated” (Lindsley, 2000, August 16).

If we care about reinforcing behavior, then it must come into existence for us to apply reinforcement to it. On the other hand, if we want a behavior not to come into contact with reinforcement, we must move it out of existence. Both achievements require extraordinary effort, and we must recognize those instances as empirical marvels.

The convention for zero (0): “Zero at a x2 distance below time bar.”

Let’s use it!

**References**

Graf, S., & Lindsley, O. (2002). Standard celeration charting 2002. Youngstown, OH: Graf Implements.

Lindsley, O. L. (2000, August 16). Re:Plotting celeration lines-Zero line [Electronic mailing list message]. Retrieved from http://lists.psu.edu/archives/sclistserv.html

“Zero” is the same as “Did Not Count Any,” “None counted,” or “None Observed.” In some ways it has some similarity to a “No Chance Day.” A 0 would be a “None Counted Day.”

While I think that your proposed convention is reasonable, that you have made a great case for it, and that I will support it, we always have to remember that we cannot multiply out of 0.

If the Record Floor is at 1 minute, and no behaviors were observed on Monday, but 2 were observed for 1 minute on Tuesday, then, really, there cannot be a frequency multiplier mathematically, even though for pragmatic purposes we can adopt the logic above and say that it’s x4.0.

0 times any number leaves us at … well, at 0.

That’s the curse of zero.

It’s not a counting number.

To count things you start at 1. If you want to count how many students are in class, you look at the first one, count “1” and then at the next one, say “2,” and so on. There’s no need to say “0” before you look at the first student.

But, in terms of charting 0, your argument, Rick, seems much better to me than some of the others I’ve seen, read, or heard. I have also ditched the ? mark, though I understand the reasoning behind why it was suggested in the first place: the notion that had the Floor been lowered, we might have picked up a frequency. That’s a state of being known as “Maybe.” Also, placing the dot or X consistently at x2 below the Floor would make for a practical convention.

So, for what it’s worth, I’ll back your effort here, without qualifications. You have my vote.

We need to move on from “How to Chart 0 and Why.”

— JE

Good points, John. I use the term “no count frequency” to mean a person or machine observed for a behavior and did not detect any instances of it. I have used “zero frequency” before, but for some reason that term doesn’t sit right with me.

No chance day and ignored day have different meanings and we treat each differently on the chart.

And the problem of zero has made for quite a few interesting debates among mathematicians. I agree with those mathematicians that categorize zero as a natural number. As you know, natural numbers are counting numbers. So I fall in the group of people that include all the non-negative integers {0, 1, 2, 3, 4…} as counting numbers (natural numbers).

When we adopt a convention for zero, we open a can of worms. I see other conventions for charting zero, and most have merit at some level. As a community of applied scientists we must figure out which holds the most merit and just go with it (until someone comes up with a better one, or a convention with more compelling reasons to guide our measurement decisions).

I would love to keep working over the zero issue (no pun intended). The more we talk and debate the pros and cons of different ideas the closer we move to consensus!

Rick

Hi Rick-

I like the convention of ? for zero because it says “I don’t know.” On the other hand, you have advanced good reasons for plotting a dot at X2 below the time bar, so I support the convention you are advancing and I will abandon the “?” that I have used previously. Now there is another line of logic to develop: how does the X2 rule interact with variability (bounce)? It seems to me that when we set the convention at X2, we are also making an accommodation for bounce, because even if the average frequency of a behavior is below the time bar, if its bounce is big enough we will see it occasionally. Thus setting the convention at X2 makes conservative assumptions about both frequency and bounce.

Thanks

Chuck

Wow, thanks for such a high compliment!

Og worked with three students in the early 80s; two published dissertations and a third an MA thesis. All the studies involved recharting JABA data. Through Og and his students’ work it has become clear to me why he advocated the x2 below the time bar rule: it allows us to fairly and consistently rechart data and analyze it clearly. The zero convention really helps with bounce, especially total bounce. We can’t say a behavior has reached a level where it is no longer an issue until both the up and down bounce have reached zero. With the zero convention that becomes clear.

By the way, check out some of these conclusions from Ehling (1986) concerning JABA:

“Only 44 percent of the recharted JABA studies had before treatment phases with minimal (x1.1 to /1.1) celerations. The remaining 56 percent disprove the commonly held notion that behavior is constant before a behavioral intervention is begun.”

“Researchers are losing their minimal ability to produce celeration turn-ups. The ability to turn celerations down has maintained at /1.5 since 1968.”

“Undeterminable bounces are accelerating much more rapidly than determined measurable bounce. In 1968 one out of every six bounce measures was below the floor (undeterminable). By 1984 one out of two measures of bounce were undeterminable. If this trend continues, undeterminable bounces will outnumber those charts with determined bounce by 1990.”

“Educators should rechart the results of research published in JABA before applying the research intervention in their setting. Without recharting, during treatment celeration, frequency jumps, celeration turns, and bounce measures are obscured in the published JABA research articles.”

Hi Rick,

To be a bit more emphatic, I think that the PT world ought to use your convention and be done with it. It works. It works better than the .9 stuff, which is hard to discriminate from the Record Floor. Your convention is consistent. Go with your convention and move on.

As a field we have much more important issues, problems, and situations to contend with than debating how to chart zero: like keeping the SCC a viable entity in a BCBA’d world that — increasingly — seems to disdain science.

John,

I agree!

Rick