Archives

OPEN MEETING OF
ADVISORY COMMITTEE ON PUBLIC INTEREST OBLIGATIONS OF DIGITAL TELEVISION BROADCASTERS


Tuesday, April 14, 1998
9:40 a.m.

National Association of Broadcasters
1771 N Street, N.W.
Washington, D.C. 20036


Transcript of the Morning Session

[View the transcript of the afternoon session]

                   P R O C E E D I N G S
                                               (9:40 a.m.)
                WELCOME AND OPENING REMARKS
         LESLIE MOONVES, PRESIDENT, CBS TELEVISION
          MR. MOONVES:  Welcome, everybody.  It is good to
see everybody again.  We had a very interesting meeting in
Los Angeles last month.  I think a lot of was
accomplished.  I think a lot of issues were put on the
table that were necessary to think about.
          We have a terrific day planned.  In the morning,
we will be dealing with certain issues from the NAB.  And
I would like to thank Eddie Fritts and the various members
of the NAB for hosting us today.
          In the afternoon, Norm will begin to lead the
deliberations.  As you know, our target is to have a paper
prepared by October.  And time is running down quickly --
quicker than we would like to think.  So I think today we
would like to start getting into the substantive issues.
          Robert Decherd, and Gigi Sohn, both presented us
with some interesting material, which I think we want to
get into this afternoon, as well, which I would like to
thank them both for beginning the process of getting some
things down in writing.  In addition, Karen Strauss has
added an amendment to something that Gigi has written,
which I do not think has been passed out yet.  Oh, it is

                                                         3
here.  Terrific.  We will also talk about that this
afternoon.  So we have a lot to do.
          In terms of future meetings, we will be in
Minneapolis for the next meeting.  That date is June 8th. 
Mr. Crump will be our host, and we look forward to being
in Minneapolis -- far better than being there in June -- I
think rather than January, it will be better to be there
in June.
          Norman, anything you would like to add?  Any
opening remarks?
                      OPENING REMARKS
            NORMAN ORNSTEIN, RESIDENT SCHOLAR,
               AMERICAN ENTERPRISE INSTITUTE
          MR. ORNSTEIN:  Let me just add my thanks to the
NAB for hosting us today.
          I am sorry -- I gather that Eddie Fritts has
been ill for a while and cannot be here to greet us
directly.  We wish him a swift recovery.
          I did want to note that you have in front of you
a piece from Broadcasting and Cable, which is about the
survey, generally, that we are going to hear about this
morning.  And there is an interesting quote from
Mr. Fritts that I just want to point out to all of you,
which I think is in the spirit that we are trying to
achieve here, of moving forward with proposals and trying

                                                         4
to come to some general agreement.
          This is a direct quote:
          I will make a deal tomorrow with the Congress of
the United States that says the following:  We will give
you 2 hours of broadcast time to run your campaign for
Federal candidates only.  However, you will not be able to
buy any additional time.
          I think we would all agree on that.  But,
certainly, as a laudable goal, we have some ideas out on
the table that are coming right from the broadcasters. 
And we will see if we can continue that spirit as we go
along.
          And I guess we should start with the
presentation of the survey results.  And Paul LaCamera is
going to introduce our panelists.
          Paul, it is all yours.
                         BRIEFING:
     SURVEY OF BROADCASTERS' PUBLIC SERVICE ACTIVITIES
               MODERATOR:  PAUL A. LACAMERA,
          PRESIDENT AND GENERAL MANAGER, WCVB-TV
          MR. LACAMERA:  Thank you.  And good morning.
          Over these past months, we have heard from an
opposing array of voices and interests.  And today I and
my broadcast colleagues on this panel thank you for
allowing the National Association of Broadcasters to add

                                                         5
its perspective to our deliberations.
          As our group was formulated last spring and
summer, with the charge of defining public interest
responsibilities of television operators in the pending
digital era, the NAB realized that the current community
service activities of stations needed to be documented in
some formal way.  While there was a vast amount of
anecdotal evidence of the good works of broadcasters and
their efforts to serve their respective publics, there had
never been a comprehensive effort to survey the entire
industry to aggregate the actual ways in which local
stations are benefitting the communities they are charged
to serve.
          To that end, the NAB retained the
Virginia-headquartered firm Public Opinion Strategies to
conduct such a census of both radio and television
broadcasters throughout the country.  In addition to the
survey, which had a remarkable 63-percent participation
rate from television stations, NAB and State broadcast
associations conducted more than 500 follow-up qualitative
interviews to contribute greater meaning to the cold
numbers obtained in the census.
          Bill McInturff, of Public Opinion Strategies is
our guest today, to report on the impressive findings of
this exercise.  This survey affirms what Bob Decherd and

                                                         6
other broadcasters on this panel have tried to articulate. 
The stations represented here are not anomalies, but are
simply part of a broad commitment to community service
that runs deep throughout our industry.  The work of the
BELO stations and markets has been documented and reported
to us.  NBC President Robert Wright recently made calls at
the FCC and Congress specifically to share the community
service performance of the powerful NBC owned and operated
station body.
          And I can personally attest to the good works of
the stations in my home Boston market, and of the 15
stations that WCVB-TV's parent First Argyle Television
Group.  Again, it is hard to accept that these stations
and groups are simply anomalies, not representative of the
larger industry in which we and they must perform and
compete.
          As we listen to Mr. McInturff, it is important
to note also what the NAB survey did not cover.  As Bill
will explain, the NAB asked for information on activities
that could readily be quantified:  PSA's, fundraisers for
charities and health organizations, political debates, and
other examples of providing access to political
candidates.
          But as we well know and would appropriately
hope, the full worth of a station's responsiveness and

                                                         7
service to its community goes far beyond these obvious
measures.  The heroic performance of broadcasters during
this year's numerous weather emergencies and literal
disasters in so many parts of this country is but the most
current case in point.
          As we hope the NAB survey also indicates, local
television broadcasters in this country continue to
believe in and embrace the ideals of localism and
community service.  To be frank, in this era of growing
competition and fragmentation, it is in our enlightened
self-interest to do so.
          Against the backdrop of that truism, let me
introduce Bill McInturff, of Public Opinion Strategies. 
Also joining us is Jack Goodman, of the National
Association of Broadcasters.
          Gentlemen.
                   COMMITTEE DISCUSSION
              WILLIAM D. MCINTURFF, PARTNER,
              PUBLIC OPINION STRATEGIES; AND
     JACK GOODMAN, VICE PRESIDENT AND POLICY COUNSEL,
           NATIONAL ASSOCIATION OF BROADCASTERS
          MR. MCINTURFF:  Thank you, Paul.
          I will just introduce myself again.  My name is
Bill McInturff, with Public Opinion Strategies.  And we
are going to just walk through some of the major findings

                                                         8
of the work that we did.
          This is a little awkward for our co-chairs and
Karen, but it will be on the screen behind you as we are
speaking.
          I am here today to present a national report on
the broadcast industry's community service program.  Let
me just again highlight some more about the methodologies,
so that you can understand what it is that this survey
covered.
          We started this effort with a two-State pretest,
where we designed the questionnaire and then administered
it via mail, in Arizona and Minnesota, just to confirm
that we could collect a high enough response rate and to
confirm that we were dealing with a document that was
giving us valid and valuable data.
          We extended that project, with some modification
of the questionnaire, to 48 States.  The State
associations of the State broadcasters associations were
responsible for mailing the NAB and State association
members in their States.  And we mailed those in October,
and then did extensive follow-up to try to drive response
rates as high as possible.
          In addition, as you heard, we conducted and
helped write the script and conduct 500 interviews, 10 per
State, just so that we could get an understanding of the

                                                         9
fabric of this data, of the kinds of stories, the
vignettes and examples that underline these quantitative
findings.
          And then, finally, we completed this project by
asking the four major networks to provide comparable data
so that we could layer on the network data.
          In terms of who we heard from:  We mailed over
1,100 TV stations around the country.  We got 730 who
responded for, again, a very unusual and impressive
63-percent cooperation rate.
          In terms of radio stations, we contacted and
mailed almost 8,000 radio stations around the country, for
a response rate of 39 percent.  And what this means -- and
just to conclude on what we did -- obviously, with four
networks, we received the cooperation of all four major
networks, ABC, CBS, Fox, and NBC, in terms of cooperating
and responding -- what this means is this is not a poll. 
It is not a survey.  It is what is called a census.  And
that is that this is a report of every State association
and NAB member around the country in all 50 States.
          We received a 42-percent response rate.  And let
me talk and put that in perspective.  There are two
reasons that is important.  And, Bob, if you could go on
to the next slide.
          One reason it is important is because we know we

                                                        10
have a fixed universe, it allows us to calculate a margin
of error.  And it means that the margin of error for this
study is unusually small.  It is about 1 percent.  But the
42-percent rate means something else.  What it means is
when you look at the methodology in the mailed
questionnaires is there is always the concern that if you
get too low of a response rate, if only 15 or 20 percent
of the people have cooperated, how do you know that the
people that responded you can project for people that you
did not hear from in the survey?
          In the writing, and if you read methodology,
what people argue is the higher the response rates and the
larger the database that you have, the easier it is to
project that data onto the people that did not respond. 
And so a 42-percent response rate across both TV and radio
is very, very high.  We do a lot of membership studies. 
Most of the time membership mail surveys are in the 20 to
30 percent response rates.  And especially for TV, where
we are at 60 percent-plus, we just know an enormous amount
about thousands of stations who took the time to cooperate
with this project.
          And, again, the second thing that makes this
unique is that, as I said, it is not a survey, but it is a
census.  That is very, very unusual.  I did not offer you
just one contrast.  We are doing a lot of public health

                                                        11
research these days.  And in terms of public health data,
the Centers for Disease Control contracts with 80
hospitals around the country.  They keep enormously
detailed emergency room records in those 80 hospitals.  We
have 8,000 hospitals in the country.  And then they
project from 80 hospitals to 8,000.
          In this exercise, we are essentially taking the
numbers from 730 TV stations and 3,000 radio stations and
projecting them to the rest of the known universe.
          What do we measure in this survey?  What did we
try to quantify and provide a dollar figure for?
          We essentially provided dollar figures for three
items.  One is the value of public service announcements,
and specifically asking TV and radio stations around the
country to provide an estimate of both the number and the
value of 30-second PSA's that are run on their stations.
          The second thing we tried to provide a dollar
figure for was the money that is raised for charitable
groups around the country.  That is, specifically, how did
these local broadcasters leverage their air time to help
charities, charitable causes or needy individuals through
their fundraising efforts?
          The third thing that we measured was the value
of free political time voluntarily provided by these
stations.  And the definition of political time was quite

                                                        12
exact.  Which is did you offer to hold debates, fora,
candidate interchange on air, or as well we did also
estimate the value of the convention services provided by
the major networks.
          What didn't we measure?  We provided no monetary
value -- we measured some of this activity -- but we
provided no dollar or monetary estimate for the value
provided by local news and for public affairs programming
other than candidate-specific activity.  And so what do we
mean by those kind of public affairs shows?
          That might be a station in Philadelphia that for
20 years has done a half-hour show a week targeted to the
Hispanic community.  Or for those of you from the
Washington area, something like Gordon Peterson's Inside
Washington.  That is regular public affairs programming. 
The value of that has never been calculated and is not
part of this project.
          We did not try to provide an estimate for the
dollar value of the employee volunteer time for the local
broadcast employees, in terms of their charitable
activities.  Although, again, through the qualitative
process, it is striking how much volunteer work is being
done by the stations and their employees.
          We did not also track the direct cash donations
from these broadcasters to charities.  And we did not

                                                        13
provide any financial estimate for the children's
educational programming requirement.
          And, finally, although we did keep track of what
kinds of service and what kind of money was raised in
disaster and emergency situations, we did not try to
provide a dollar estimate for the kind of time that is
donated when stations preempt advertising due to local
weather or other emergencies.
          So none of those which you see in the right-hand
column, none of the figures that we are showing account
for or try to provide any monetary estimate for all those
other ways in which local broadcasters might contribute to
the community.
          The other point I would make is that in terms of
the census we did with these local broadcasters around the
country, we asked them to report on a 1-year time period. 
That 1-year time period was from August 1, 1996, through
July 1, 1997.  We picked the 1-year time period for two
reasons.
          One, I wanted to go back and include the last
presidential and major election cycle in this
documentation, so we could get an accurate read of what
these stations did during the last major political cycle. 
I also selected a 1-year time limit because I think, over
time and over the 1-year window, you get a realistic read

                                                        14
of the kinds of activities that may not go on during any
discrete time period during the year, but would happen
during the course of an entire year.
          Now, in terms of these measurements, let us
start first with public service announcements.  What the
local TV stations told us is that around the country,
around the 730 TV stations that we heard from, they
average 137 PSA's a week.  And using their estimate for
their run of schedule rates, that is roughly about a
$1 million per station, or about a little over a billion
dollars of PSA's being provided by local TV stations
around the country.
          In terms of the PSA's by the major networks,
here I used a median figure of roughly 41 PSA's per week. 
And, again, roughly -- well, not roughly -- this is an
exact figure for these four stations -- $342 million per
year for the four major networks in terms of PSA activity.
          For the radio stations, the radio stations
report a little lower number, both for the number of PSA's
they run across these 3,000-plus stations, as well, of
course, the lower dollar value for a 30-second run of the
station spot.  So they say that averages around $400,000
per station, or more than $3 billion in PSA activity on
the 9,000 radio stations that are part of this survey
audience.

                                                        15
          So, in total, for PSA activity, that accounts
for $4.6 billion last year, from August 1996 through July
1997 time period.
          And, again, as part of the qualitative
interviews, we asked stations what kinds of things are you
talking about in terms of PSA's.  Here are just some of
the things that we heard.
          From KRTV in Great Falls, where, for example,
that station sponsors a Voices Against Tobacco, which
allows kids to produce their own PSA's, and the station
runs the best of the PSA's that were designed by kids.
          In Spokane, Washington, KXLY and their radio
affiliate -- this is, again, a very traditional and
characteristic kinds of things that we hear about, where
we both track PSA activity, as well as kind of either the
money or the other activity that it helps generate.  And
that is that their efforts to help promote, drive and
organize more than 200 volunteers at collection sites, to
collect coats for the homeless and others in that city.
          In terms of fundraising activity, when we spoke
to disease groups and other major volunteer groups, they
talk a lot about what the benefit provided by the
sponsorship of local broadcasters around the country.  And
specifically, we asked those broadcasters how much, based
on your on-air and other work that you do for charities,

                                                        16
would you say that you have helped raise in the community?
          For TV stations, that averaged about -- not
about -- it averaged $867,000 per station, or more than
$900 million per year for the TV stations around the
country.
          Again, radio was far less substantial in terms
of individual dollar volume.  And that is the radios say
that their sponsorship and leveraging their on-air to
promote charities and to help raise funds in the community
averages about $163,000, or more than a billion dollars a
year around the country.
          Again, sample activities, the kind of things in
our qualitative interview we heard about.  In Minnesota,
the five TV stations did a roadblock, a roadblock meaning
they all five ran the same telethon at the same time,
which raised about $200,000, plus video sales for the
victims of the massive flooding in that State.
          In Pittsburgh, WTAE described their 5-year
effort helping promote Race for the Cure, which has
provided vouchers, so that uninsured women can receive
mammograms.  Now, in the qualitative interview, they
provided a dollar figure, over a 5-year time period, of
$1.85 million.  But, again, I would say that, explicitly
in the questionnaire, we were tracking only the dollar
volume in any 1 year.  So this $1.85 would not have been a

                                                        17
figure from the quantitative.  It is a figure they
provided in the qualitative interviews.
          So in terms of that fundraising activity, that
is $2.1 billion in terms of how broadcasters leverage
their on-air activities to help raise funds for the
community.
          Looking now at political time, the third major
thing that we measured.  What we asked was, in 1996, we
said, did your station offer to sponsor candidate forums,
including debates or other air time for which political
candidates would not be charged?  And half the TV and half
the radio stations in the country said, yes, that they had
offered that free air time for candidates for that
purpose.
          In addition, we asked a different question. 
Which is, did your station offer to sponsor debates or
forums sponsored by other groups, other than the station
itself -- the classic model being the League of Women
Voters in many States.  And, again, one in five TV and
radio stations said yes, that they had made that offer of
sponsorship with a different organization.
          Here are some examples.  And it just shows you
kind of I think some of the range.  In Massachusetts,
there were seven debates held.  And this gives you an idea
of the number of debates covered by each of the different

                                                        18
stations.  And so what you see here is that there were
over 14 debates covered between the major TV and radio
stations out of the seven debates between Kerry and Weld.
          And there is a range of options that are going
on around the country.  In a smaller station, WTOK, in
Meridian, Mississippi, the station sponsors debates, but
they are held in the form of town meetings, in a town
meeting format, where people in the community can directly
ask the candidates questions.
          Another kind of activity we hear about a lot is
radio.  WMCS-AM in Milwaukee, who talked about what they
do, which is they set up sample polls at 25 high-traffic
spots, to encourage voting and to educate voters.  And we
will see in a minute the extensive range of activity to
increase turnout.
          Hopefully, no one here is from South Carolina. 
And I am going to take a pass at my frequent attempts to
try to pronounce this community's name.  But WRIX Radio,
which I note to you and I can promise you is a smaller
community in South Carolina, has offered and gives 15
minutes of speech time to candidates that are all
broadcast on the same day.
          In Wisconsin, which is another State that has a
history of active political activity and debate, there is
a hookup, where, since 1990, roughly between 18 and 20 TV

                                                        19
stations in the State and 80 radio stations carry debates
live, simultaneously, around the State of the major
statewide and Federal candidates.
          Now, in addition, though -- and this is the
political world in which we live -- there is a substantial
amount of activity that is offered by these broadcasters
that is refused by candidates.  And so, as a separate
question, we said, now, in 1996, were any of the offers
you made for debates or forums rejected by candidates? 
And here we are roughly a third to about 40 percent saying
that free time was offered to a campaign, but the time was
refused and not used by the campaign.
          A classic example is, in North Carolina, where,
in 1996, Senator Helms, for internal reasons to his
campaign decisions, refused to appear anywhere with his
opponent.  And so, despite multiple offers throughout
North Carolina to sponsor debates or forums, the campaign
turned down those requests.
          We can and we did measure specifically the value
of the time rejected, which the stations told us would
have been $15 million of time around the country.
          Now, in addition, as we look at other political
activity, we asked, did your station air a local public
affairs program or segment, other than your news
broadcast, that dealt with the 1996 elections?  A little

                                                        20
less than half of the TV stations said yes.  Roughly
two-thirds of the radio stations said yes, they did.
          And then, specifically, looking in a more
detailed way, we said, now, other than reporting on the
progress of campaigns, did your news programs do any
special segments profiling candidates and/or their stands
on the issues?  And, again, here we have a higher number,
where about two-thirds of TV stations said yes, about half
of the radio stations said yes, they did this as part of
news segments.
          And then, finally, we asked, in 1996, did your
station appeal to audiences to vote, either through PSA's,
public affairs programming or the news?  And here we have
essentially every -- you know, functionally, every station
in the country saying that their station was involved in
trying to increase and encourage turnout.
          Looking specifically now at network political
time.  We asked the networks, how many debates did your
network broadcast?  And what we learned is that three of
the networks ran all three debates.  There was one network
who ran one of the debates.
          We asked the networks, how many hours of
programming, live programming, did your network devote to
live convention coverage?  Across the networks, the total
was 27 total broadcast hours of convention coverage.

                                                        21
          So as you aggregate the time that was used for
candidate debates and forums around the country by local
broadcasters as well as the convention activity, there is
$148 million of free political time being used for these
purposes.  So as you kind of calculate the dollars and
where the dollar figures came from, that is $4.6 billion
in PSA's, $2.1 billion in local community fundraising and
$150 million of political time, for a total of $6.85
billion of this kind of community service and the economic
impact of this community service around the country.
          But when we talk about community service and we
talk about local issues, what kinds of issues are we
talking about?
          Based on our pretest in the two States, as well
as our phone interviews, in the questionnaire that we did
we tracked the specific activity on the range of issues
that you see in front of you -- from AIDS, to fundraising,
drunk driving, drug use, hunger, poverty, homelessness,
drinking during pregnancy -- because both based on the
pretest we did and on the phone interviews, these were the
topics that we were hearing all over the country that were
the focus of the PSA and the public affairs activities.
          We wanted to look at the impact of whether or
not this was locally based or not.  And so we asked
people, of the number of PSA's that your station runs,

                                                        22
what percent are locally produced or about local issues? 
TV says about half; radio says about two-thirds.  I think
that is a function of production costs.
          Then we asked, did you consult with your
community leaders in deciding which issues and causes for
PSA's and programming would be appropriate for you
locally?  And, again, what you see is three-fourths and
two-thirds of TV and radio executives saying yes, they had
specifically worked with community leaders to decide what
ought to be the focus of their public service PSA's and
their public affairs programming.
          You see variation by size of market.  You see
variation by region.  Remembering that we have only a
1-percent margin of error, these are very large
differences.  And I am just giving you some examples so
that you can see and get a feeling for how communities of
different sizes focus on different issues.
          And so, for example, when you look at the topic
of AIDS, 80 percent of the people who serve markets of
more than a million people said that they did PSA's on
AIDS, compared to only 70 percent in these very small
markets.
          When we asked people about, did they do specific
public affairs shows or did they run specific public
affairs sections of their news program devoted to

                                                        23
anti-violence campaigns and efforts?  Again, here you are
seeing a dramatic difference by size and market, which
again reflects, I think, the level of concern in each of
these communities by market, in terms of the need and the
application of this kind of public affairs programming. 
Where we are at 72 percent in the million-size markets,
and almost 25-30 points lower in very small markets.
          You see the same kind of variation when you see
hunger and homelessness.  Where, again, the major urban
markets are much more focused on this issue than the very,
very small, rural markets.
          And so I think, in summary, what this report
documents, across what is the first total census that I am
certainly aware of an industry, in terms of its public
service activity over a substantial database of almost
4,000 respondents, that you can measure in some stable way
the significant activity of these local broadcasters, that
you see and you can talk in these qualitative and these
other interviews how local concerns are affecting and
impacting coverage, and there is a way to document the way
in which these stations are serving the public interest.
          And so with that as an overview of the data, I
look forward to answering your questions.  Thank you.
          MR. LACAMERA:  Bill, thank you very much.
          Are there questions for either Bill or Jack?

                                                        24
          MR. CRUMP:  I have a question.  I am wondering,
in the fundraising number that you gave, you did not
specifically mention this, so I am just curious of whether
or not you included the amount of money raised on the
national telethons?
          MR. MCINTURFF:  No.
          MR. CRUMP:  Well, that would be a significant
addition, I would point out, in that I think of only two
at the moment, which is the Muscular Dystrophy Association
and Children's Hospitals, and though I do not have the
specific number, I know I am very close when I say that
each one of them raised approximately $50 million last
year, which would be another $100 million added to this. 
And of course that all comes from local stations, who are
participating in this, raising local monies.
          And in addition to that, there are other
millions of dollars, at least with the Muscular Dystrophy
Association, that are raised by commercial companies in
order to be able to participate on the television program
itself.  So that would be, to me, a rather significant
number also that we should figure into this thing.
          MR. ORNSTEIN:  Bill, I just wanted to ask a
couple of methodological questions.  I do not want to get
bogged down too much in detail, but I have done some mail
surveys, over time, myself.  And there is no question that

                                                        25
a response rate of over 40 percent is very impressive, but
I have always wondered myself, when you ask questions that
have at least some charge to them -- in this case, the
stations that do not do much public service would be more
likely to avoid answering.
          And so the 1-percent margin of error, do you
have any concern that perhaps the stations that did not
respond might be those that did not want to respond
because they are basically not doing very much?  Or are
you fairly confident that in fact the stations that did
not respond would really fit this profile?
          MR. MCINTURFF:  That is a good question.  I
think that what I feel comfortable with is that in
addition to this quantitative effort, that we did those
500 interviews with stations, some of those interviews
included stations that did not respond to the quantitative
survey.  In other words, they did not fill out the
quantitative survey, but they were still part of the
people we interviewed.
          And in those cases, as well, we heard about
significant PSA, community service, and the same kinds of
activities.  And there is, you know, again, when you are
talking about non-respondents, I think you, as a pollster,
have to be very cautious, trying to characterize those
people.

                                                        26
          The other thing I can say is that in large
States -- Pennsylvania, Texas, California, and others --
we specifically polled the list of stations that did not
respond.  And in those large States, where I have worked
for a long time in the States and we had specific help by
local broadcasters, I was fairly comfortable that the
non-respondents were distributed across markets.
          In other words, it was not just that we were not
hearing from little guys, big guys, that we did not get
stuff from one region of the State, that the
non-respondents looked to me, in those States where we
polled -- and we polled three to five as an example from
the major States -- the non-respondents looked like, in
terms of the size of market, type of station, that they
replicated and were pretty close to the people who did
respond.
          This survey was fairly laborious.  It required a
lot of polling of specific numbers from the stations.  And
I think the other thing that you know, Norm, in terms of
response rates, is response rates vary by how easy is the
information to collect.  This was a fairly difficult
survey for stations to do.  And so I believe the
non-response rates are much more a function of the time
commitment and interest than it is that it is some
systematic bias in terms of people not responding because

                                                        27
they did not do these kind of activities.
          MR. ORNSTEIN:  Okay.  One other question.  You
show some considerable variation across different markets
and in other ways.  Did you find the distribution overall
or within each of these areas -- the PSA's, the donations,
the political stuff -- fit for the stations that you
surveyed, generally, a bell curve distribution, that with
some stations where there is a normal rate, some stations
doing very, very little and some stations doing an
enormous amount?  And was that true overall?  Was it true
within each of these areas?
          MR. MCINTURFF:  Again, these all reflect -- you
know, this is a riveting conversation about the difference
between means, medians and ranges when you look at scores.
          MR. ORNSTEIN:  Yes.
          MR. MCINTURFF:  But they are important points,
and I would be happy to address them.
          Number one, in the written summary that has been
provided at the Commission, we provided some samples of
differences by size of market.  And so, obviously, you
have enormous comfort because you can see, in major urban
markets, enormously higher figures for dollar values
compared to smaller markets, which, you know, confirms any
kind of economic sense.
          The entire question of when I looked at the

                                                        28
database was, what is the best representation of these
numbers?  Should we use a median?  And for those -- just
as a quick reminder, a median means if you have 100
respondents, you should pick the number where 49
respondents are on one side and 49 respondents are on the
other side, and this is the number in the middle.  That is
the median.
          The mean, the average, which is what we did, the
average number is of the 100 people, what is the total
value of what they did divided by 100.  And so I agonized
and worked very carefully to decide what is the best
representation of this database, median or mean.  And at
the State level, we have done individual State reports. 
Other than the top 10 States, I said we must use a median. 
Because if you are in Connecticut and there is only three
or four TV stations and you get one station responding, I
do not think it is legitimate to take that one station and
make a representation about the others.
          However, in the top 10 States by population,
when I tried to make an argument nationally that we should
use a median as the most cautious number, what we found
was the fundraising is a hard number.  People told us
exactly how much they raised per station.  And so as a
consequence, in California, what happened was that the
actual dollars raised by the stations reporting was higher

                                                        29
than the median for the survey.
          Because, in that case, we have a station in Los
Angeles that did an extraordinary amount of activity --
like $10 million.  And so these numbers do reflect that
range of activity.  There is totally some stations that
are out there doing $10 million of fundraising and others
that are not active at all.
          But across this database, across 4,000
respondents, when you look at -- and you can track an
actual hard number like fundraising -- the mean scores
were the best way to represent the national database.  And
so what I have done and what is embodied in this work is
the national numbers reflect means and averages, the top
10 States reflect mean and averages, and below the top 10
States the data at the State level has been reported using
median figures, because I felt that was the fairest
representation of the data.
          MR. ORNSTEIN:  Let me take it beyond means and
medians to variance, then.  Because, you know, just
anybody eye-balling their local stations would see, it
would seem -- without doing a systematic survey as you
have done -- a tremendous range.  Some stations just do an
enormous amount across all of these areas of public
service.  And some stations seem to do very, very little.
          What kind of variance did you find?

                                                        30
          MR. MCINTURFF:  Well, I think you use the
variance in terms of dollar volumes.  But, again, across
4,000 respondents, that averages out.  But the other thing
you need to look at in the data here -- and we have the
actual questions as they were phrased as part of the
material you were given -- is this is where you have to
look at the fact that 92 percent said they are active in
promoting get out the vote; 92 percent of all the stations
responding said they do something to help charitable
organizations; half say they do something to offer free
time.
          Those are very, very flat numbers across all
these respondents.  And I think what it reflects is -- I
think where you see the range is in the actual dollar
volumes the stations might do -- what I think this
reflects is that most of the large, commercial
broadcasters are indeed -- and these stations are -- doing
this kind of activity across all these stations.
          And again, clearly, the three or four major
network stations are different from the WB and Paramount
stations, they are different than Christian stations.  And
there are stations that are nowhere near as active.  But I
think the point to focus on is that, across this database,
what you are hearing about is, again, 50 percent offering
free debate time, 90 percent helping in get out the vote,

                                                        31
90 percent saying they help local charities -- that the
stations are doing each of these kind of activities, but
in different dollar volumes.
          MS. CHARREN:  In one slide -- I think it was the
one with 63 percent ran special segments and 44 percent
ran local public affairs programs -- if they ran one
special segment, can they say yes to that question?
          MR. MCINTURFF:  Yes.  It was, during the course
of the year, have you done any of the following?
          MS. CHARREN:  So if they did one special
segment, which could have been 5 minutes, they would be
included as yes, they air local affairs programs dealing
with this, right?  There was no effort to quantify how
many segments or how often?
          MR. MCINTURFF:  The answer is yes to your
question.  If they had done one, they would qualify as a
yes for that purpose.  Then that is the reason, when we
mentioned in the beginning -- you might remember what I
said about what we measured and what we did not measure --
and I said the dollar volume of what we measured was only
the segments that were offered to candidates for debates,
for candidate forums, et cetera, et cetera.
          Exactly for the reason you mentioned, that is
why I did not, in designing the questionnaire, ask people
to try to provide a dollar volume for every public service

                                                        32
activity.  Because the variation, about how much is being
done under what kind of formats -- is it a half-hour
program, 5 minutes on the news, 7 minutes on the news --
you know, this got too complicated, in terms of trying to
do an exact dollar figure.
          And that is exactly why, in terms of the design
of the questionnaire, I said, let's measure exactly the
limited and restricted use of public affairs for candidate
debate and forums.  Let's not try to put a dollar volume
on public service for that kind of public service
programming.  And that is why no figure has been provided.
          MR. LACAMERA:  Why don't we move around the
table this way.  Newton, I know you have a question, and
then we will come around this way.
          MR. MINOW:  I am particularly interested in the
debates.  I am on the presidential debate commission, and
have been involved in all the presidential debates for
more than 20 years.  If we took the suggestion, which I
regard as a very good one, from Eddie Fritts, that Norman
mentioned, that a certain amount of free time be given to
the candidates, but the candidates could not buy time,
what would your estimate be -- remember, you said the
number of people who refused to debate was very high -- if
you could not buy time, don't you think everybody would be
debating?  Wouldn't that be a good public interest

                                                        33
solution of this problem?
          MR. MCINTURFF:  Well, I can speak here with a
different hat.  In addition to the very substantial volume
of this kind of research that our firm conducts, I also
happen to have roots as a partisan pollster.  And I think
the point that Eddie Fritts was making is that if you gave
a campaign professional the choice between 2 hours,
blocked out, of time over the course of the campaign
versus his or her ability in the campaign to control their
message through spot advertising, that political
professionals would choose the latter.
          And that in terms of volume of information and
the number of people reached and contacted, that a
substantially higher number of people would be exposed to
the campaign through that advertising than would be
exposed to the campaign through an aggregate total of 2
hours of advertising.
          So, to answer your question directly, I think in
this case that the people who do campaigns for a living
would not choose that offer.
          MR. MINOW:  But should we leave this to the
people who run campaigns for a living to decide, the
professional campaign consultants?  All the voters, if you
believe every study of the presidential debates, the
voters prefer the debates as a way to learn about the

                                                        34
candidates and the issues to what the political
professionals prefer.  And isn't the public interest to
please the voters and not the political professionals?
          MR. MCINTURFF:  Well, again, what I would like
to do is to answer the questions about the research and
let the people here who are -- that is exactly your
mission.  And I think that is beyond the scope of this
initial research, in terms of what I am here to present
today.  I will say, as a comment, our firm worked with Fox
in 1996.  They asked us and another pollster to come up
with 10 questions to ask the presidential candidates that
they would then put on the air.
          CBS, with Dan Rather, had blips of time.  And
there was a fair amount of rating information during that
programming, both for CBS and for Fox, that shows a
dramatic drop-off in viewership once the candidates were
on for that length of time.  So that is at least another
thing that should be considered, given the information
that all these networks have, about viewership during
those kinds of segments with candidates.
          MR. MOONVES:  It was over a 75-percent drop
during the regular local news when the candidates came on.
          MR. LACAMERA:  When you ask a question, I have
been asked to encourage you to be sure you talk into the
microphone.

                                                        35
          Frank.
          MR. BLYTHE:  I just want to go back to some of
the response numbers again.  You mentioned that the usual
response to any of these surveys was about 21 percent from
the stations.
          MR. MCINTURFF:  No, sir.  I said most mail
questionnaires, even with association members, range
between 25 and 35 percent.
          MR. BLYTHE:  Okay.
          MR. MCINTURFF:  So I was giving kind of a
national aggregate from lots of other research that is
traditionally done.  I am sorry if I was not clear about
that.
          MR. BLYTHE:  All right.  But still, the response
was fairly high on this survey here.  I would just be
interested to know what was your motivation to get the
stations to respond to a survey like this, because these,
as you said, were pretty laborious to return and consumed
a lot of time -- and I get surveys like that, too, and you
had to make a decision of whether you want to commit all
that time -- and if there was any, I guess, mention of
this committee's deliberations on the public interest
obligations that precipitated stations to respond?
          MR. MCINTURFF:  I think that in the cover letter
and the other information we clearly communicated that

                                                        36
there was going to be an interest in this information,
given the public mood to want to track what broadcasters
do.  The local associations really worked hard to collect
response rates.  There was one other incentive, which is,
again, as a research firm, these States wanted to have
State-specific reports of not just a national number but a
State number.  And our firm refused to -- said we would
not provide State reports unless response rates were at
least over 35 percent.
          Because of the projection problems that Norm has
raised, I just felt that inside a State, if response rates
were below 35 percent, that I felt I was not, as a
researcher, comfortable with projecting in-State numbers. 
And so, in terms of the long length of time, when I say
that these were mailed in October and we finally collected
them in January, we had -- I bet you would be happy,
Norm -- just exactly the kind of bell distribution you
would hope.
          We had eight or nine States that were fabulous. 
And we had eight or nine States that were not doing
particularly well.  Those States were identified by early
December and January, and then they were re-contacted, in
terms of the people who did not respond, by our giving
them the list of non-responding stations, and essentially
the States were told and said to those folks, look, if you

                                                        37
do not get these done, there is not going to be a State
report because we are not going to provide State
estimates.
          And I think it is clear -- and I can tell you
why the TV response rates are higher -- because when you
have 1,000 TV and you have got 9,000 radio stations inside
a State, you can make eight calls to every TV station and
say, please fill this out, and that is harder times
thousands of radio stations.  And I think that entirely
explains the difference in response rates between the TV
response and the radio response.
          But in terms of the incentive, I think the
incentive really became the interest for the State
associations to want to talk to the political figures in
their State and others in their State, and the community
leaders and business community and others, to be able to
provide those State reports, which we have done in all 50
States.
          MR. LACAMERA:  Anybody else on this side?
          MS. CHARREN:  I have one other question.  Do you
know of any research that tracks corporate giving in other
kinds of corporations, helping raise money in the
community, helping with the kind of charitable giving that
does go on in this country outside the broadcast industry? 
Are there studies that have tracked that?

                                                        38
          MR. MCINTURFF:  I am not familiar with them. 
But I am sure it is very possible one of the very large
foundations has done that.
          I will say that as part of this effort, separate
from this effort, we have also done focus groups with
voters.  We also contacted and did phone surveys with
leaders of charitable organizations around the country. 
And I can tell you from the survey work we have done with
the recipients of this assistance -- these disease groups,
the Boy Scouts, United Way, Salvation Army, et cetera,
et cetera -- they are, in the interviews that we did,
enormously supportive and appreciative of the extra boost
that comes through the participation of local
broadcasters.  And that is clearly represented by the
qualitative interviews that we did with the recipients of
that aid.
          MR. LACAMERA:  Richard, did you have a question?
          MR. MASUR:  I have a couple actually.
          First of all, on PSA's, when you asked the
question, what percentage are locally produced or about
local issues, is that exactly how the question was
phrased?  Because I am curious as to whether or not, first
of all, conflating those two -- locally produced or about
local issues -- I do not quite grasp what the connection
is.  And the other part of what I am looking for here is

                                                        39
are they issues of local interest or specifically local
issues?
          In other words, if drinking and driving is an
issue of local interest, would the running of a spot
having to do with drinking and driving qualify under that? 
Or was it this broadly phrased, I guess is my question.
          MR. MCINTURFF:  Let me read you the exact
question as it is in the survey.  And I think these are
provided as part of the material that you have.
          MR. MASUR:  Yes.
          MR. MCINTURFF:  It says, of the number of PSA's
that your station runs, what percentage are locally
produced or about local issues?  And so yes, it was that
broadly phrased.  I understand the point.
          I can say, good humoredly, that the best
questionnaires I write are after I have gotten the results
from the first one.
          (Laughter.)
          MR. MCINTURFF:  And there is no question -- and
that is a good example, where when you look at the first
4,000 interviews, you kind of kick yourself and say, that
is something I could have written better, done better,
better understood.  I think there is a good chance NAB
will continue this in future years.  And I entirely accept
that by combining locally produced and local issues we

                                                        40
could be getting two different measurements.  And the
question was not the best way to do the question.
          MR. MASUR:  That is okay.  And I do not mean
this as a criticism.  I was just trying to understand it
for myself.
          The other one, just very quickly.  When you
inquired how many PSA's does your station run in a typical
day, did you consider at all exploring what times of day
the PSA's were run?  Because the most common complaint
that I hear about PSA's is that they are buried in times
when nobody is watching.
          MR. MCINTURFF:  Yes, one of the reasons we asked
for the run of station rate, in terms of calculating this
average, is, one, it is because stations have it on the
rate card.  And as a researcher, what you want to do is do
something by a standard methodology that could be
replicated by others and get the same results.  And so I
used run of station for that reason, because it is
commercially available, blah, blah, blah.
          We did ask information about when the ads ran. 
I broke the day into four rough day parts.  And across
this respondent base, what they said was that roughly a
quarter was run at each different section of the day.  And
so, in some ways, it validates the use of the run of the
station rate, because indeed there was some equal

                                                        41
distribution of the time period in which these were run.
          MR. MASUR:  Thank you.
          MR. LACAMERA:  Did I miss someone on this side?
          Robert.
          MR. DECHERD:  Bill, I was interested in Frank's
question, and I wanted to stay with that for a second.  Is
there any adverse impact on the quality of the data or the
findings to have encouraged people to respond, to really
work it, to get this kind of response rate from a pure
research standpoint?  That is my first question.  I have
got another one.
          MR. MCINTURFF:  The answer to that is that, you
know, research like any other kind of research, is a
science.  And if I had my druthers, I made a very
deliberate choice.  It is a choice that I think most
researchers would also defend and make.  And that choice
is, what would I rather be doing here today?  Would I
rather be sitting here today, talking about how I took a
42-percent response rate and tried to do national
projections from a known database?  Or would I be happier
that we had done a survey of 1,000 stations, taken 500,
and tried to make national projections?
          And as a very deliberate decision -- and I think
it is a defensible one -- what I said was, if we are going
to make national projections from this data, I will not

                                                        42
make national projections unless the response rates are
high enough that I feel comfortable that we are over the
hurdle of being able to answer Norm or anyone's question
that, how do you know that the people who answered the
survey are like the people who did not answer the survey?
          And if the response rate had been 18 percent,
the concern that would have been indicated, about how can
you take an 18-percent response rate and project to
thousands of other stations, would have, I think, been
very substantial.  And so I think that and I feel
comfortable that the better decision between two not great
options was to try to increase an encourage higher
response rates.
          MR. DECHERD:  Okay.  Well, let me stay with
that, and then come to a second question.
          Is there anything inherently wrong with a survey
or the party engaging in the survey to send a letter of
encouragement or describe one of many reasons why the data
is being collected and then do follow-up calls to get that
kind of a response?  I mean it would seem to me that is a
plus.
          MR. MCINTURFF:  No.  Again, if you are a
commercial researcher and are going to do a mail
questionnaires, they usually send a dollar with it.  They
send a follow-up letter.  They do calls.  These are

                                                        43
standard research methodologies to increase response rates
that are around this trouble.
          And the other thing I would indicate is --
because you can say, well, why didn't you like call them
on the telephone?  Daytime telephone interviewing times
4,000 calls is enormous.  This was a large and expensive
project.  But if I had done a telephone methodology, it
would have been by another order of magnitude expensive.
          And, number two, these are not questions that
can be answered on the telephone.
          MR. DECHERD:  Right.
          MR. MCINTURFF:  These require somebody to sit
down and collect information inside the station.  And the
reason you have to do this -- and my argument was we have
got to do this in a mail format -- was because, despite
however well intentioned you are, people do not collect
this information in this format at a station.  And so they
cannot sit there and answer a questioner on the telephone. 
So you had to do this in this format.
          But, no, as I said, in the best of all possible
worlds, when you do a census survey, we would have 100
percent response rates.  But we do not.  But, again, I
would like to just focus on how unusual it is to be able
to do an entire census of an industry and how unusual it
is that we have this kind of quality database to work

                                                        44
from, to try to at least make stable numbers that are
projective and predictive.
          MR. DECHERD:  Let me go from that point to the
next question, which is partly a statement.  One reason I
was very pleased to see NAB take on this project and have
the degree of effort and financial commitment to do is the
point some of us on this panel have made from beginning. 
Which is there is a large and representative group of
broadcasters who are doing these kinds of things on a
routine basis, for a variety of reasons.  Some of them, as
we heard earlier, are for enlightened self-interest.  Some
are because they believe absolutely in a cause.  And so
forth and so on.
          But the fact is that they are doing this.  And
when we then think about why there is so much cynicism
about what we do, or what people believe we do not do, I
think it really has to do with the fact that the
information has not been out there.  So if I now, at the
risk of offending all the survey methodology, went to
Richard's next question, which is a commonly asked
question:  Great.  Well, you say you do all this, but you
bury it at 5:00 a.m.
          All right.  Well, if we took your data there,
and let's say it is evenly distributed among four parts of
the day, broken down by hours, and then you work through

                                                        45
almost any of these other issues -- challenge the margin
of error, challenge the extrapolations of non-respondents,
challenge large-market versus small-market differences,
back out political time altogether and just come down to a
public service number or a charities number, let's call
it, even if you took an excessive kind of -- I will call
it -- skepticism about that, aren't we talking about these
numbers still being enormous?
          MR. MCINTURFF:  Yes.
          MR. DECHERD:  I mean, we are talking about $4.5
billion and it might be $3 billion a year.  I mean, it is
excessive.
          MR. MCINTURFF:  Yes, I believe that all of these
are legitimate concerns that could be raised about any
research.  But the point is we have some hard numbers. 
And that is we have almost 4,000 people who responded to
the survey, who told us how much they helped raise in a
community.  That adds up to a specific figure in this
survey data, and it is a billion dollars.
          Okay, so you say, well, you projected it to be
$2 billion.  So, do I believe those projections are
defensible?  Absolutely.  Or I would not be here in public
talking about them.
          But at the worst-case scenario, the worst-case
scenario is, hey, you have 4,000 people who told you they

                                                        46
helped raise a billion dollars around the country.  So, at
a minimum -- you know, and we get to over-reports and all
that kind of stuff -- you know, what you are talking about
is an enormous amount of fundraising activity being
leverage by local broadcasters.
          So I do think that, as a database, whatever
these range of concerns, that it would at least contribute
in a thoughtful discussion about options, research
projections, that these are still very, very large numbers
that reflect a substantial amount of activity by local
broadcasters around the country.
          MR. DECHERD:  Thank you.
          MR. LACAMERA:  Les, did you have a question?
          MR. MOONVES:  Yes, I have a couple of questions.
          Bill, obviously you have done a considerable
amount of research, and this is an opportunity for the NAB
to state all the good things that are coming out
throughout the country that may be under-appreciated and
under-recognized.  Has there been any study, you know,
coming from the network side -- forget about the PSA's for
a moment; I want to talk about programming -- has there
been any studies about the amount of programming that is
done by the networks in dealing with issue-oriented pieces
that are in fact PSA's that last more than 1 minute that
are not on at 5:00 a.m., that are an episode of ER that

                                                        47
deals with AIDS, an episode of Murphy Brown that deals
with breast cancer?  Have there been any studies about
that done?
          MR. MCINTURFF:  Again, possibly they have.  But
those are not -- that is beyond the scope of what we did
in this research.  And that is beyond the scope of the
research, because, again, what I argued for in the design
and the methodology was that we should, for a number,
create a number that met these standards, definitive,
trackable, projectable, something you could replicate by
other research, and then hard enough dollar figures that,
again, reasonable people would agree that you can use that
figure.
          And so the trouble that I would have with what
you have described is it would get us into the
conversation about how do you value an ER episode in terms
of that kind of stuff.  And so what I argued for is I
thought that the numbers were going to be substantial, and
they would indicate this enormous amount of contribution,
and that I would rather have that as a dialogue than
trying to add to it a fuzzy number that is much harder and
you would have reasonable having much more of a dispute
about how that number was calculated.
          MR. MOONVES:  Got it.
          And, Jack, let me ask you a question if I may.

                                                        48
          In terms of this survey, did NAB lean on
anybody?  Does it lean on anybody?  Is there an active
participation on the part of the NAB to get the stations,
the local stations, which you represent, to do more in
terms of this area?
          MR. GOODMAN:  I think in terms of this, Les,
Bill already answered the question.  There was an effort
to go back, primarily by the State associations, to
encourage people to answer the survey, to get the results
up to where they would be meaningful and quantifiable.  In
terms of other things, NAB does a number of things.  We
run a number of national service campaigns which
distribute PSA's once a month, actually, to stations,
which they can run, on issues like alcohol, drunk driving,
get out the vote -- any number of issues that we have had
over the years -- that we do that.
          We also recognize in various ways quality in
broadcasting.  We had, at our convention last week, in the
radio industry, we give out Crystal Awards to some of the
best public service programming in radio across the
country.  We have service to children awards that we do
every year, in October, in Washington, where we recognize
the best local children's programming across the country. 
There are any of the number of things that we do to
encourage and recognize stations.

                                                        49
          But in the sense of do you say, you should do
this minimum, no.  We take the same view that the FCC has
taken, that that is a matter for stations to decide in
response to what they perceive to be their community
interest.
          MR. MOONVES:  Thank you.
          MR. LACAMERA:  Frank.
          MR. CRUZ:  Bill, a couple of questions.  Did
your survey or your study take into account public
broadcasting at all?
          MR. MCINTURFF:  No.  These are commercial
broadcasts.
          MR. CRUZ:  Just strictly commercial.
          MR. MCINTURFF:  Commercial broadcast.
          MR. CRUZ:  The second question pertains to news. 
In your assessment of that particular area as one of the
issues that led to your calculation of figures pertaining
to areas of important topics to be covered, in calculating
the numerical dollar figure for local news programming,
did you balance or factor into that equation the amount of
money that was being spent by advertisers during that hour
program?
          MR. MCINTURFF:  No, sir.
          MR. CRUZ:  So, in other words, you reached a
total figure on the value of local news without balancing

                                                        50
that?
          MR. MCINTURFF:  No, sir.  There is no
calculation about the dollar value of local news.  That
was exempted.  That is not counted as part of this number.
          MR. CRUZ:  Oh, I see.  Okay.
          MR. MCINTURFF:  So there is nothing in this
number that has anything to do with local news programming
provided by local radio and TV broadcasters.  This is
simply three things:  the value of PSA's, the amount of
money that is raised by charities that are leveraged and
supported by these stations, and the value of the free
political debate candidate time.  That is it.
          MR. CRUZ:  Okay.
          MR. MCINTURFF:  There is no dollar value here
about trying to calculate the value of local news.
          MR. CRUZ:  Yet one of your graphs here shows
that TV news segments include X amount of topics of
issues, about AIDS and so forth.
          MR. MCINTURFF:  Yes, sir.  What I was saying is
we document it or we talked about, did you cover these
stories, but we did not try to provide an economic value
to that news coverage.
          MR. CRUZ:  Okay.
          MR. MCINTURFF:  And, again, this gets back to my
earlier conversation with Les, I just felt that I did not

                                                        51
want to open the terrain of how do you try to provide a
dollar value for news coverage, and then open up the
discussion about responsibilities of broadcasters to
provide it.
          That is why I did not do kids programming. 
There is a requirement for 3 hours a week of kids
programming.  I did not want to, as a researcher, say we
should calculate a value about something that is a
requirement.  I just felt that we would be opening
yourself very legitimate philosophical discussions about
how can you provide a dollar value?  And so you would get
credit for doing a dollar value for something you are
required to do.
          So, for news programming, for public affairs
segments that were not connected with candidates, for kids
programming, for preempting your shows to go to weather
emergencies, all of that stuff there is zero here in terms
of this calculation in terms of trying to provide a dollar
value.  Of course, again, I think the NAB would argue that
those are all things that contribute to community service. 
But, again, I did not want to get into the legitimate
discussion about trying to put a dollar figure on those
activities.
          MR. CRUZ:  Okay.
          MR. LACAMERA:  Shelby.

                                                        52
          MS. SCOTT:  Does your measurement show in any
way, in especially the larger markets, which candidates
were getting the air time?  Did it get down to, like, the
city councils, the school committees, the school boards,
the town councils?  Or was it basically statewide and
Federal elections?
          MR. MCINTURFF:  I do not have that in the
quantified data.  We have some feel for that in the 500
qualitative interviews we do.  And I would say, in the
major urban markets, based on what they described as
activities, it would indicate that we are talking about
statewide and Federal.  I think Federal does include
congressional level.  For the major markets, in terms of
the interviews we did on the qualitative side, there is
little to indicate in the major urban market that below
the Federal or statewide level there is substantial
candidate time being offered.
          MS. SCOTT:  In other words, if you want to call
them lesser offices -- some think the school boards and
the city councils are very important offices -- do not get
the air time and sometimes cannot even buy the air time.
          MR. MCINTURFF:  Well, again, we are talking --
again, remember, let's talk about -- when you say get the
air time, let's be very specific.  We are talking about
free time being offered for candidate debate and forums. 

                                                        53
And so in that limited, restricted question, again -- and
this is just a qualitative feel -- times those urban
markets, I would say that, again, the focus is primarily
on Federal and statewide races.  It does not mean in every
market.  It does not mean every situation.  But I think
that is a characterization that would be supported by
those qualitative interviews.
          Bob Kobeck, I know you are very familiar with
the qualitative database.  Are you uncomfortable with what
I have said?
          MR. KOBECK:  No.
          MR. MCINTURFF:  Okay.  Thank you.
          MS. SCOTT:  Besides the so-called popular
problems of the year or the month, like AIDS or drunk
driving or homelessness, is there any measurement of
really local community issues being covered, other than
what is the popular thing of the month this year?
          MR. MCINTURFF:  Yes.  And I do not want to turn
your question.  But, for example, one of the other
research projects that I have been involved in was the
safety campaign to make sure children are not put into
safety seats in front of passenger-side air bags.  And I
can tell you that since August of 1996, there was an
extraordinary shift in behavior in the American public,
where people are placing their children in the back seat

                                                        54
and out of harm's way.
          And there is no question -- so, what I am saying
is, is that a popular item?  Yes.  But it is a popular
item that has saved children's lives.  And the number of
kids that were killed in 1997 is much lower than the kids
that were killed in 1996.
          So, one, I think we should recognize that
popular stuff changes behavior and saves lives.  And that
is one thing I have been involved in, where I can give you
the actual tracking numbers.
          But, number two, to answer your question
specifically, we did open-ended questions, where we asked
the stations, you know, what exactly did you do?  Who did
you sponsor?  And the reason that it is hard, because it
is an open-ended question and we have all these thousands
of things we have to read through, is that when you read
through those, that in addition, as you said, to these
kind of well-known, popular causes, there is clearly
multiple stations, multiple markets all over the country,
where they are doing something that is very indigenous,
very local and something of real concern specifically to
that area.
          I think you see it most in terms of the
post-emergency response, in terms of where they talk about
very local stuff.  And what you also see a lot in the open

                                                        55
ends is, you know, some fires and schools, where they do
something specifically for some bad thing that has
happened to some community kind of icon like that.
          But it is hard for me to give you the exact
number, because it is just open ended.  In the report that
you have, I have tried to summarize, across all these
interviews, like the top 15 or 20 charities that are being
assisted -- as you just look at the frequency of response,
so that you have an idea of the range that is being done
out there.
          MS. SCOTT:  Thank you.
          MR. SUNSTEIN:  This kind of goes a little far
afield from your particular survey, but I would ask both
of you.  Anecdotally it is said that the content of local
news has shifted towards sensationalism and kind of
attention-grabbing materials, and away from hard news.  Do
you have any data on that?
          MR. MCINTURFF:  I do not in the course of this
study.
          MR. SUNSTEIN:  Do you have any data on that? 
That is the question.
          MR. MCINTURFF:  No.  No, I do not.
          MR. GOODMAN:  There are various studies which
have shown various things.  I think one of the
difficulties with this is if we were talking about the

                                                        56
content of local news and the notion that the government
would have any indication as to what would be the content
of local news, that is as about the core of the first
amendment as I think you could possibly come.
          MR. SUNSTEIN:  This is a completely empirical
question.
          MR. GOODMAN:  I understand that.  It is a
difficult thing, also, to measure, and precisely why
people do certain things, what is the most important to
communities, I think you have stations that are responsive
to what people want to see.
          MR. SUNSTEIN:  As I said, this is really an
empirical question.
          MR. GOODMAN:  I do not know.  The answer is
there is only very sketchy data about what the particular
content of local news is.  And it is often subject to a
great deal of methodological problems because of what the
nature of the particular interest of the community might
have been and what else was going on.
          MR. BENTON:  I want to ask two questions on
public service announcements and the other on local and
public affairs programming, but before asking those
questions I think it is terrific that the NAB has done
this, and we can argue with the details, and there are
some arguments to be made here, but the fact that the NAB

                                                        57
is focusing on this is perhaps one of the results and
outcomes of the fact that we have this commission, because
this is not usually the kind of thing NAB is focusing on,
and it is wonderful that the survey was done.
          Going back to, I think, a point that Jim made at
an earlier meeting, the NAB code was thrown out by the
court at some point, and maybe you can talk about this a
little further, but in addition to Government regulation
the notion of the NAB taking some leadership in self-
regulation and not the lowest common denominator but
carrying on with the best examples of service here, as
opposed to what is the least that can be done to meet
these obligations is a big challenge, I think, for NAB and
hopefully you will continue on this.
          I want to turn now to the public service
announcement and follow up on Richard Masur's question,
because I think of the $1.2 billion in PSA's that have
been shown on local television.  The issue of when they
are shown is crucial, and you said in your answer that
there was -- roughly during each quarter that it is an
even distribution.
          Competitive Media Reporting, which is a firm
that tracks ad spending, was quoted in the Wall Street
Journal in September '87 as saying 80 percent of the PSA's
are shown in what we would call the graveyard shift

                                                        58
between 11:00 p.m. and 7:00 a.m., when it is likely that
not many are watching and it is likely most broadcasters
are not selling the available air time, and so if that's
true, I mean, first is that true, and I would be
interested in having your comment on the competitive media
reporting in comparison to what you said.
          MR. McINTURFF:  What I'm suggesting is the
respondents to this survey gave different figures as to
when they said the PSA's were running, and they did not at
all indicate, they did not come anywhere close to
replicating 80 percent were between 11:00 p.m. to 6:00
a.m., so in terms of what is reported by the people who
fill out the surveys, there's no number that would be even
close to that number in what they reported in terms of
their PSA activities.
          MR. BENTON:  There's obviously a gap in the
competitive media reporting statistic and this.
          The other point about the PSA's is the value,
and you've pegged a value on the average cost of local TV
and radio commercials, but really aren't you losing on
sold time in most cases, and what is the value of a spot
that is really an unsold spot?
          I mean, is it reasonable to put this as a value,
and do you use, for example, spots on the Super Bowl and
Seinfeld to make these averages?  How do you determine

                                                        59
these averages?
          MR. McINTURFF:  The average was determined by
the run of station rate from the rate card of the station. 
We asked them to provide us, what is the run of station
rate, and that way we treated this PSA advertising the way
a commercial advertiser would run station advertising. 
That's a rate that is provided.
          I can again say late in a campaign we have
bought spots, and we get lower -- not at Federal, but we
have bought spots, we've run a station just to kind of up-
volume, and as I said, I did that rate because it is
available, it is replicable, it is something that if
anybody else in this room conducted the survey they could
get, and the numbers would come close to matching ours
because we would have the same number.
          And again, whether or not that leads to
discussion about whether that's the fairest rate to
measure, that is, I am sure, a legitimate discussion.
          As a researcher I am comfortable your objective
is to get a number that others can replicate because
you've used a number that is publicly available and that
can be replicated, and that's the reason I asked for and
used the run of station rate for a 30-second spot.
          MR. MOONVES:  Charles, you won't see a PSA
during the Super Bowl, you can be sure of that.

                                                        60
          MR. BENTON:  No. It's just how do we determine
the averages?
          MR. DUHAMEL:  Well, if I can interrupt you, I
just took the figures that were presented there and
divided it out with my calculator, and it is 986,000.9 per
station, and 137 spots a week.  You multiply 137 by 52 and
divide it out, and the average spot rate is $136.  Nation-
wide that is a cheap rate.
          Now, for us, our OS rate is cheaper than $136,
but we're in a small market, so I mean, they're not using
$5,000 a spot. If you take the figures and divide it out
it's $136 a spot, which I think is a very reasonable
national rate.
          MR. McINTURFF:  It's $136 for TV and $36 for
radio.  Again, and this get's to Norm's question in terms
of the range, because we have, being as we should, we have
hundreds of people who are serving markets of less than
25,000 people, and very small markets, and when you do an
aggregate mean, you knock down the New Yorks, L.A.'s of
the world times hundreds of stations in these small
markets.
          And so Bill's math is exactly right.  That's
exactly what we reported, which is that their average run
of station TV spot we used $136 and $63 for radio.
          MR. BENTON:  The point is -- and we need to move

                                                        61
on.  I have one more question I want to ask about local
programming.  The point is that if the majority of the
time is in this fringe period of late night and early
morning, very early morning time, then the value of that
time, especially if it's unsold, I think those numbers
need to be questioned, that's all.  That's the point. 
That's the basic point.
          Now, on public affairs, local public affairs
programming, just one point.  There have been several
recent surveys about local news, of which in a half-hour
of local news crime represents 29 percent of the coverage,
sports and weather, commercials, and when you're finished
with crime, sports, weather and commercials you've got a
very small amount for everything else.
          And I think that is -- I think there are
different studies that have been done by various
foundations, and it would be very interesting for the NAB
to have a look at those studies and see if, in
collaboration with the news association, one could really
take a look at local news and the dissatisfaction many
people have with it, because the body count, we need more
of a body count anyway.  Just a comment.
          And I want to get back to my point about public
affairs programming.  We did, the Benton Foundation did a
study which was sent out to the committee, a 2-week survey

                                                        62
of five U.S. markets, Chicago, Phoenix, Nashville,
Spokane, and Bangor to try to quantify the amount of local
public affairs programming, the survey also covering
commercial stations that do any local news.
          This was done in the last week of February and
the first week of March.  There were 40 stations in these
five markets.  The survey was of the online and hard copy
programming guides with follow-up interviews with the
television station staff, so we went in-depth in these
stations, and the reason why was because providing
programming that is responsive to important local issues
is not the only public obligation, but it's the central
obligation.
          Okay, findings in the five markets combined, 40
commercial broadcasters provide 13,250 total hours of
programming, and just .35 percent, or one-third of 1
percent, 46.5 hours was devoted to local public affairs in
that 2-week period.
          In the three markets, Nashville, Tennessee,
Spokane, Washington, and Bangor, Maine not one commercial
station aired any local public affairs programming during
this period.  35 percent of the stations surveyed provide
no local news at all, and 25 percent offer neither local
nor public affairs programming or local news.
          A total of 2 hours of local public affairs

                                                        63
programming was available between 6:00 p.m. and midnight,
and that means of the 46.5 hours, 44.5 hours was in other
times, and so it comes back to the point about when either
programming or PSA's are shown as being the heart of the
matter, because that's when the audience is there, and so
just two stations aired any local public affairs time
during the prime time period.
          So this study has been distributed to the
committee, and in line with your survey I would love to
have your comments and reactions to that.
          MR. McINTURFF:  I have a few comments.  One I
think, what the NAB survey does is to broaden the
definition as it should, I think, for what is meant by
community service to look at the wide range of activities
where local broadcasters are contributing like PSA's and
like the charitable and other kinds of on-air sponsorship
that they provide.
          Number 2, this committee I think has fairly and
appropriately raised questions about what we did and its
methodology and I think those are the exact questions that
should have been expected, and we're happy to try to deal
with them.  No research is perfect.
          I would make an argument that the 1-year time
window is better than 2-week time window.  A 1-year time
window also gets around being in the late sweeps.

                                                        64
          And the other thing we're hearing from stations
in these qualitative interviews is, what we heard from
these stations was, look, our public affairs programming
is shifting from half-hour blocks, because what's changed
in the last 10 or 20 years is the news programming is not
a half-hour a day or twice a day.  It's like these
enormous blocks from 4:00 to 6:00 and 5:00 to 7:00, and
within those blocks they're offering 3 to 7-minute clips
that sponsor a charity, sponsor a local group, et cetera,
et cetera.
          And their argument in these qualitative
interviews we did is that the response rates, the
viewership, and the reaction to those kinds of public
affairs segments are stronger, more watched, and more
reactive, more responsive than blocking out large chunks
of time that may or may not be seen, which is why, and
again when you ask for a response on our survey, again and
again, what's the point.
          I did not provide a dollar measurement for this
activity because I did not want to engage in the debate
about how you value this activity, but that's why in the
data you've been given we tracked in the last year did you
run a public affairs segment at least once on each of
these topics, and although, again, they might be the
popular cause of the day, the response rates are very,

                                                        65
very high across stations, where within that time window
of very long period of time of a month -- I'm sorry, a
year -- that we have three-quarters or more of these
stations doing public affairs segments about AIDS,
homelessness, hunger, and lots of these issues.
          And the other thing I would say lastly is, given
the constraints of the methodology that was used for that
study, and given the constraints of that study, I don't
see them as necessarily in opposition.  Both things can
stand as independent, true facts, and so in the same way
that I would say that about the study you've described, I
would say that about the work that we have completed.
          MR. GOODMAN:  Can I add a few things to what
Charles has raised?  Just one piece of information,
Charles, I know you referred to the NAB code, and it has
perhaps not been mentioned here very much that the NAB in
fact has a statement of programming principles which has
been in effect for some years, which is, in fact, more
detailed about programming issues than the NAB code ever
was.
          What it does not have is the advertising
restrictions that were the focus of the Antitrust
Division's effort to eliminate the code, so to the extent
that the code dealt with programming there is an NAB
statement which does the same, and perhaps in more detail.

                                                        66
          MR. ORNSTEIN:  Jack, can you get us all a copy
of that?
          MR. GOODMAN:  I'd be happy to.  I don't know if
I can do it today, but will be happy to supply that to the
committee.
          The second thing is, there are a couple of other
measures of PSA's that I think are interesting.  Last
week, the Ad Council, which represents national PSA
campaigns, announced its figures for last year, which
showed enormous increases for radio and TV stations in
terms of the time donated and that is time that the Ad
Council values.
          Charles, with respect to the unsold spots,
that's an interesting question.  I don't know.  There's no
way to tell exactly what each spot is.
          But I do know many stations, particularly radio
stations, have deals for unsold spots, so there is no such
thing as an actual unsold spot because they have
advertisers of last resort.
          And so to the extent that they are using these
ads instead of giving them -- or, for example, barter
deals are frequently for unsold spots, where the station
trades merchandise for time.
          To the extent that they're using these for
public service announcements, they are therefore taking

                                                        67
ads which they could have used for other purposes and are
not, so I'm not sure that there is any particular value,
and any way to really value how much of something is sold.
          With respect to a couple of points you made
about the survey that you and the Media Access Project
did, one thing I think it has, it proceeds under an
assumption that I think is not altogether clear, which is
that local public affairs programming is the central part
of the public interest obligation.
          If that was ever true, and the commission never,
ever required that, that is certainly has been untrue for
the last 20 years.  The commission has said, and it has
said over and over again, that they think it is better for
stations to decide how they should serve the interests of
their community, that one size does not fit all, and that
where you have seven, eight, nine stations in a community
you have different answers, and different solutions.
          For example, some of the stations that you found
did not do news are Christian stations.  They have a
religious mission, they do not have a news mission, and
the commission has found that to be in the public interest
to have religious stations.
          Some of them are stations that are frankly
struggling.  They are stations that are affiliates of the
UPN, WB.  Often they are start-up stations.  They are

                                                        68
stations that, frankly, until a few years ago were losing
money and perhaps moribund.
          You will find -- and I don't know if any of the
markets you surveyed are among these -- that there are
markets where those stations are now doing substantial
amounts of local news and public affairs programming.
          They are typically markets where they are now
operating under a local marketing agreement, where they
are operating in conjunction with another station in that
market, which some believe reduces diversity but we
believe the evidence -- and this is evidence that the FCC
has collected -- on these stations are a way of providing
far more local programming and local news programming.
          For example, in Cleveland there are two stations
in a local marketing agreement.  Collectively they had
1 hour of news per day, and very poor quality news, before
the local marketing agreement.  Collectively they now do
6 hours of news per day, and one of the two stations
focuses on longer segment public affairs type news
programming, and the other is more hard-hitting.
          So there are a number of things.
          The other statistic that I think again,
relatively hard evidence that you can look at, is the
expenditure by stations on news.  The NAB and the
Broadcast Cable Financial Management Association does an

                                                        69
annual survey on news expense, on the financial
performance of television stations, and one of the
questions is how much is spent on news programming, and
the numbers are quite high.
          They have gone up dramatically every year in
this decade, from 1990 to 1996, in all categories of
stations, and they have gone up larger than station
revenues have gone up, so the proportion of station
expenses devoted to news continues to increase.
          But what is most interesting is that the largest
increases are not in the network, the affiliates of the
three traditional networks.  They are in Fox stations,
which have -- their expenditures on average from 1990 to
1996 more than tripled, and for independent stations not
affiliated with any of the four networks, their
expenditures in this category more than doubled in that 7-
year period.
          So I think you are seeing -- and again, it's
what the FCC thought when it decided to deregulate, that
stations would respond to marketplace needs and would
provide programming that was responsive to their
communities.
          MR. BENTON:  Just one quick comment in the
three, because we picked these markets to represent large,
medium and small, and in the three medium and small

                                                        70
markets, to find that not one commercial station aired any
local public affairs programming was to us quite a
surprise.
          I think Gigi, maybe you could comment on the
centrality of -- 
          MR. DUHAMEL:  Can I respond for the small market
stations?  You raised this point.  Let me tell you about
our station.  When I came to the station in 1967, we
started about 1968 a new public affairs program.  Through
the years, the ratings dropped to about a 2 rating.  This
is the noon hour.
          And so about 8 or 9 years ago we turned that
over to the news department.  We start with local news. 
We follow up with local weather, and the ag markets, and
then embedded in that there are two 5-minute interviews
that the news department runs, and these would be
classified as public affairs.  I am positive if you
surveyed the Rapid City market you would say that is not a
public affairs program.
          Now, the ratings now are between a 9 and a 10,
so now we do not have a half-hour, but I'm telling you we
have a significantly higher viewing and significantly
bigger impact with two 5-minute interviews Monday through
Friday than we do running a half-hour that nobody watches,
and we're in a competitive environment, and the people

                                                        71
have alternatives.  There is 50 channels on the cable
system and we just can't sit there and say goodbye,
because they don't come back.
          And so we have not figured out, how do we
improvise, and I think -- I can't believe in these major
markets, when you say there's no public affairs -- there
may be no 30-minute public affairs programs, or when you
talk about national with no public affairs, if I'd had
enough time, and say -- I got this on Good Friday and then
I had to fly out to get here, but if I had enough time I
would have called those Nashville stations and asked them. 
I can't believe that they do no public affairs in
Nashville, or in Bangor, or in Spokane.
          And that is what I am talking about by local,
because we're doing local.  I mean, we carry national from
the networks.
          MR. MOONVES:  Charles, can I ask a couple of
questions from the survey that you're quoting, and a) this
was done between the Benton Foundation and Gigi.  Okay. 
When was this done?
          MR. BENTON:  Instead of my trying to answer
this, the author is here right in the room.  Kevin, can
you answer these questions specifically?
          MR. MOONVES:  When was this survey done?
          MR. TAGLANG:  It was done during the last week

                                                        72
in February and the first week in March.
          MR. MOONVES:  So it was done after the formation
of our group, and the two of your organizations, why was
it decided, a) these markets, and why was it limited to
2 weeks right at the end of a sweeps period that, by the
way, was a rather unusual sweeps because of the Olympic
Games?  Was there any rationale to this?
          MR. TAGLANG:  The rationale for choosing this
time was, we wanted it to be as normal a 2 weeks as we
could find, so we wanted to find nonsweeps after the
Olympics and before the college basketball tournament.
          MR. LACAMERA:  10 of your 14 days was in the
February sweep.
          MR. MOONVES:  So that is an unusual -- and once
again, because of the Olympic Games.
          MR. GOODMAN:  From what we understand -- we've
just begun to look at some of these figures as well.  We
understand that actually less -- for your station in
Chicago, as I understand it, there is some more regularly
scheduled local public affairs, but because of sports
preemptions and the requirement of running 3 hours of
children's programming, that the local public affairs
program was preempted so that they could meet their 3-
hour children's obligation because of the timing of some
sports events.

                                                        73
          MS. CHARREN:  Terrific.  You can always blame it
on the kids.
          (Laughter.)
          MR. GOODMAN:  I'm not blaming, just merely
saying this 2-week period is perhaps not usual, and that
is the difficulty with looking at any particular short
span, as opposed to a much longer period.
          MR. MOONVES:  Also, once again, why were these
specific five markets chosen?
          MR. TAGLANG:  The markets were chosen somewhat
at random, but we wanted to diversify by market size as
well as geographically.
          MR. MOONVES:  Charles, when you mentioned that
6:00 to 11:00, are you aware that 8:00 to 11:00 is prime
time, and that there never in the history of media have
been public affairs programming during this time?
          MR. TAGLANG:  May I answer that?  What he was
quoting was sort of an extended prime time as well.  It
was from about 6:00 p.m. until about 11:00 p.m., so it
includes supposedly local access time.
          MR. MOONVES:  And was the purpose of this study,
since it was done by two members within this group, with
their organizations, specifically to help us in our
deliberations, or was there another purpose to it?
          MR. BENTON:  No.  The specific purpose was to

                                                        74
try to get the facts about local public affairs
broadcasting.
          MR. MOONVES:  In 2 weeks, in a 2-week period of
time?
          MR. BENTON:  In a 2-week period of time.  We're
not the NAB.  We don't have the resources.  We were
pressed to do this, and we did the best we could, on a
sampling basis, to get the facts, tell the truth, do the
survey, look at the record, call the stations -- I mean,
it's a big job, and so we're trying to get the facts.
          MR. MOONVES:  I must make a comment.  I find it
odd that two members of this organization decided to do a
splinter survey on their own to address what we're trying
to do here, but that's your prerogative.
          MR. TAGLANG:  I would like to say Charles Benton
is the chairman of my board, but he's not involved in the
day-to-day decisions.  The decision to do this rests with
my boss, the director of my project.  It didn't rest with
Charles, so I don't think we have to defend what we're
doing here, Les.  My organization paid with a big
$250,000, and Charles, which is not much more, does not
have the resources to undertake it.
          I do think, for the purpose of debate, it was
worthwhile to have something other than what we have seen
today, and I would like to talk about my concerns with the

                                                        75
survey at some point.
          MR. MOONVES:  I'm sure you will.
          MS. SOHN:  Frankly, I'm offended by your
attacking us.  Yes, we want to see more local programming,
and that's what this is all about, and frankly, that is
where I think your survey fails, so I think we should -- 
          MR. MOONVES:  This isn't my survey.
          MS. SOHN:  Well, it's the NAB survey, and
they're NAB members on here, so if you want to call our
survey a splinter survey, the NAB survey a splinter
survey, I have no problem with their survey.  I think it
is great what broadcasters are doing.  I think it is
completely legitimate that they did it.
          But it is equally legitimate that some other
public interest members of this advisory committee did it
as well, and if you want to attack the methodology, if you
want to attack the numbers, that's fine, but don't attack
our organizations.  That I really have to take personal
offense to.
          MR. MOONVES:  Noted.
          MR. LACAMERA:  Jim.
          MR. GOODMON:  Just a couple of comments.  I
really think it's great that the NAB is doing this, and I
hope, Jack and Chuck, that we continue to do this, and as
we continue to do this, we can work on methodology and

                                                        76
what we're asking, and refine what it is we want, and I
think this will get better, and I think this reminded us
of some things we ought to be thinking about.
          I mean, this is a good project, and my own
notion is really that this understates to some extent what
broadcasters are doing, but this does not get to the point
that I think we're always going to have that some don't --
I wouldn't say don't do anything, but don't, and I will
remind you that I handed out, passed out at the first
meeting the NAB code.
          And I really like that notion of broadcasters
getting together and coming up with some minimum standards
in all of these areas that we can all talk about, no
matter who is doing the survey, and come to some
reasonable conclusion as to what our minimum standard
should be.
          I would hope that we can get -- I know the
Congress has said that they are interested in legislation
that would allow broadcasters to do that, since the court
threw it out before, and I'm just pitching for that again.
          Now, I also have no problem with the notion of
minimum standards in the regulations. I think that makes a
lot of sense.  I can't talk anybody else into that, but it
seems to me that we should establish a minimum level of
expectations for all broadcasters in terms of how they

                                                        77
serve the community and how they operate in their
communities, and that that notion make sense.
          And I appreciate the NAB's survey. I appreciate
everybody's survey.  We need all the information we can to
get through this.  I just want to pitch for the NAB code
and minimum standards.
          MR. LACAMERA:  Lois.
          MS. WHITE:  Mr. McInturff, you've already
answered my question but I'm going to ask it anyway.  It
has to do with children's programming, and I notice that
in one of your PSA's you did identify, I think it was
tobacco, Voices Against Tobacco, where that was a
children's focus, and your two columns, one with the
measurables and one with the nonmeasurables you had, and I
assume -- it said Kids Vid.  I just assume that was
cartoons, but that was your identification for children's
programming.
          MR. McINTURFF:  We ran out of space on this
slide.  That's a 3-hour requirement for children's
programming, and again, when this was being discussed,
what would you place a dollar value on, I said again, I
just felt very strongly you should not place a dollar
value on something that is a requirement for you to do, so
that it should not be measured in the scope of this survey
as an activity.

                                                        78
          MS. WHITE:  Just in your opinion, had you asked
that, do you think the stations would have given you an
honest answer, or would they have lied and said yes, we've
got it?
          MR. McINTURFF:  Whenever you do a mail
questionnaire you're dependent upon the respondents to
provide good faith estimates and you have to operate on
the assumption of good faith.
          I can answer Norm's question -- it's the same
kind of question, which is what's the range of activity. 
We had one station in L.A. tell us they did $9.8 million
in terms of charitable fundraising.  We had another
station, I forget the small rural station, who said --
they put down $3.  That's a range.
          But the only thing I did do -- 
          MR. ORNSTEIN:  Was that all at one place, or
spread around?
          (Laughter.)
          MR. McINTURFF:  But the other thing we do in
terms of means and medians, you will also note that on the
four networks we provided a median figure of PSA's,
because the range within the four networks was so large,
the difference between the bottom network and the top
network was so large that again I thought an average
figure would have misstated and misrepresented volume.

                                                        79
          So in this instance on children's programming,
again by design, what you've heard, I made a very strong
pitch that it not be counted towards this figure, and if
it's not counted, since the survey was designed to try to
collect data that we could measure, I had very little
information, and I can't really provide much counsel or
guidance, as I cannot really about the news, because the
news was not counted towards these figures.
          So other than reporting what they did for
segments, the scope of this survey does not cover the
kinds of questions and concerns being raised about local
news.
          MS. WHITE:  Just as a suggestion, you might want
to change Kid Vid.
          MR. McINTURFF:  I apologize, and would be happy
to do that.
          MR. LACAMERA:  Let me just conclude with our
roundtable here.  Gigi.
          MS. SOHN:  I want to just express some concerns,
and then I do have a question.
          This advisory committee has talked a lot about
the lack of community programming and local programming,
programming that promotes democracy and self-governance,
and before today I read your entire 30-page paper, and the
pages and pages are devoted to PSA's and local weather

                                                        80
disasters.  You've got about three paragraphs on community
programming, and programming that meets the needs of local
communities.
          Not ER, not Seinfeld discussing AIDS.  Stuff
like race relations, civic governance, taxes, local
education, and frankly, regardless of whatever weeks, and
there's a dispute of whether these are sweeps weeks or 
not, the Benton survey proposes there is still 25, one-
quarter of the stations in those markets that did nothing,
zero, zilch.
          Here are my concerns with your survey.  First of
all, the vast majority of your values are public service
announcements, which are rarely devoted to any discussion
of issues.  They are usually feel-good issues.  Everybody
wants to stop AIDS.  Everybody wants to buckle up.  But
actually discussing local issues, they don't do that.
          Second, the other major part of your evaluation
are charitable contributions and efforts that any good
corporate citizen would undertake.
          My third concern, frankly, is who did not
respond, and I have the same concern with Bob's survey. 
Bob did not survey the religious stations, the home
shopping stations.
          What I want to know is what differentiates good
broadcasters from bad broadcasters, and I think your

                                                        81
answer was very self-serving.
          My fourth concern is the valuation.  Charles
asked you two times and you did not answer.  How much of
the PSA time is unsold time?  To me the valuation of that
time is zero, and so I think your valuation is a little
bit overstated.
          With that, let me ask my question.  I have the
council's numbers in front of me, and according to the Ad
Council, broadcast TV spent -- donated $129.6 million to
the Ad Council for PSA's, and so my question is this. 
Well, I have two questions.
          First is, where is the rest of the money coming
from, and you obviously have a much bigger number, so
where is the rest of the money going to, if not to the Ad
Council for PSA's, and the second is, don't you find it
curious that according to the Ad Council cable television
is donating over $50 million more in PSA's than the
broadcast industry is?
          MR. McINTURFF:  I can't deal with the cable
issue.  That's again beyond the scope of this research,
and so you're asking specifically for the four networks,
where we provided a figure of $148 million versus the 129
for the Ad Council?
          MS. SOHN:  Well, you surveyed all the stations,
right?  You surveyed all the stations, and I'm assuming --

                                                        82
and you can correct me.  Maybe I'm not reading this
correctly, that the Ad Council money is just network
money, or is it all stations?  It's all stations, right? 
So there's a huge discrepancy.  Where's the rest of the
money?
          MR. GOODMAN:  The rest of the money is -- there
are any number of PSA's that are not the Ad Council's. 
The Ad Council's a particular group of national PSA's.
          For example, that does not include spots for the
Partnership for a Drug-Free America.  That does not
include any local PSA's.  It does not include the value of
the PSA's that we distribute, and as I said, we do that
every month for stations.
          Any number of -- my understanding is, it
accounts for only about 10 percent of PSA's Nation-wide,
but what you've seen is there the figure goes up.
          The other thing I will tell you from looking at
Ad Council figures, particularly for the networks, over
several years they tend to go up and down for a variety of
reasons, depending on what the Ad Council is doing,
depending on what the networks are doing.  There's a
variation.
          But there are a huge number of PSA's that are
not from the Ad Council, but what is indicative is the Ad
Council has once again had a very strong year, after a

                                                        83
series of strong years of increased donations from
broadcasters to their campaigns.
          MS. CHARREN:  Somebody said, and I can't
remember who, a little while ago that there was never a
ruling about the value of local programming in terms of
the obligation of stations.
          I seem to remember that there was once a ruling
in 1960-ish that 5 percent of the programming had to be
news and public affairs and 5 percent had to be local, and
it could be the same 5 percent, but local programming was
definitely one of the mandates relating to broadcaster
responsibility, and so it set up the local programming as
an essential part of public service.
          Now, maybe that is not in effect today, with all
of the deregulation, but to say that it was never a part
of what broadcasters had to think about I think is wrong.
          MR. GOODMAN:  Peggy, I'm not sure what you're
referring to.  If it's the 1960 policy statement, that was
actually never adopted by the commission, but to the
extent the commission used to have quantitative
guidelines -- and it did, and there were a whole variety
of things.
          I mean, the 1960 policy statement would require,
for example, a station in New York City to do agricultural
programming.  There was a one-size-fits-all, and the

                                                        84
commission, beginning in the late seventies, decided that
with the growth of the number of stations, of TV and radio
stations, that that was not an effective regulatory model,
and it was better to have stations doing a variety of
things.
          MS. CHARREN:  I'm not saying it exists now, but
I'm saying local was an idea that was perceived to be a
separate kind of public service.
          MR. GOODMAN:  But one of the things about the
survey that the Benton Foundation did is that it only
looks at one aspect of local, and that's one of the things
I think that Bill Duhamel said, is they do local public
affairs as part of a local news block.
          And, as we've seen, we've gone from local half-
hour news to often, I think some of the stations in the
Belo report indicated 8 hours day of news programming,
which includes long segments that are public affairs that
are not tracked if you simply look at a TV schedule but
are, in fact, local, and are in fact public affairs, and
are, in fact, dealing with local issues.
          So to simply say there is not a regularly
scheduled half-hour program in a particular period does
not begin to tell you whether or not those issues were
addressed.
          MS. CHARREN:  That's not what I'm talking about

                                                        85
at all, the debate whether that study is any good.  I'm
just saying the statement that local was never a part of
an idea of public service is not right.
          MR. GOODMAN:  We don't disagree, but locally
produced public affairs programming, the very narrow
definition that has been asserted here today has never
been the central mandate, as it has been suggested.
          The commission has always said there are a
variety of ways in which stations can meet their
obligations, and it has certainly said so for the last 25
years and, indeed, I think the figures on news programming
show that the commission's expectations that stations
would do this on their own and a variety of ways of
service would come about have proven true.
          MR. BENTON:  The community ascertainment of
procedures, which were relatively recently abolished, were
all focused on the ascertainment of local community means
in relation to serving the public interest needs and
necessity so that local was the center point.  That is why
we made the comments, because the whole ascertainment
procedure was around meeting local needs, not national
needs but local needs, and that is broadcasting's unique
power.
          It is the license to serve the needs of the
local community, unlike cable television, unlike national

                                                        86
television, unlike many of your competitors, and so really
in a positive sense this is one of the great potentials
for additional service and leadership at the local level
that NAB as an organization through its members can do
much more about, perhaps.
          MR. GOODMAN:  I think one of the things you see
in this survey, for example, in the fund-raising, one of
the things that I think comes out most clearly out of the
survey is the fact of how central local broadcasters are
to their communities.
          When things happen in a community, people turn
to their local broadcasters.  If there's a disaster, they
turn to the local broadcaster, if they go -- if there is a
race for the cure, if there's a health fair, if there's
something else, and what broadcasters have done, as Bill
said, is leveraged the value of their spectrum for their
communities.
          And so the fund-raising is very central.  It is
not merely good works.  It is using the license in a way
to benefit the community through those good works, so it
is not simply just general.  It is the particular use of
the airways and their central role, the way the people
feel about broadcasters in their community, to improve the
community.
          MS. SOHN:  Local programming was -- programming

                                                        87
specifically targeted to serve their community needs was
indeed the hallmark of receiving a broadcast license, and
Jack is right, in the deregulation in the eighties the FCC
had said that you don't have to do it through programming,
and I think that's one thing -- I mean, you can do it
through PSA's, and you can do it through charity fund-
raising drives.
          I think one of the things this advisory
committee needs to think about is, is that something -- is
that something we want to keep?  I mean, is that good
enough?  I think that's what we're saying.  It is not
worth it to argue whether it is or it was or it could be
or it should be.  The point is, we have an opportunity. 
Do we want to do something about it.
          MR. LACAMERA:  Frank, you've been waiting.
          MR. BLYTHE:  I just wanted to follow up on
several things.  First of all, I just wonder what the
distribution of this report is that the NAB plans, and for
whatever other purposes they were going to use the report,
and secondly, the survey seems to indicate that stations
do keep a vast amount of records of their local public
affairs and PSA programming.
          I was wondering if you found those files readily
available to respond to this survey, and if those files
are in public access files, or public files available to

                                                        88
the public, for instance, for your license renewal
purposes and review, and that kind of thing.
          MR. McINTURFF:  Let me answer the second
question, because it's a short answer, which is I don't
know, because the survey -- because we mailed it.  They
filled it out and they returned it, so I don't know the
data base they used for the information, and again, maybe
Jack knows.
          I'm not prepared enough to know what is and is
not public record, so I just don't have enough information
to answer that question in terms of how people use it.
          Again, my portion of the job will be done, and
is done, and I will have to let NAB answer the question of
how it's going to be distributed or used.
          MR. GOODMAN:  To answer those two questions,
obviously the report has been released to the public.  It
was announced at our convention last week.  It is
available.  It is apparently available now on the Internet
so people can look at it.
          As Bill mentioned, we also prepared individual
State surveys, and State associations have them, and Bill,
did we do them for every State?
          MR. McINTURFF:  Almost every State.
          MR. GOODMAN:  The State associations have them,
and many of them have communicated to Congressmen and

                                                        89
other governmental leaders in their States showing what
the broadcasters in their State are doing to serve their
communities, so I think it will be used for a variety of
purposes.
          In terms of what's available, although the FCC
no longer requires a detailed program log as it did when
it did have quantitative guidelines, most stations do keep
logs of some sort because they need it to ensure -- for
example, to prove to advertisers that they have met their
commitments to them, and for a variety of other station
management purposes.  Those are typically not in the
public file.
          There is, of course, as the materials that Gigi
has prepared indicated, a requirement of the quarterly
issues programs list that every station puts in their
public file, and there is a variety of other information
in the public file.
          MR. LACAMERA:  Karen.
          MS. STRAUSS:  My question goes to Charles.  Your
survey indicates approximately 66 percent of the stations
have consulted with community leaders.
          First of all, how do you define consultation,
and what has the consultation done in the course of
ordinary news programs?
          Secondly, who initiated the consultations?  Were

                                                        90
they initiated by the stations, or were they initiated by
the community leaders?
          And third of all, how do you define community
leader?
          MR. McINTURFF:  I look at all of these
questions' wording again, which I have provided, but all
of that would fall into the category of goodwill and kind
of common usage and meaning, and so I have no information
at all.
          MR. CRUMP:  May I break in at this point and
tell you, in the Twin Cities, presently in every market
I've been in on a quarterly basis we invite the political
leaders, leaders of the charities, the leaders of minority
groups, all that we can find to come and speak with us.
          We have meetings that last -- usually they are
preset so that they last all day long, and what we do is
allow anyone that wants to speak, that wants to make a
presentation and wants to tell us about problems and wants
to look for solutions to come in and meet with us.
          This is then written up, passed around to all
the department heads at the various stations, and in the
Twin Cities all of the stations get together to do this so
that the individuals involved won't have to spend all of
their time going from one station to the other.
          MS. STRAUSS:  Well, Howard, your answer was

                                                        91
exactly what I was hoping to hear but without an answer
like that I don't see how we can give much credence to a
66 percentile, because if stations don't know what they
were responding to, then how do we know they responded to
that which was actually ascertainable?
          MR. McINTURFF:  It's like any other question
asked on a survey.  If you ask people who run TV stations
what's meant by community leader, we have no trouble --
they have no trouble answering, or any station manager in
the country answering what they mean by community leader. 
These are not difficult terms for people in this industry
to fill out in a survey.
          MS. STRAUSS:  But what I'm saying is that,
according to this, or according to what you just told me,
stations may have just decided to put down yes simply
because they may consult community leaders at various
times during the year on various issues related to news
programs.
          What you're saying, Harold, is there's a
specific effort made in the Twin Cities to reach out and
contact and receive the input from these community
leaders.  Those are two very different things, and it
depends upon what the perception was of the survey
respondents in answering this question.
          MR. CRUMP:  Obviously, I don't know the

                                                        92
situation in all the markets, but I can tell you in a very
large number of markets that I am personally familiar with
this is what takes place.  This is the norm.  I don't know
that that is 100 percent.
          MR. SUNSTEIN:  My social science friends would
be very upset with me if I didn't ask the following
question.  I've done some kind of seat-of-the-pants math,
and you have a 42-percent response rate with four public
service announcements, 20-percent refusal rate with
respect to charitable contribution, 42 percent refusal
rate with respect to free air time, 55 percent refusal
rate -- 
          MR. McINTURFF:  These are really not refusals. 
We put them in a summary category, but a lot of them were
not qualified to respond.
          MR. SUNSTEIN:  The only thing that matters is,
you didn't get the numbers, so let me ask the question, if
I can.  The reason for the nonresponse is irrelevant to
what I'm about to say.
          That means if my numbers, and these are rough
numbers, work, is that you have 30 percent of your
population came up with numbers for PSA's, 25 percent of
your people came up with numbers for charity, and 18
percent, only 18 percent for free air time for candidates.
          Now, that is not a criticism of them at all.  It

                                                        93
is just a social science point about your extrapolation. 
Unless my math is wrong, and the likelihood of my math
being wrong is over 50 percent, but unless my math is
wrong you've extrapolated from these extremely tiny
percentages, 18, 25, and 30 respectively, to the full
population.
          That is, you treated the 42 percent as
representative, and of that 42 percent you treated these
small subgroups as representative.
          Now, that would -- in the social sciences I
think would, say, raise questions of bias, not personal
bias, just statistical bias, because if you get the 42
percent who responded are likely, off-hand, speculatively,
to be the sort of people who respond to surveys, and that
is going to skew the sample, and they're going to be the
sort of people who respond to this survey, and that would
skew the sample.
          Now, that would be serious enough, but if you're
extrapolating from subpercentages of the people who
respond to this question the people who came up with their
own numbers, that would accentuate the bias.
          So my social science friends would say the data
itself, the raw data is extremely interesting.  The
extrapolation is not that interesting.
          MR. McINTURFF:  Two comments.  One, there's no

                                                        94
question that again this is what happens with mail
questionnaires, which is, people can choose to respond or
not respond to individual questions.
          When we got to the point where we asked the
exact dollar figure of candidate time offered we had some
follow-up, but I will say just in terms of the percents,
you need to understand you're looking at percents of the
total base, out of 4,000, and so if only half -- and
follow me here.  If only half said they did it, that's
roughly 2,000 out of 4,000, and so when you see a number
like 30 percent gave us a number, it's 30 of 50 percent,
so I provided numbers on a total basis.  So in other
words, if half don't do it, they're not here in the
numbers, because I never extrapolated for that.
          And you have to remember something else.  We did
the dollar volume.  We took the stations who did not
respond and said half of them didn't do this, and so the
projections were based -- and let's just take this exact
point in time.  Let me walk you through some rough
numbers.
          Let's say there were 8,000 -- and I'm going to
make this -- let's do this easy.  There's 8,000 people who
got this survey.  4,000 of them responded.  Of the 4,000
that responded, 2,000 said they've offered free time.  Of
those 2,000 that offered free time, 60 percent of those

                                                        95
people gave us a dollar figure for what the time was
worth.
          Now, that 60 percent is 30 percent of the total
base, but it's two-thirds, almost two-thirds of the
eligible qualified people who could answer that question.
          Then we took a dollar figure for those people
who gave us a dollar figure.
          Now, to project it, what did we do for the
projection?  There's 4,000 people that didn't answer the
survey.  The first thing we did was to say, there's 2,000
people that didn't do it, because only half did, so
there's 2,000 left.  Of those 2,000 left, two-thirds of
them provided a dollar figure, so we should do the
projection on those that we know about.
          MR. SUNSTEIN:  That's the problem.  The bias is
what you know about.
          MR. McINTURFF:  No.  I'm trying to tell you we
took the same percent that we know about to project to the
people we don't know about.
          MR. SUNSTEIN:  That's the problem.  It is not a
perfect world.  It's terribly unreliable. If you took 100
people and ask them how much time they spent on charity,
and 42 answered, and half of those people gave you a
number, and then extrapolated to the population of 100, do
you think that would be publishable in a peer review

                                                        96
journal?
          MR. LACAMERA:  It's not 42 percent.  We don't
have an interest in our group here in radio.  It's 63
percent.
          MR. SUNSTEIN:  The same problem.
          MR. LACAMERA:  Less of a problem.
          MR. McINTURFF:  The answer is yes, if you knew,
as we do here, that in a controlled environment that we
are sampling the same people, so it's not 100 people at
random, these are not random people, they all share an
enormous amount of characteristics in common, and so do I
feel comfortable doing this?  Yes. I think it is
defensible.
          And number 2, when you look at percents you have
to remember we're talking about a data base of 4,000
respondents, so when you're looking at how large the
number we use to project from, if 2,000 people did it and
two-thirds provided a number, you're talking about 1,200
respondents, and do I feel comfortable saying 1,200
respondents can give you an accurate figure for the net
value of time, political time?  Yes.
          We're not talking about we took 800 people, or
an 800 national sample and started cutting up.  We're
talking about 1,200 respondents for each of those figures.
          MR. SUNSTEIN:  Let me say I think it's wonderful

                                                        97
you've done this.  You've provided a lot more information
than we had before, so basically it's terrific.
          The extrapolation, if you can get this published
based on the extrapolation in a peer review journal, send
me a note.
          MR. CRUZ:  Did you take into account Spanish
language broadcasting?
          MR. McINTURFF:  Yes.
          MR. CRUZ:  Both of the networks?
          MR. GOODMAN:  No, it was not the networks, just
the stations.  The networks on a national level, it was
only the four major networks that were surveyed.
          MR. CRUZ:  You didn't take Univision or
Telemundo into account?
          MR. GOODMAN:  Except to the extent their
affiliates answered.
          MR. CRUZ:  Do you happen to know how many of
those did?
          MR. GOODMAN:  No.
          MR. CRUZ:  A small number or a big number, a
ball park figure?
          MR. McINTURFF:  We have nothing that would ask
the language of the station, in terms of the language they
broadcast in, so I can't possibly answer the question.
          MR. CRUMP:  As we discuss the amount of money

                                                        98
here, because that is one of the main topics, obviously we
are proud of it as broadcasters, other than saying, gee,
maybe it should have been more, I'm a little confused.
          I would like to ask a specific question, because
I have a reason for doing so, and I will admit that.  Am I
correct in the fact that you said moneys that would be
raised as a result of a story that was in a newscast were
not counted?
          MR. McINTURFF:  If it is a news story.
          MR. CRUMP:  In other words, let's say it's a
news story about a tragedy like a flood, and at the end of
this we say, and if you want to send funds you can do it
to the Salvation Army, here's a Red Cross number today,
here's an association of churches, this is where you can
send money, was that counted in this the way you
specifically asked the question, or not?
          MR. McINTURFF:  The way the question was worded
I can't give you a stable answer across 4,000 respondents
to know whether they did or did not in an individual
station count that, given the way it was asked, which is,
did you help -- and let me find the wording.
          The wording is, if there was a direct appeal on
the air, as the news segment, to send money to this place,
I cannot tell you for sure whether a station would or
would not have counted that, given the question.

                                                        99
          MR. CRUMP:  The reason I asked the question is
because one of the slides that you showed was the
simulcast that was done in the Twin Cities as a result of
the big flood that unfortunately occurred up in Northern
Minnesota and North Dakota, and the $200,000 we were all
quite appalled at, quite honestly, as a small number when
we got through with the simulcast.
          But then we began to realize that all of the
stations in the news stories that we had been running for
weeks had been tagged with where you can send money, and
we started counting up that, and we realized there were
literally hundreds of thousands of dollars that had been
raised previously by churches, by the Red Cross, by the
Salvation Army as a direct result of our appeals that were
tagged into news stories that had nothing to do with the
$200,000.  It was much more than that to begin with. 
That's the reason I ask the question.
          MR. McINTURFF:  Again, using the exact language
of the question there, which is, in the past year did your
station  help charities, charitable causes, or any
individuals by fund-raising, please list how much money
was collected or pledged in your fund-raising efforts over
the past year -- 
          MR. CRUMP:  Well, you see, we didn't keep up
with that.  We went and asked after this.  We had not kept

                                                       100
up with it because it was not what we call a fund-raising
drive like a telethon, or something of that sort.
          MR. McINTURFF:  But it leaves open the question,
I'm just saying, that it's very possible that respondents
could have answered differently using the news segment
example you just gave.
          MR. CRUMP:  Thank you.
          MR. MINOW:  I want to make a big picture
comment, not in regard to the details of the study.  The
fact that the NAB did this study is what's important.  If
this commission prompted it, it's equally important.
          The problem is, and Gigi raised it, is that the
questionnaire went out only to NAB members -- only to NAB
members.  The best broadcasters, the NAB members.  The
worst broadcasters were not even in the study, and they're
not here.
          MR. McINTURFF:  I can respond to that.
          MR. MINOW:  Let me finish.  When I was -- long
before many of you were born we gave great credence in the
FCC to the fact that, if you belong to the NAB, if you
subscribe to the NAB code.  We encouraged it.  In fact, I
proposed that we make membership in the NAB mandatory,
which I think would have been a good idea.
          Our own Government was complicit in knocking out
the code, which was a foolish thing to do, but I think the

                                                       101
NAB ought to be encouraged, and I think what Gigi said,
why do the good guys cover up for the bad guys, that's the
thing that is the problem.
          The worst broadcasters do not believe in the NAB
principles or code, and it seems to me that those of you
who are in the NAB ought to be after the bad guys.
          MR. McINTURFF:  Let me answer the simple survey
point that was made.  This survey was mailed -- and we
know how many commercial licensees there are from the FCC. 
We're missing about 200 TV stations and 2,000 radio
stations.
          This survey was mailed in States to State and
NAB members and in some States they added just any
commercial broadcaster.  That was done in about four or
five States that I know about.
          The point is, this represents about 85 or 90
percent of every station out there.  It is my view,
anecdotally and through those interviews, that the radio
stations -- NAB and the States clearly wanted to have --
you know, they wanted bigger numbers, not smaller numbers. 
If people were excluded, it wasn't because they represent
huge market interests.
          And so again, the other thing I will say, as a
researcher, in terms of the gentleman's point, there are
2,000 radio stations, 200 TV stations that were not

                                                       102
surveyed.  We refused, as a researcher, to apply any of
these numbers to those stations because I felt that that's
exactly the point at which we would know nothing about the
survey universe, and my entire inclination from talking to
people around the country is, these are like, 19-watt
little stations, et cetera, et cetera.
          So I think that you should view, and accurately
view this report as representing the functional economic
interests of the entire broadcast community.
          MS. STRAUSS:  Can I make one more comment?  A
panelist discussion would not be complete unless I raise
closed captioning.
          Of course, you didn't ask whether these items
were closed captioned in your survey, because there were
no laws in effect and so therefore there was very little
voluntary effort to caption, so probably -- and wait, let
me finish, Jack -- probably a very small percentage of the
PSA's and the public affairs programming was captioned. 
There may have been some percentage, but probably it was
very small.
          And this is more of a comment than a question. 
What people here may not be aware of is that even with the
new captioning rules a significant amount of this is
exempt.
          Specifically, political advertisements are

                                                       103
exempt, public service announcements, unless they're
federally funded, programs without any -- local
programming that does not have a repeat value are exempt,
all foreign language programming, so if there are Spanish-
speaking people that want access to the public affairs
programming, that's exempt.
          And additional exemptions have been sought by
the Association of Local Television Stations for local
programming with limited repeat value, which would include
public affairs programming, and they also sought an
exemption for all political candidate debates.
          And I am just putting this out on the table,
because once again the deaf community has not even got an
access to the percentages shown in this survey.
          MR. GOODMAN:  The first thing is, of course,
there is another survey which the NAB did, a much shorter
survey, a mail survey, that was done in connection with
the FCC comments on closed captioning rules which
indicated that about 80 percent of stations voluntarily
captioned their local news, so that most local news
programming is captioned around the country, so far as we
can tell.
          There are certain exemptions in the FCC's rules,
and that matter is currently under reconsideration, but
basically the FCC responding to Congress has said that the

                                                       104
vast majority of programming that is not now captioned
must be captioned within the next few years.
          In fact, of course, for most broadcast
programming all broadcast entertainment programming that
is new is captioned, and that has been done voluntarily. 
That was done before the Congress stepped in.
          Now, I would just like to add one point, because
I saw some material that was distributed to the committee
this week that I believe is inaccurate, which shows that
there will be no captioning requirement for digital
television.  This is simply untrue.
          MS. STRAUSS:  You can stop there.  I know that. 
That material was not distributed by me, and you're
absolutely right, there is definitely a captioning
requirement for the digital television.
          But let me just respond very quickly.  Yes,
you're right, entertainment programming will be captioned,
but that is not what we are talking about here.
          And you know what my response is going to be on
local news, and that is that all that is required is
electronic news room reporting, which is using the
teleprompter to caption what's on the news.  The
teleprompter does not cover any live coverage.  Therefore,
deaf and hard-of-hearing viewers are denied access to all
late-breaking reports, updated weather information,

                                                       105
updated sports information.  Real time captioning is not
required.
          MR. GOODMAN:  That is true, and it was done for
several good reasons.
          One is that the Congress indicated to the FCC
that they should balance interest here, that they should
not require captioning if it would result in a loss of
programming, and the commission concluded the cost of real
time captioning and the lack of availability of real time
captioners across the entire United States would, as a
result -- if they impose that requirement would mean a
loss of programming, so the commission balanced that
interest.
          The other thing the commission concluded was
that there were likely to be new technologies coming,
voice recognition technologies, improved E&R captioning,
for example.
          As stations digitize the news rooms they have
much better teleprompter, much better ability to use
electronic newsroom captioning, and all of those were
factors that the FCC decided would be appropriate, that
they take a transition period, particularly given, at
least for broadcasters, the very high level of captioning
that already existed before there were any rules.
          MS. STRAUSS:  There were those who disagreed

                                                       106
with the FCC's conclusion on real time captioning, but
again what you're talking about is local news, and what
this survey is covering are other things such as PSA's and
public affairs programming and political candidate
advertisements and political candidate debates, and again,
when you go through that laundry list you can see that
virtually all of it is exempt from the captioning
requirements.
          MR. GOODMAN:  Karen, just to follow up, I don't
know the answer on PSA's because they're done by various
people and in various ways, and it is much like
advertising.  It's very difficult to say.
          As you know, one of the reasons the FCC decided
not to require captioning of political candidate
advertisements was the provision in the Communications Act
that bars stations from censoring them, so they could see
no enforcement mechanism, because a station could not
project the ad if it came in uncaptioned, nor did they
want stations to be in the position of adding captions to
candidates' speech where the candidates have not done so
themselves.
          MS. STRAUSS:  I don't want to get into a long
debate on this, but obviously we disagree with that point.
          MR. MOONVES:  Paul, we should wrap up within
about 10 minutes.  I know my co-chair has an overall, but

                                                       107
if there's anybody else who wants to jump in first -- 
          MR. LACAMERA:  Any final questions from the
committee?
          Bill, Jack, thanks very much.
          MR. ORNSTEIN:  Wait just a minute, Paul.  I have
a few questions I want to raise.
          Let me start first by echoing what many of the
members said.  It's terrific that the NAB did this, and I
would also say, having known Bill for many years, it's
terrific that they used a first-class surveyor like Bill
McInturff.  I understand all the problems you had in
pulling something like this together.
          I do think that this probably wouldn't pass
muster in a peer-reviewed social science journal, but it
provides a lot of very useful information for us, and I
encourage the NAB to do this on a regular basis because,
if we can't take the dollar figures or the amounts at
complete face value, determining over time in a
comparative way using a comparable methodology the
behavior of broadcasters is going to be very, very useful
to see how much change there is as we measure up to, or
don't, standards that have been set.
          I have a couple of smaller and more specific
points, and then I want to raise a larger issue that has
been raised here as well and ask a question of you, Bill,

                                                       108
for something that you might do for us, or the NAB might
help us with.
          A couple of specific questions, the figures you
have, getting to the political time, of stations offering
debate time and then turning it down, which you have
quantified, do you have any sense of whether stations, if
one of the candidates, say a Jesse Helms, turn down that
opportunity, offered the time to other candidates or used
it for other debate purposes, or just said all right, he's
turned that down, let's just forget about it?
          MR. McINTURFF:  Again, in the scope of this
survey I don't have any way of answering that question and
we have been, for good or for ill, trying to keep to real
data bases, so I don't want to provide my own anecdotal
evidence on that question.
          MR. LACAMERA:  There are ramifications to that
action, and we have all thought about it, and correct me
if I'm wrong, but because of equal time, if you in turn
then conduct the debate without one of the candidates, or
offer that time to the candidate who does accept, then the
candidate who refuses still has a right to come back under
equal time and you have to give that person that half-
hour, or that hour, whatever, of uncensored time to do
with whatever he or she might want.
          MR. ORNSTEIN:  My understanding, Paul -- and I

                                                       109
may be wrong here as well -- that defining that as a bona
fide news event, if you invite candidates to appear at a
bona fide news event and a candidate chooses not to
appear, and you conduct a debate, then there isn't an
equal time issue.
          MR. GOODMAN:  Actually, Norm, the commission has
said that if there's only one candidate it is not a
debate.  So in other words, if you have multiple
candidates, and one turns you down, but if, for example,
under the Helms-Gantt race, where you have really only two
candidates, if only one shows up the FCC has ruled that is
not a debate, so it would not constitute a bona fide news
event, and you would have a full equal time problem.
          MR. ORNSTEIN:  The question I was asking was not
simply whether, if Helms turned it down you gave the time
to Gantt, but rather, if Helms turned it down, whether the
station said, well, we will offer a debate, the same time
to congressional candidates, or local candidates.
          In other words, whether this is charged off and
then nothing's done, or whether it's -- 
          MR. GOODMAN:  If I can answer for Jim, if I
recall in North Carolina the stations did do a
gubernatorial debate, and they have done other State-wide
debates.
          MR. ORNSTEIN:  But in this survey the

                                                       110
gubernatorial debate would count as time, and the time
offer to Helms and Gantt that was turned down would count
as additional time.
          MR. McINTURFF:  They would be bifurcated by the
results.  They would be counted in the different figure.
          MR. ORNSTEIN:  The second question, getting back
to what is really a very large issue that Jim Goodmon has
raised and others have raised as well, what this survey
does for me is, it confirms that something that I have
known for a very long time, that we have an awful lot of
broadcasters out there doing an enormous amount of
innovative and good work across a whole range of areas
benefiting community service, but there is a range, and
obviously you have demonstrated some elements of that
range.
          As Jim suggested, a part of our dilemma is, what
do we do about those who do nothing, or next to nothing,
and we have to determine -- and also we have to determine
as we move, because our focus is not what broadcasters are
doing now, but what broadcasters will do in the digital
age, what the changes in technology, the changes in the
marketplace will mean in terms of public service, with
opportunities provided not necessarily by adding
regulatory mandates but in other ways, to enhance these
elements that we all consider public service.

                                                       111
          So this survey I hope can be useful for us in a
host of ways.
          I would ask you one other small question and
then turn it into a larger one.  As you looked -- and it
gets to a more specific point about the range here.
          Looking at all of these markets, give me a
couple of examples if you can, Bill, of areas which you
would consider to be the Nirvana or the Garden of Eden of
the voluntary provision of community service activities by
broadcasters, and a couple of areas that your survey would
suggest are the Gobi Desert of public service activity,
where the viewers generally are not getting much and
others where they're getting just a cornucopia of things. 
Can you give me some examples?
          MR. McINTURFF:  A perfectly reasonable question. 
I'm trying to think about the data base and the way it's
constructed, because again, when I talked to you about how
valuable this time has been for me, listening to this
commission and others, I do think NAB has expressed that
they would like to continue to do this.  I hope it is at a
much higher cost than what I agreed to do it for the first
time.
          MR. ORNSTEIN:  I think we can commit to that
right here.
          (Laughter.)

                                                       112
          MR. McINTURFF:  And three, I think a lot of
concerns have been raised with things we can try to
address, but I will tell you again as a researcher I was a
little surprised, because we had not a lot of information
about these 11,000 different stations out there, and so
the kind of segmentation you would normally do by size of
station, revenue, and some other stuff was all the stuff
we have had to approximate as best we could just by size
of market.
          And so what that means is, within-State we don't
have accounting codes.  We don't have exact media market. 
We have size of market, but not exact markets, and we have
physical location.
          So what it means is, as opposed to pushing a
commuter button where you can say, just lay this data into
534 DMA's and have that done on a computer, you have to do
it by hand, and I think as a follow-up it is possible that
Bob Klopack and his firm, who did a lot of the personal
interviews, might have a feel for that, that we could then
try to quantify.
          But I don't have -- if this were a national
survey sample, where you coded telephone exchanges, where
you could assign people into DMA's and push a button, we
could do that and have computers do that.  The data base
is frustrating and difficult to work with, because that's

                                                       113
the kind of thing I cannot easily do.
          MR. ORNSTEIN:  Here's what I would ask you to
do, and I would ask the NAB if they could help in this
regard to help us along the way.  We have come, I think,
very close to a consensus here, and I hope I am right,
that we are going to try and come back to a code, a code
of conduct, and this survey is extremely useful in that
regard.
          What I would ask you to do is the following. 
What I would like to find is a norm here, some kind of
norm of what is out there, which your survey suggests we
can come up with an average, or a standard, and clearly
it's going to be different for different-sized
marketplaces.
          What I would like you to do, because you've
given some good examples here, is to go back and give us a
picture of what 1) an exemplary large market, medium
market, small market broadcaster does in these areas, with
specific examples, the kinds of PSA's, the kinds of
political activities, the kinds of charitable activities,
the average broadcaster in each of these markets -- you
can pick an example.
          Pick from one of the interviews you did, or one
of the surveys in markets that would generally fit that
kind of standard, and the poor, one that may be two

                                                       114
standard deviations from the norm.
          And if we can have that kind of information it
would help us enormously as we try to work towards what
kind of -- and I'm not talking about necessarily
establishing mandates here, but rather what kind of
standards we want to set that we hope broadcasters will
measure up to.
          Is that something that is doable, and could we
get some assistance in that regard?
          MR. McINTURFF:  I think in terms of our
responsibilities that should be duly noted as a request,
and that's something the NAB and you and the commission
should talk about at a later date.
          MR. ORNSTEIN:  What about you, Jack?  Do you
think we can do that?
          MR. GOODMAN:  I don't know, Norm, whether the
data supports it.  I'm not the person who could tell you
that.
          But I think one of the difficulties with that
is, as we've pointed out, I think at some length, Bill
did, this survey only measures a few things.  It measures
things we could quantify readily across a huge station
population.  It leaves out all kinds of things, and
therefore I think to say these are the three things that
every station must do suggests a regulatory mandate, or

                                                       115
regulatory climate that is inappropriate.
          There are a lot of things stations do. 
Christian stations do things for their communities that
are not reflected, perhaps, in any of these things, that
are not reflected in a showing of news, but are valuable
to the public.
          And I think that is the difficulty with coming
up with quantitative guidelines, is not only are they
inherently First Amendment-sensitive, but they also
suggest there is one model, and I think what the
commission has done, and quite wisely for the last 25
years, is to say there are a lot of models.  There are a
lot of different things, a lot of things that are valuable
to the public.
          For example, I know one of the things that has
been criticized for years is home shopping stations, and
yet if you ask people who watch them, they consider them
extremely valuable, and they may be a small number of
people, but there are a lot of things that the public
values, and I think it is very difficult to say, one, two,
three, four, this is it, because I think what you get is a
very rigid formulation.
          Instead of people saying, how can I serve the
public, how can I do things that -- in a market where
there are eight other stations, be different from

                                                       116
everybody else.
          MR. ORNSTEIN:  Jack, we didn't pick the three
things.  You picked the three things, and I have no
quarrel with the methodology that Bill used and the reason
that he did it, and I'm certainly not suggesting that we
would refuse to take into account all of those other
vagaries, or the other things.
          All I am asking is that we get a refinement of
this survey that the NAB has done pointing out these
quantitative areas that gives us some assistance as we
move forward.  It is not asking you to do something more
than what you have done, but giving us some more flavor
and a fuller picture so we can begin to look at what kinds
of conduct we want to support.
          MR. BENTON:  If I can reinforce this wonderful
point of our co-chairs, because all he's really asking,
Jack, is NAB's help in developing some measurement and
standards of performance of public service and public
interest obligations, and I think that is really right on.
          And going to the article that you passed us
out -- and I just want to read a quote from this article
that puts kind of a blessing on this point that Norm has
so beautifully articulated, Fritz is quoted in here as --
he cites the report's assessment of the local efforts of
radio and television stations, quotes, we have always said

                                                       117
that localism is that which separates us.  It is our
franchise, and it is ours alone, and that's, by the way,
why we did this report.
          MR. ORNSTEIN:  Well, I would like a slightly
better answer if I could, Jack.  I mean, all I'm asking
is -- and I recognize it will take some resources.  I want
to take this broad survey, which picked areas that you
guys for good reasons chose, but certainly not the only
ones, and I fully recognize that there are stations that
have very different mandates, but given that you've
defined some of the voluntary activities in the community
service this way, I want to get a fuller sense of the
range of what's there so, as we move forward with our
process, we can use that.
          MR. GOODMAN:  Norm, I'm not in a position to
give you an answer on the question, but I continue to
suggest that it is a goal that leads you in inappropriate
directions, because it leads you -- these numbers were
picked, as I think Bill explained in detail, because these
were things that could be quantified.  It was never the
intent, it was never the suggestion that this is the be-
all and end-all, that this is a full description.  This is
simply, here's a baseline, these are things we know.
          And it goes to your point.  You say that you can
take some of these numbers at face value.  Well, even if

                                                       118
you discount them substantially, instead of 6.8 billion
it's 5 billion, it's a staggering number, but the point
is, it's one of a number of different things, and merely
to say, well, these are the three things, because we can
quantify them and we're going to make them the sine qua
non of everything a station does is I think -- which I
think is the direction you're looking at, a code that says
you must do this, is problematical.
          Certainly I'm not in a position to give you a
definitive answer, and I don't know what the data allows,
but we will take that into account.  But I think there is
a problem with the direction you're going.
          MR. LACAMERA:  I'm sure those stations who
participated at the highest level would welcome being
singled out.  I think asking NAB to identify some stations
that willingly participated in this and then designating
them as poor performers I think would cause problems with
NAB and its relations with its members.
          MR. ORNSTEIN:  I don't need to have a particular
station singled out in this case necessarily as a goat,
but hide the identification.  Say Station X in a market of
this size, and here's what they're doing, just so we have
a sense in a more specific way of the range of the kinds
of activities here, recognizing there are lots of other
activities.

                                                       119
          MR. DECHERD:  Norm, I'm pleased that a number of
members of the commission have given NAB credit for doing
this survey, and I think the methodology is sound,
recognizing the constraints, recognizing the fact that
this is unprecedented.  I mean, it is a start, and so
forth and so on, but there's a larger dimension here that
worries me as I think about where we're headed overall in
our deliberations.
          First, I would second what Paul said.  It is
just not realistic to ask the National Association of
Broadcasters to single out a market in terms of the survey
approach, which everyone wants to figure out -- I mean, we
will all know what the market is.  It will be generally
discussed -- and say these people don't perform well, and
then every member of the NAB within that market is tagged
with being the underperformer, and so let's set that aside
and agree, as I think all broadcasters can and have, that
there are broadcasters who don't do the things we're
talking about today.  I don't think that's productive.
          I do think it is productive to hold up as an
example what some of the -- let's call them average or
very good broadcasters do, and that's been done.  It's
been done a hundred times, and what worries me is the
skepticism that keeps emerging among members of this
commission that we are somehow aberrational.

                                                       120
          I've heard three times this morning that we're
defending the bad guys.  I'm not here to defend anybody. 
I'm just asking you to look at what's being done
constructively and assume, just for purposes of one
discussion, that this might actually be representative of
what most broadcasters do and, if you make that
assumption, what we come down to is a question of
ideology.
          Do we let the marketplace operate and assume
that they are good people doing good things with good
faith in the majority of cases, or are we going to find
all the bad guys and make them do what we want them to do,
and I cannot and will not sign up for that, and no good
broadcaster should.
          MR. ORNSTEIN:  Let me just respond.  I'm
perfectly happy to avoid singling out those that do
nothing.  I was just trying to get the widest range.
          If what we could get would be a fuller
picture -- and I accept the notion that the overwhelming
majority of broadcasters fit in this category, that it is
a bell-shaped curve and the vast majority are in an area
doing an awful lot, a picture of the kinds of things with
more specifics of what are done by the average person
given this survey, and those who are very much at the end
of being, you know, terrific, that's fine with me.

                                                       121
          We don't need to go to the other end if it's
going to offend anybody or if it's going to end up being
misused.  That's fine.  I don't have a problem with that. 
That wasn't my intent.
          MR. MASUR:  I think, if I'm understanding Norman
clearly, it would be because what we have here is this
very macro view, which some of us feel will not be
entirely useful in helping us to do what we need to do.
          If we can get some of the specific examples
blind that Norman is talking about, it would give us some
guidance.  I don't think Norman is saying for a second
that these would be three things that would be the sine
qua non.  It is just to get a feel for, you've developed
some information that to our knowledge has not been
developed before in this way, just to get a more usable
feel for that information.
          MR. GOODMAN:  I guess I would respond by saying
the purpose, as I understand this committee, is to look at
whether broadcasters are serving the public interest and
whether, if something changed about digital broadcasting
that would require a change in the way the public interest
standard has been interpreted in the last -- well,
currently, and I would say that what the study shows is
that broadcasting overall is serving the public interest.
          I mean, this is one of three measures.  There

                                                       122
are lots more.  The qualitative study that is discussed in
the report and that Bill had some references to were ways
of finding things other than the three measures that we
had in the quantitative survey and, of course, there are
lots more beyond that.
          And so the issue is not really whether any
particular station does any particular thing.  It is
whether the industry as a whole, whether the public
interest standard is alive, well, and working, and so I
don't think this question of, is there a variance
somewhere, is particularly productive to that question
unless the purpose of this committee is simply to say
we're going to single out people and punish them for
whatever sin they might be committing.
          So I think the survey speaks for itself.  It
shows the tremendous amount of programming in three
particular areas that are done, the quantitative survey
shows a tremendous amount of other stuff being done, and
whether there is any particular station -- and there
clearly, as Bill indicated, has been a range from top to
bottom of dollar amounts and minutes and anything, and
that can vary.  We did it for 1 year.  It can vary from
year-to-year, market-to-market.
          So the point is, with respect to the issue that
as I understand is the committee's purpose, it seems to me

                                                       123
this goes directly to it much more than whether we single
out any particular station or say one station did more
than another.  That's certainly true, and it was the
expectation of the FCC 20 years ago when it decided that
one-size-fits-all regulation was inappropriate.
          MR. MOONVES:  Peggy, a last question or comment,
and then Paul, back to you.
          MS. CHARREN:  Since it's getting late and we've
spent a lot of time on this, I'm willing to save my
statement about the role of rules in a democratic society
to create level playing fields until later.
          MR. MOONVES:  We're going to spend the afternoon
after lunch deliberating.
          MS. CHARREN:  But it really is a direct answer
to the last set of statements.
          MR. MOONVES:  If you would like to -- 
          MS. CHARREN:  I don't want to do it now, but I
would like to reserve the right to do it.
          MR. MOONVES:  You can be our first speaker after
lunch.  You have my word.
          MR. McINTURFF:  I do have some additional
information to provide to the committee, and I do some
work for clients and the public.  We release some stuff to
the press, and one reporter said to me, how come I
never -- I don't hear really, really bad news.  When I see

                                                       124
you it's kind of good for your client.
          And I said, I have an observation from Carl
Sagan.  Sagan wrote this very interesting article on
porpoise intelligence and the unique bond people have with
porpoises, and he uses as evidence all the reports of all
the lives that have been saved by porpoises by pushing
stranded swimmers to shore, and this scientist wrote back
who studied porpoises and said, what Carl Sagan failed to
notice, porpoises like to push things with their nose. 
It's an enjoyable activity.  You never hear from the
swimmers they push out to sea.
          And so that is my last way of saying that I
think this is positive information for this industry, but
it is because of the reality base that there is a lot of
good going on by broadcasters.
          MR. MOONVES:  I want to thank you and Jack.
          MR. LACAMERA:  Gentlemen, thanks very much. 
It's been an interesting morning and hopefully in some
ways an encouraging one.
          I would share first of all the compliments for
the National Association of Broadcasters for undertaking
this initiative and the findings in those very three
specific areas.  I join some other members of this
committee and would love to know more about the public
affairs activities, the classic public affairs activities

                                                       125
of local stations in this country, and perhaps in the next
iteration we can learn more about that, but what you've
given us is a very solid base for our continued
deliberations, and we thank you all very much.
          MR. MOONVES:  We will break till 1:30, when
Peggy Charren will be the first speaker.
          (Whereupon, at 12:20 p.m., the meeting
recessed.)

[View the transcript of the afternoon session]