How to secure Software Defined Radios
From: John Gilmore, 9/14/02
This 121-page report on "How to secure Software Defined Radios",
written to help the FCC decide how to handle software radios, is
very slanted toward monopoly-industry viewpoints.
The whole focus is on giving the system operator lots of
flexibility to do whatever they want, while giving customers,
experimenters, competitors, and citizens zero flexibility or
opportunity.
They managed to suppress the few pages of actual information about
the paucity of any real threat to public safety (see the last
paragraph of this review).
The document reminds me of those eco-terror reports about how the
world is falling to shit (whether it really is or not). On the
page numbered 14 at the bottom (page 21 in the PDF -- I wish Adobe
could get this straight), it says:
The impending widespread use of software changes, whether to add or improve user services or to reconfigure RF parameters of a wireless device, presents substantial new challenges to manufacturers and operators, particularly in the face of a youthful "digital generation"...
Having a whole generation of young people grow up full of knowledge about engineering, computers, and networking is a wonderful advance in the state of humanity. The authors of the paper apparently see it as a drawback, since they depend on their customers being ignorant.
The report pays little attention to the huge opportunities offered
by separating the development of hardware from the development of
software. If you look at the history of computers, the people who
built great hardware tended to suck at software. The people who
built great communications hardware built the worst networking
software.
IBM mainframes are the most obvious example in computers. Not only
the PC, but the 1970's minicomputer and the 1980's engineering
workstation were great examples of how open hardware could allow
software innovation to flourish.
Welded-shut systems tend to be terrible, compared to the
innovation and cost reduction available in systems where one
vendor supplies hardware, another supplies more plug-in hardware,
and four or five more supply various bits of software (the OS, the
applications, custom scripting, etc). The Bell System telephone is
the obvious example of a welded-shut system that it took a 1968 FCC
decision to open (the Carterfone decision), unleashing vendors to
supply *modems*, which permitted *data* traffic, which permitted
*computer-mediated communication* and eventual *networking*,
eventually creating the public *internet* decades later.
The US got a head start in creating the Internet because of the
Carterfone decision and subsequent decisions on access to leased
lines, while almost every other country's telephones were still
run by a monopoly PTT whose interest was in preventing competing
forms of communication.
Today's cellphones and Cable TV systems are similarly welded shut,
resulting in endless lock-in that profits the vendors while
beggaring the society; they provide only the minimal amount of
innovation that keeps the vendors alive. (As a small example,
there's still no decent mobile data networking; the one company
that temporarily provided it, Metricom, grew out of the ham radio
fraternity rather than the cellphone companies.)
The report's treatment of "Wireless Threats" obsesses over threats
to the vendors' revenue models, their "systems", and their
control. For example, a software modification that would permit
cellphones to talk directly to each other when in range of each
other, without the use of a base station or its network, would be
considered a threat to the integrity of the system ("unauthorized
access to services"). However, it would be considered a positive
feature by end users, who would be happy to pay third parties to
write such software; it would serve more total users by reusing
the spectrum locally; etc.
A really useful SDR communication device would have a jack that
would take either an Ethernet or a phone line; if neither was
plugged in, it would look for 802.11 local connectivity, or
Metricom wide-area connectivity, or its own networking protocol if
any similar station was in range, or failing that, would be able
to use a terrible and expensive cellular data or voice network.
All of this would be transparent to the user. None of the vendors
who authored this report would ever build such a device, because
the majority of the time it would not force users to pay them by
the minute. And user or competitor attempts to reprogram existing
devices to do this, or any part of it, would be warded off as
"attacks" by "malicious hackers".
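The fallback logic such a device would need is easy to sketch. Here
is a minimal Python illustration; the link names, probe sets, and
preference order are invented for this example, not taken from any
product:

```python
# Hypothetical sketch of the connectivity fallback described above:
# prefer wired links, then local wireless, then ad-hoc radio, and only
# as a last resort the expensive cellular network.

def pick_link(available):
    """Return the cheapest usable link, in fixed preference order."""
    preference = [
        "ethernet",    # wired jack, if plugged in
        "phone_line",  # dial-up jack, if plugged in
        "802.11",      # local wireless LAN in range
        "metricom",    # wide-area packet radio
        "adhoc",       # own protocol, peer station in range
        "cellular",    # terrible and expensive, but it works
    ]
    for link in preference:
        if link in available:
            return link
    return None  # no connectivity at all

# A station with no jack plugged in but an 802.11 network nearby:
print(pick_link({"802.11", "cellular"}))  # -> 802.11
# Out of range of everything but the cell network:
print(pick_link({"cellular"}))            # -> cellular
```

The point of the sketch is that the selection is transparent to the
user: the device, not the operator, decides which network to use.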
The report is followed by the individual company reports that were
submitted to it. They make interesting if duplicative reading,
since they reveal that the whole SDR Forum report just seems to be
a stitching-together of the most self-serving parts of the
documents submitted by various companies.
Wow! There's even an Appendix H (PDF page 100) that is a verbatim
copy of a Digital Restrictions Management report from the Copy
Protection Technical Working Group, which talks about how "Securing
adequate protection for copyrighted works in the digital environment
will allow development of viable business models. Viable business
models will in turn help drive adoption of broadband ... and
expanded consumer choices ..."
Of course, it's full of horses--t; DRM is all about preventing
UN-VIABLE monopoly business models from going extinct when they have
been obsoleted by technology. This is even the report that talks
about how the "broadcast flag" will save digital broadcasts that
happen "in the clear". It's particularly incongruous in a report
full of glowing public-key crypto recommendations.
PDF page 102 is a great overview of some companies that deserve
cypherpunk scrutiny. E.g. "Signum Technology" provides iPak that
lets you print packaging labels with "sophisticated invisible
watermarking that allows incorporation of hidden identifying data"
which can be revealed "in a matter of seconds" with "an
inexpensive scanning device".
The report in general also follows the "academic paper" model of
security: See, we're using public key algorithms. Therefore our
systems will be secure. Bugs, protocol design failures, social
engineering, breach of trust, undersized keys, revelation or
government appropriation of private keys, etc., are all ignored.
On page 8 (PDF 15) it mischaracterizes the problems with 802.11 and
WEP. First, it leads off with Adam Stubblefield's break of WEP --
failing to start with the 30-year history of trivial-to-crack
wireless network protocols that were built under the guidance of the
FCC and with active help and legal threats by the National Security
Agency. WEP uses 40-bit keys because they were the largest available
in exportable products until EFF's lawsuit cracked the
unconstitutional export controls. Securing wireless networks has
always been a second priority to making it trivial for the
government to illegally wiretap them. This tension isn't going to go
away.
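The arithmetic behind the 40-bit complaint is easy to check. Here is
a back-of-the-envelope Python sketch; the trial rate is an assumed
round number, not a benchmark, and note that WEP's practical breaks
actually exploited RC4 key-scheduling flaws rather than brute force
-- the small keyspace just made it hopeless even without them:

```python
# How big is a 40-bit keyspace compared to a 128-bit one, and how long
# would exhaustive search take at an assumed commodity trial rate?

TRIALS_PER_SECOND = 10_000_000  # assumed: 10 million key trials/sec

keys_40 = 2 ** 40    # export-limited WEP keyspace
keys_128 = 2 ** 128  # a modern minimum keyspace

seconds_40 = keys_40 / TRIALS_PER_SECOND
print(f"40-bit keyspace:  {keys_40:,} keys, "
      f"~{seconds_40 / 86400:.1f} days to exhaust")
print(f"128-bit keyspace: 2**88 = {keys_128 // keys_40:,} "
      f"times larger")
```

At that rate the entire 40-bit keyspace falls in about a day; the
128-bit space is larger by a factor of 2^88.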
Second, its 802.11 discussion directly lies that the popular
practice of recreational listening for open wireless networks
results from the ability to crack WEP-secured networks -- rather
than the public's tendency to leave 802.11 networks unsecured
because the industry and the vendors only provided painful
hand-configured ways to secure them. I know of no "wardriving"
that seeks and cracks WEP-secured nets; it's all merely probes for
networks that people have left open, either by default or by
intent. (The idea that someone would intentionally PERMIT the
nearby public to freely use their wireless network infrastructure
is apparently heresy to the authors of the report.)
The report also looks approvingly on
digital-restrictions-management systems (DRM) as the solution.
E.g. no SDR will be able to run code unless it has been digitally
certified by the vendor. Like every other restrictions-management
system, this will be deliberately used to cement established
monopolies and prevent innovation. It even rates part of the
"threat model" consequences as "Digital Rights Violation" --
defined not as a violation of the user's rights or privacy, but as
"unauthorized access to, or theft of, digital content and
software".
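The certification gate the report favors can be sketched as a hash
allowlist -- a deliberate simplification of vendor code-signing
(real systems would use public-key signatures), with every name and
image here invented for illustration:

```python
# Minimal sketch of vendor-certified code loading: the radio refuses
# any code image whose hash is not on the vendor's built-in allowlist.
import hashlib

approved_image = b"vendor firmware v1.0"
third_party_image = b"community mesh-networking patch"

# The allowlist the vendor burns into the device at manufacture.
vendor_allowlist = {hashlib.sha256(approved_image).hexdigest()}

def may_run(image: bytes) -> bool:
    """Run code only if the vendor has blessed this exact image."""
    return hashlib.sha256(image).hexdigest() in vendor_allowlist

print(may_run(approved_image))     # -> True
print(may_run(third_party_image))  # -> False
```

The mechanism is content-neutral in form but not in effect: only the
holder of the allowlist (or signing key) decides what the owner's
radio may run, which is exactly the lock-in described above.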
The report is also inappropriately focused on "cellphones". It
reminds me of a Motorola employee who told me in the early-'90s "car
phone" era that offering wireless data networking would be
irresponsible because "you should keep your eyes on the road",
implying that (1) only people in cars would want wireless data
networking, and (2) they would use it while driving. This
tunnel-vision mindset keeps the authors from noticing that
everything from car alarms to shipping crates to pets to ballpoint
pens to automotive light bulbs already today, or soon will, come
with data networking built in. Software-defined radios will be
broadly useful throughout society, not just for "cellphones" but
for *everything*. So if the "SDR Forum" or the FCC denies
society the benefits of rapid innovation, they won't just be
denying it to cellphone users, but to auto owners, package
shippers, pet owners, doctors, writers, lightbulb users, and
everyone else.
Page 23 (PDF 30) jocularly reports that "Eavesdropping on user data
(Breach of security, public safety has some experience in this
scenario)". I think they meant that they have some experience being
intercepted, but of course "public safety" agencies systematically
intercept private citizen communications. The report goes on to
suggest that:
Sophisticated encryption techniques should be made available to
public safety users of SDR technology. AES voice encryption and
128-bit data encryption should be considered minimum standards for
public safety SDR devices. Devices can be lost or stolen and,
therefore, must be capable of remote revocation of service.
One would hate for the *citizens* to get access to AES voice
encryption or 128-bit data encryption, therefore we had better
only give those devices to cops, and "remotely revoke" them if
they leak outside the Government Trust Barrier to ordinary
untrustworthy voters or citizens.
Page 25 (PDF 32) lauds the "Trusted Computing Platform Alliance",
Intel's fXXk-the-customers-for-Hollywood initiative, as a model for
SDR companies to follow. As usual, they completely obscure the
critical question, which is "Who trusts whom?". Their "Trusted"
systems are trusted by monopolists to not be susceptible to
unauthorized competition. This word game deceives people who
foolishly think they're trying to build "systems the consumer can
trust".
Page 29 (PDF 36) even pushes the "NTRUEncrypt" snake oil encryption
system.
Pages 30 and 31 (PDF 37 and 38) discuss GSM security, without
bothering to mention that they kept their snake oil algorithm secret
for years so that consumers would not find out how insecure it was.
It bleats that the "Security Group has realized two important
initiatives over the past 12 months": introducing A5/3 to replace
the faulty algorithms, and a protocol for "authenticated key
agreement", AKA. Until I hear differently, I'll assume that these
are both more proprietary snake oil.
On page 34 (PDF 41), their prime example of how "public safety"
users need priority and
reliability is "best illustrated by the SWAT team commander
notifying the sniper with the 'shoot' or 'don't shoot' command".
Given the number of SWAT teams deployed against innocent civilian
drug users, such as the raid on terminal cancer patients made in
Santa Cruz last week by just such a machine-gun-toting SWAT team,
this is an insulting example. Cops need good communication to know
whether they've been ordered by a corrupt higher official to shoot
a citizen from concealment. I see.
Page 40 (PDF 47) goes so far astray as to say:
"As a multitude of products and services still uses proprietary
solutions there is no advantage to using secure standards which
only give extra security if everyone else offers them."
The next page presents the monopoly control problem in wireless
cellular networks as a positive feature:
5.2.2 Asymmetry of Information
Unlike the general IT situation where there is varying levels of
control over client devices attached to a network; from closely
controlled private networks, to no control of devices connected to
the internet; commercial handsets must be qualified to operate on
an operators network before they are allowed to connect. Therefore
operators have complete control over the capabilities of devices
they allow to connect to their network. As such there is no
asymmetry of information in the case of handsets deployed on an
operators network.
The report then concludes without saying much more than these
terrible things.
But then come a bunch of interesting submissions from various
companies.
Intel's reveals the source of the TCPA paragraphs (copied directly
from Intel propaganda). It again skips over the REASON why 802.11 is
insecure, and fails to mention the real cure (standardized
mass-market end-to-end encryption, which is politically disfavored
since it discourages wiretapping).
The paper from the "Mobile Virtual Center of Excellence" in the UK
at least makes explicit the authoritarian model that's implicit
throughout the rest of the document (PDF page 64):
Domains and regulatory bodies. We suppose that the world is
divided into a number of administrative domains, which may
correspond to single nations (e.g. the USA), or groups of nations
(e.g. the EU). In some cases, it may also be the case that nations
are sub-divided into separate domains. Each domain is assumed to
have a single regulatory body, responsible for deciding which
software is permitted to be downloaded and executed in SDR
platforms.
Which software citizens will be permitted to download or
execute. What a fascist concept.
The MVCE proceeds on PDF page 71 to say, "Also, issues relating to
Digital Rights Management (DRM) may arise, i.e. where the SV
restricts use of code modules to enforce payment for these modules."
Motorola's submission (PDF page 74) seemingly ignores Kevin
Mitnick's penetration of North Carolina cell site software, well
documented in several books, when it says:
The data links which connect the OMC to the cell sites are private
data networks controlled by the network operators, and offer no
entry point for remote hackers. Furthermore, there is no internet
or other publicly accessible external data connection into the
OMC. The inherent security of base station equipment is
demonstrated by the fact that second generation (2G) commercial
base stations are remotely programmable, and have been operating
in high volume for over ten years without any significant security
issues. The focus of this report, therefore, will be on security
issues surrounding commercial handsets.
Sorry, Kev, you were insignificant. The Moto document also has a
5-page slideshow tutorial on encryption, for the braindead who
have nevertheless managed to read that far. (PDF 77)
The Moto document is the source of the "Security Threat Model" that
included that "Digital Rights Violation" example above, and
"Example Threat Scenarios" that are almost uniformly "hacker",
"black market", "unethical", or "disreputable" parties modifying
the system. (In one example, an "inadvertent" bug does not bring
the repute or ethics of the creator into question, presumably
since Motorola would be the originator of the bug.) There's no
example of beneficial improvements made by third parties; of
experimentation by scientists or university students; of
innovation by competitors. The system is designed to block all
those things.
Motorola's document says, with surprising honesty, that security
systems should be designed to work even when the entire design is
known to opponents (PDF page 83). But then on PDF page 89 it
argues for snake oil, which has served the wireless industry so
well in the past:
Finally, it should be stressed that the very nature of the
security challenge is that the "threat" is ever changing.
Malicious hackers make it their business to try and decode the
security systems designed to thwart their unscrupulous efforts.
Therefore, regulatory mandate of specific security methods would
be counterproductive. To do so would provide a blue print for the
malicious hacker, and would impede the industry's responsiveness
to an ever-changing security landscape.
Clearly, anyone who might want to put software of their own
choice onto the equipment they have purchased is "malicious" and
"unscrupulous" rather than "an honest competitor" or "a
customer".
And we can't have the laws merely say what is required, or that
would provide a "blue print" for people who want to do something
else.
The only really sane part of the Motorola document is the page and
a half at PDF 86, which says how tiny the
"SDR security threat" really is
. Basically it says that people who design mobile radio gear are
operating in a very tight design space and aren't going to put in
a whole lot of expensive flexibility that would allow operation in
multiple frequency bands, massive increases in output power, etc.
I.e. the whole inquiry is basically a sham. This sensible part of
the language never made it into the final report of the SDR Forum,
though. Typical.
John Gilmore