A Short Note on Secure Operating Systems ~ by Victor Yodaiken, CEO, FSMLabs
Sunday, July 04 2004 @ 07:16 AM EDT
Remarks attributed to Gene Spafford and Cynthia Irvine in EE-Times
and a marketing offensive by Green Hills against Linux don't provide an
accurate picture of software security issues for operating systems
and, in fact, add to the confusion. In what follows, I want to try to
move the discussion to a less emotional and more balanced basis. One
of my points is that security certifications have serious limitations
and costs. That's not to say that certifications are bad or useless --
far from it -- but certification is not a cure-all or without problems,
and people need to be able to distinguish between marketing and actual
engineering.
1. Professor Spafford's complaint about the "provenance" of code in
Linux's open development model is unfounded. There is no assurance
that any software development effort is free from people who have
bad intent or who just write lousy software. It's not even clear
that it is easier to get code into Linux than it is to get code
into other operating systems. In fact, because Linux code is
developed on an open model and is tracked by a comprehensive
source control system (see www.bkbits.com) it may be relatively
harder to smuggle malicious code into Linux. In any case,
"provenance" is a side issue, one that is easily turned into cheap
fear-mongering and xenophobia
(http://www.ghs.com/news/20040426_linux.html). For the same
reason, I very much dislike the terminology "subversive code"
which is an emotionally charged substitute for the standard term
"malicious code". The real issues are whether software is designed
and tested for security and whether the development process
assures certain levels of quality.
2. The state of the art should not be overestimated. When, according
to EE-Times, Professor Irvine contrasts Linux with
"'high-assurance' operating systems with the smarts to prove that
subverting code doesn't exist" we should understand that there are
no, none, zero, existing operating systems that can prove that
they don't contain malicious code or other security flaws. The
Common Criteria is a "best practices" standard developed by agencies
of several governments (in the US, the NSA and NIAP) but the
standard is relatively new and untested. It seems as if it should
produce more secure software, but we don't know for sure. Experts
like Bruce Schneier argue that standards themselves are of limited
use. When Linux is
criticized for not having Common Criteria certifications, we
should note that there are no operating systems certified at the
highest level of the Common Criteria and not even any widely
accepted Common Criteria specifications for the security of
embedded operating systems. In fact, there is considerable
disagreement among researchers over best methods, a shortage of
empirical data, and limits to what can be verified at the highest
level.
3. The Common Criteria standard defines "evaluation assurance levels"
(EAL) from 1 to 7, with 7 being the highest. These levels are
being used in a grossly misleading manner to try to sell software.
There are no existing EAL7 certified operating systems - not a
single one (except maybe hidden in some NSA lab). Furthermore, the
EAL is just a measure of the level of effort and rigor put into
proving that a software program satisfies a security
specification. If the specification does not correctly identify
the actual threats to the system, the software is not secure, no
matter what the level of evaluation assurance. A rigorous proof
that the Titanic is unsinkable in the Caribbean Sea is no comfort
on a voyage through the North Atlantic iceberg belt. Flooding
attacks that can cause a real-time operating system to have
scheduling delays are among the threats we consider in our design
and test process at FSMLabs. Green Hills is currently embarked on an
EAL6 (not 7) effort against a specification that does not have a
single real-time requirement. Proof that an operating system meets
a specification which does not address the flooding threat
provides no actual security assurance if the threat is possible.
One of the virtues of the Common Criteria is that it repeatedly
warns against claims like "my software is more secure than yours".
The Common Criteria framework encourages people to define "more
secure" against the actual type and role of the software and a
clear threat analysis. It would be unfortunate indeed if the
fundamental message of the Common Criteria was ignored, and it was
used only as a tool for frightening people into purchasing or not
purchasing software based on meaningless buzzwords.
(See http://eros.cs.jhu.edu/~shap/NT-EAL4.html for some similar comments and a very different point of view.)
4. Even the higher EALs are not ironclad, but they are very costly.
EAL7 only requires a "semi-formal" specification of the low level
implementation. Formal specification of complex software is not a
solved problem.
At the top, EAL7, level there are significant limitations on the
practicability of meeting the requirements, partly due to substantial
cost impact on the developer and evaluator activities, and also
because anything other than the simplest of products is likely to be
too complex to submit to current state-of-the-art techniques for
formal analysis.
(Cf: http://niap.nist.gov/cc-scheme/cc_docs/cc_introduction.pdf)
5. Despite Professor Spafford's complaints about the intrusion of
mere cost considerations into software purchase decisions, in the
real-world resources are limited and tradeoffs are inescapable.
Developers will limit functionality in an effort to limit the
costs of certification or just to make certification practical. If
a limited certified operating system causes the complexity of
applications to increase and the reliability of those applications
to decrease, use of that software may have a negative effect on
the security of the whole system. Is it more or less dangerous to
use Linux to control a power plant than it is to use an EAL5 (say)
OS? Suppose the EAL5 OS comes with no device drivers, costs enough
to reduce the amount of test time that can be used in development
or the number of trained operators used to monitor the plant, and
requires application developers to produce their own math library.
Suppose the Linux system is not connected to the network. Suppose
the EAL5 evaluation is against a specification that does not cover
the most likely threats. The answer is: you'd better do a real whole-systems
security analysis instead of relying on buzzwords.
6. Finally, I want to point out that the mere existence of a
"certified" version of some company's operating system does not
mean anything about the other software produced by that company.
For both Common Criteria and the FAA's DO-178 reliability
certification, the general practice is to set up a separate
development team and a separate, limited, product line. Most
projects, even the military projects that Green Hills CEO Dan
O'Dowd cites, are unable to bear the costs and/or the limitations
of the certified product lines and so purchase the standard
commercial versions which receive PR benefits, but not necessarily
any reliability or security benefits, from the certified product
lines. The interesting comparison between Linux or some other
solution is to the actual products being used, not the highest
assurance component sold by the vendor.
Software security requires strong engineering and solid cost/benefit
analysis, even though that is probably not the best marketing tactic
and it means we have to admit that there are no magic bullets to make
the problem go away.
©Victor Yodaiken, FSMLabs
Authored by: MathFox on Sunday, July 04 2004 @ 07:41 AM EDT
To find them faster
---
When people start to comment on the form of the message, it is a sign that they
have problems accepting the truth of the message.
Authored by: Trithemius on Sunday, July 04 2004 @ 08:07 AM EDT
Excellent bit of reading, thanks for sending it along.
Matters of security
are one of the last places you want vendors to be playing "buzzword
bingo". I read the notorious Green Hills CEO's remarks a while back, and was
not the least bit amused by them. His attacks on Linux and open source
development methodologies had the tone one would use when patting an unruly
child on the head and telling it to go play while the adults do the real
work - dismissive.
Simple rule of thumb: Never underestimate the capabilities of large numbers
of highly intelligent, highly motivated people. While this is far from
a panacea, it is not something to be dismissed lightly or with meaningless
buzzwords and invalid comparisons.
Authored by: moonbroth on Sunday, July 04 2004 @ 08:34 AM EDT
At times like this, I point people to Why Information
Security is Hard - an economic perspective [PDF], by Ross Anderson, Professor of Security
Engineering at Cambridge University. He makes some good points about the
shortcomings of the Common Criteria (etc.), within a wider discussion of
problems inherent in buying 'secure' products from a vendor who will not suffer
nearly as much as you do if their vaunted 'security' fails.
To put it
another way (more crudely than Professor Anderson would) -- wouldn't it be nice
if monopolists, shills and apologists suffered actual economic harm every time a
new critical vulnerability was exposed? But they don't, do they?
Authored by: Anonymous on Sunday, July 04 2004 @ 08:42 AM EDT
Another point that needs to be made is that some types of attack are much easier
depending on the underlying hardware. During the 1970s several manufacturers
introduced hardware that could effectively kill any attempt at buffer overflows.
Lamlaw mentions some of the capabilities of HP's HPPE environment and anyone who
has studied ICL's VME hardware will know of others.
All of these were
abandoned with the move to cheap microprocessor based systems as the chips were
originally designed for environments that did not need such protection and
no-one thought to demand it as they grew up. Hence the fuss about the NX bit
on the AMD 64.
Authored by: Totosplatz on Sunday, July 04 2004 @ 09:08 AM EDT
This is a community accustomed to discussion of matters related to a set of
legal cases and the issues related to those cases.
Your points may be
good, might even be excellent.
Where is P.J.? --- All the best
to one and all.
Authored by: Anonymous on Sunday, July 04 2004 @ 09:09 AM EDT
Spafford specifically said he did not want to single out Linux. All he said was
that some of the code in Linux was of unknown origin, which is a fair criticism.
Let's not sic the PR attack dogs on Spaf, a long-time and highly respected
member of the UNIX community.
-Dwight
Authored by: Anonymous on Sunday, July 04 2004 @ 09:32 AM EDT
I would have thought that if they wanted REALLY secure systems and were REALLY
worried about the code, it would be inspected character by character, regardless
of the effort required to do so. In a country with the population of the USA,
surely there are enough coders with the required security clearances to do
this?
But then... who will check their work... and who will check the checkers... ad
infinitum!
ANON... Sorry, lost my password :-)
Authored by: Anonymous on Sunday, July 04 2004 @ 10:30 AM EDT
I think Victor's note makes some excellent points. Further to this thread,
let me point out that the issue of secure operating systems has a very long
history and a relatively modest (not to say poor) record of success. The
kernelized secure operating system (KSOS) DARPA project of the late 70's was
an attempt to make a Un*x-like OS that was completely verified using formal
methods from the source code up. The project even invented a new secure
programming language (Euclid) and a formal verification protocol (EVES) for
the purpose.
The point I want to make is that even though people have been working
(hard) on this problem ever since then, there are some fundamentally difficult
problems involved for which no practical solution has yet been found. These
include such simple things as the fact that operating system source code is
compiled, and therefore we need to verify the compiler to at least the same
level of security (remember Ken Thompson's famous Turing Award lecture,
"Reflections on Trusting Trust"?),
and similarly for the library, the BIOS, the microcode, and the hardware
itself.
The most difficult problems have to do with verification technology; even if
we agree on a totally correct specification, the best anyone has ever been
able to do is to verify quite small pieces of code, even after all these years
of effort. And some of those mathematical verifications have themselves
been shown to be in error.
The upshot of all this experience is that realistically the only practical
thing one can do is raise levels of confidence in the correctness or
security level of a code base as large as an operating system kernel,
not prove it or certify it. And certainly it is much easier to raise levels
of confidence by having hundreds of thousands of people reading (and
yes, trying to find holes in) the code than by hiding it in a proprietary lab.
- Jim Cordy, former chief programmer, Toronto Euclid project.
Authored by: Anonymous on Sunday, July 04 2004 @ 10:34 AM EDT
I'm happy to report that Mr. Yodaiken's letter is also printed in the letters
section of the June 28, 2004 edition of Electronic Engineering Times. For some
reason, they did not see fit to identify him, but I suspect that most people
following embedded O/S developments know who he is.
Some publications ignore criticism of their reporting. EET has shown that they
are willing to print a rebuttal to statements made in a previous article. I
suspect that the editors will be taking a closer look at future articles on the
embedded software market.
EET still has a ways to go, though. The current issue has a front page article
on how Microsoft and Wind River have had to change their strategies in response
to Linux. I'm glad to see the coverage, but the article is a bit confused and
includes quotes from discredited sources such as Rob Enderle. It looks to me as
if the reporter hasn't yet found a quotable source who understands free and open
source software.
Trade journals live by sifting through press releases from competing companies,
separating the wheat from the chaff. Personal relationships with key people in
the industry help the reporters understand which products are significant, where
the budgets are being spent, and who to watch. Free software is hard to cover,
because so much of it occurs outside of the usual company marketing efforts.
Linux uses a different development model than many are used to, it forces a
different business model, and most reporters try to shoehorn it into the models
that they are used to. It doesn't fit.
FSMLabs competes with Microsoft and Green Hills. Reporters understand that.
Windows is Microsoft, and Microsoft is Windows; you can't separate the product
from the company. Reporters understand the identification of a product with a
company, that the company success depends on the product's success. FSMLabs may
be Linux, but Linux is not just FSMLabs. Reporters have a very hard time with
that concept. The idea that Linux can succeed regardless of any single company's
success or failure baffles most business reporters. That's what confuses Dan
Lyons, for example.
So let's give credit to, and provide help to, those reporters who are trying to
understand something outside of their previous experience. They won't get it
right all of the time, or even most of the time. Eventually they'll begin to
understand. Let's reserve our disdain for the reporters who relish their
tunnel-vision, the ones who refuse to learn because they already know it all.
-- hc
Authored by: Anonymous on Sunday, July 04 2004 @ 10:48 AM EDT
and any one of its items could harbor a security vulnerability or attack.
Just looking at the kernel or any one SW component is misguided.
Security is built in from the start and is multi-level and presents layered
defenses. No commercial OS is truly securable. For example, any HW
driver could harbour an attack, and HW drivers are third party SW.
It was discovered in the early 90's that any truly secure OS would break
practically all commercial applications and also create huge barriers to
file based access. Only then did security experts yield to the current state
of security affairs; only because it is impractical, not impossible.
Today, marketers and experts sound ignorant based on what has already
been tested and proven. Posix is securable but no one would accept it.
On the other hand, Windows is insecure by intent and design. Just look
at ActiveX.
Why would anyone accept spyware, adware, viruses, trojan SW, worms, zombies,
and other malware as part of modern computing?
Authored by: Anonymous on Sunday, July 04 2004 @ 10:52 AM EDT
People tend to gloss over it a bit, so it needs to be pointed out in glaring
flashing neon or something, and people need to be whacked over the head with it,
until they get it....
Linux (including all open source, GPL, GNU, etc) are subject to the largest
amount of peer-review on the planet. Every person on the net has access to view
the source code, to review it, and to submit problems, flaws, etc.
Code is subject to review by practically millions of people every day, the most
crucial code (eg: the linux kernel) is subject to the toughest review of all.
You never know when some comp-sci student in a moment of boredom, curiosity (or
otherwise) will decide to review that bit of code someone just submitted.
In fact this is how problems are found, and also how they are fixed so quickly,
because the reviewers mostly submit fixes or patches along with the problem
submission.
But even the casual user can submit "bug reports", like "hey this
program is doing xyz and I don't think it should be doing that..." or
whatever. Sure they may not be able to submit a fix, but once a problem is
identified it gets fixed very quickly.
The point being, it is practically impossible to hide malicious code from so
many prying eyes; in fact I'd dare say it would be completely impossible to hide
such code for any length of time. It's very difficult to hide something when you
have so many people reviewing constantly.
Analogy - you say its difficult to type in your password when someone is looking
over your shoulder... well, what would you say if there were a million people
crowded around you watching your every move? Could you type it in then and be
sure no one saw it? I certainly couldn't... I wouldn't even try!
Authored by: wap3 on Sunday, July 04 2004 @ 11:02 AM EDT
Here's a few link to my thoughts on the whole mess..... SCO, MS, IP,
politics/election/congress-critters........etc
idiocy
stupidity
delusions
mistakes
--wap3
Authored by: Thomas Frayne on Sunday, July 04 2004 @ 11:42 AM EDT
Post URLs and OT here.
LQWiki court
timeline.
Authored by: Anonymous on Sunday, July 04 2004 @ 01:00 PM EDT
Well, I suppose it's always nice to see rational arguments. Kind
of reminds you that humans CAN think rationally now and then.
But if you're a politician, this isn't how you win! You're not
dealing with a rational judge in a courtroom here, people!
You're dealing with Congress!
You do this with buzz words, PR, advertising, whatever you want
to call it. It started when John Kennedy looked better on TV
than Richard Nixon, and extended to George Bush's weapons
of mass destruction. There are a whole lot more popcorn eaters
than there are rational thinkers, and their votes count the same. (In
fact, think about it: Can you think rationally and eat popcorn at the
same time?)
Who's going to be more effective against Bush: Arthur Schlesinger
or Michael Moore? Who's going to be more effective against Kerry:
William F. Buckley or Ann Coulter?
Anyway, just look at a short list of political books and movies
on Amazon.com:
Boys on the Bus
The Selling of the President
The Buying of the President
The Marketing of the President
The Candidate
Fahrenheit 9/11
Now, the two big questions that remain:
Can Schwarzenegger deliver California to Bush?
Can we get Hunter Thompson to do a book on Darl McBride?
Authored by: Anonymous on Sunday, July 04 2004 @ 01:41 PM EDT
that Green Hills has taken up such an anti-linux FUD campaign. They actually
have some pretty good tools and products. My company has just completed a trade
study that included Linux and Integrity (among others). We chose Integrity
because it was the best fit for our application.
I'm sure that Linux is taking some of the embedded market share away from Green
Hills (and Wind River), but they still have a strong position in that market.
They have better tools and better support than the linux competition. I can't
see that changing anytime soon.
Authored by: futureweaver on Sunday, July 04 2004 @ 02:58 PM EDT
Long ago I did some work on formal verification of an embedded realtime
(preemptive and timeslice, message passing between processes, the whole thing
fitted in 1k) OS, and hard work it was. I managed to prove a few useful
assertions about properties that were expected to be preserved by process
switches, and similar such. The main difficulty, as I recall, was that small
changes to the code could invalidate the proofs, even if the assertions were
still true, so you had to do them again. The maths wasn't especially
complicated, just very tedious.
Authored by: Sunny Penguin on Sunday, July 04 2004 @ 03:25 PM EDT
Anyone who says Linux is insecure these days is an idiot.
Here is a link to
"Secret" Linux
code
--- Litigation is no substitute for Innovation.
Say No to SCO.
IMHO IANAL IAALINUXGEEK
Authored by: cricketjeff on Sunday, July 04 2004 @ 03:37 PM EDT
The knockers of the open source philosophy are very fond of pooh-poohing the
many eyes as being useless because they are unskilled. This is of course true
for some of the eyes, but not for all of them. The code is open for inspection
by everyone: good guys, bad guys, security experts, thieves, hackers, everyone.
Many of the best programmers alive are actively involved in the process, as are
many of the best in other IT fields. Defence (and offence) forces in many
countries are using Linux, BSD and many other open source projects, and they
are also contributing to them. I am certain military intelligence departments
are also looking at them very keenly, although they are rather less likely to
openly contribute! They will however presumably whisper to their masters that
it may not be a good idea to use them to guide missiles if they contain obvious
backdoors.
I have never seen in any criticism of F/OSS any problems identified that are not
more obviously applicable to SSS (secret source software). SSS people can and do
inspect OSS and if they find any problems they will jump up and down and point
at them. What they cannot understand is that we OSS people will be genuinely
grateful whereas a SSS person will probably attempt to sue anyone who does the
same service for him. Every day and in every way OSS gets better. Some days and
in some ways SSS gets better; as a result, in time SSS software will be so far
behind that it will simply die.
Authored by: Anonymous on Sunday, July 04 2004 @ 03:51 PM EDT
Just a few lazy Sunday afternoon thoughts by someone who has no treasonable
motives.
What is security? How much do you need? How much freedom are you prepared to
give up to get it?
Now the answer to these questions depends on who you are. Consider - a banking
system, a life-support system, a missile system, a web site, a word processor.
Now lots of people consider that Microsoft Windows provides adequate security
(as indicated by its large market share). Logically this means that (at least
for professional use) the costs associated with its well-known security defects
are less than the costs associated with the use of a more secure system; again,
the availability of more secure systems is well-known. (I presume that it is
well known, here in the UK the BBC always point out that the latest virus scare
only applies to Microsoft Windows users.) My bank sends credit cards through the
post, runs Microsoft Windows on its ATMs (I have seen one crash back to the M$
logo - I read that the problem was nationwide) and will even let you bank over
the Internet providing you use Microsoft Internet Explorer to do so (I don't).
Security may be defined in many ways (again depending on who you are). It may,
for the life-support system, mean reliability; for the web site, immunity to
deliberate attack. In any case, there is the presumption that the system
actually performs its intended function. A chain is only as strong as its
weakest link; what is the use of the most reliable operating system and the most
reliable application if the window system keeps crashing? I was trying to write
a quick letter this morning using Microsoft Word under Win2k (all quite reliable
and safe behind a Linux firewall); complete frustration from beginning to end -
M$ Word got itself totally confused. Now I need to consider whether the pain of
sorting out a mess like this (and it always happens when you are in a hurry) is
greater than the pain of learning OO.org. (This is a rhetorical question, not a
troll.)
I was reading a book about the first manned moon mission. Wernher von Braun
makes his comment about the system consisting of a million parts all made by the
lowest bidder. This made me think, "Why did it work?" The answer is, "Because
everybody involved wanted it to work". (Hardly a brilliant insight; the same
effect was behind the "Zero Defects Program".) Now what happens in most other
situations? Someone is under pressure to ship sub-standard goods, someone is
involved in dubious accounting, someone is known to be breaking the law and
getting away with it. In this sort of situation, employees become disillusioned
and will not give of their best, nor will they work together as a team; in fact
they may even want to see their employer fail.
I am a hardware engineer and as such am really rather appalled by the necessity
of a discussion on the security of software systems; after all, it is only a
sequence of instructions each having a precisely pre-determined action, the only
clever bit should be making it work around failures in the hardware. (Again, not
a troll.) We seem to have wandered aimlessly (or have been mercilessly driven)
down a route where even the hardware (with its millions of transistors) is not
deterministic anymore. To my mind, any system should be sub-dividable into
sections having clearly defined functions and having clearly defined inputs
and outputs. If, for reasons of economy of manufacture, the whole system is
eventually compiled into one huge binary file and one huge VHDL file, so be it,
but it must be possible to take it apart again! We once went through a phase
where microprocessor salesmen would advertise their products as being able to
run Microsoft Windows, this carried the implication that it might not because no
one actually knew what really happened inside the processor when this particular
piece of software was running. What was really happening of course was that the
design engineers were trying to build an equivalent of an Intel processor, the
workings of which they could never be allowed to see, to run a program with
ill-defined properties, the internal workings of which they would never be
allowed to see. (I think this situation was resolved by Microsoft starting to
test Windows software on non-Intel processors.) The x86 architecture is the
hardware equivalent of the Windows operating system, I had hoped that Linus
Torvalds was going to cure this with the Transmeta processor.
The whole hardware and software structure of the Wintel PC seems to be at odds
with the concept of security. On both the hardware and software levels, programs
and data are virtually interchangeable items. When we merely want to write a
letter or browse the Internet why do we use hardware and software that allows
the function of the computer to be permanently changed? The von Neumann
architecture of the x86 allows programs and data to reside in the same address
space whilst Windows extends this defect to the filing system.
Thus I find the current situation difficult on two counts. On the one hand the
industry is dominated by monolithic giants producing monolithic products behind
closed doors. On the other hand we have the open source community who at least
are holding the door open but are producing products which run on the still
fundamentally flawed products of said monoliths.
Authored by: Anonymous on Sunday, July 04 2004 @ 10:31 PM EDT
The biggest problem with Integrity, VxWorks, PSOS is that generally they suck
when it comes to doing anything other than what an RTOS is designed to do... so
using them isn't the right solution for nearly everything outside the embedded
world.
The Greenhills series of articles criticize SELinux and its insecurity. Never
mind that SELinux is not a RTOS Linux variant and lives in a different product
space.
They also totally ignore that DISA is moving to a net-centric architecture which
is pretty much insecure overall. Not to mention that many mission critical
legacy systems have been built on top of "insecure" unix variants
(solaris, hpux, aix, etc).
Therefore use of SELinux isn't any more or less secure than most other choices
except for maybe Solaris.
So the only reason they could be whining is that they expect MontaVista and
other RTOS Linux variants to steal market share. Except that the reason VxWorks
and PSOS (and presumably Integrity) suck for general-purpose development is that
they are streamlined for hard RT, and MontaVista and all the other RT Linuxes
I've seen aren't. They perform about the same as LynxOS, another Unix-like RTOS
with relatively poor RTOS specs. Which is why they have their own RT Linux
distro (BlueCat).
I've never seen LynxOS win an RTOS trade analysis except when you really weren't
hard real time anyway. The biggest problem they face is that many previous
"real time" applications weren't hard real time, but "needs to go fast enough"
real time. Which you can do today with RT Linux and a modern processor: you can
hit 50 ms deadlines this way, assuming you take some care and can afford a few
hits here and there and catch up.
Authored by: Anonymous on Sunday, July 04 2004 @ 11:44 PM EDT |
As a person having worked for a defense contractor, I found this comment
laughable:
"It's ridiculous," said Henderson of Mentor Graphics.
"Is he saying that he has no foreign employees? He has no one who could subvert
his code? He makes compilers that are used by the military. What's to stop one
of his employees from putting a backdoor into the code that's generated by the
compiler?"
Yes, he is saying that he has no foreign employees
working on the contract if the contract he has with the government calls for
that. This is why employee badges at defense contractors have easily
identifiable markings on them that help employees distinguish what type of
employee the person is. Foreign nationals are easily distinguished from U.S.
citizens with a secret clearance.
Secondly, you know what stops employees
from putting a backdoor in government software? Fear of the death penalty. Oh,
yes. Treason is one of the remaining crimes the U.S. Government still seeks
capital punishment for. This was made painfully clear to me when I became an
employee of a defense contractor. OSS just can't guarantee this.
- No, you don't get it... - Authored by: urzumph on Monday, July 05 2004 @ 01:12 AM EDT
- Exactly - Authored by: Anonymous on Monday, July 05 2004 @ 01:26 AM EDT
- Exactly, not! - Authored by: Anonymous on Monday, July 05 2004 @ 03:31 AM EDT
- Er, no... - Authored by: Anonymous on Monday, July 05 2004 @ 05:00 AM EDT
- Interesting to hear from the MIL side - Authored by: globularity on Monday, July 05 2004 @ 04:25 AM EDT
- And the bad guys are really scared of that... - Authored by: cricketjeff on Monday, July 05 2004 @ 07:45 AM EDT
- FOSS treasonous? Rediculous.. - Authored by: Anonymous on Monday, July 05 2004 @ 08:38 AM EDT
- Government secrecy is extremely detailed. - Authored by: Anonymous on Monday, July 05 2004 @ 11:22 AM EDT
- No, you don't get it... - Authored by: Anonymous on Monday, July 05 2004 @ 11:58 AM EDT
- No, YOU _obviously_ don't get it! - Authored by: tanstaafl on Monday, July 05 2004 @ 01:37 PM EDT
- No, you don't get it... - Authored by: Anonymous on Tuesday, July 06 2004 @ 12:34 AM EDT
- Too bad the military - Authored by: Tim Ransom on Tuesday, July 06 2004 @ 01:27 PM EDT
Authored by: bobn on Monday, July 05 2004 @ 12:27 AM EDT |
While in high school, Bill Gates and his friend Paul Allen wrote a scheduling
program for the school—which coincidentally placed the two in the same classes
as the prettiest girls in school. I believe this is in his book, "The Road
Ahead". Far from being ashamed of misusing the trust of his customers, he is
proud of it. So it should be no wonder that MSN's search engines mysteriously
have almost no hits for "Linux".
At the same time "Bill and his friends
changed files around to make it look like they weren’t on the computer as much
as they were. Their explorations and experiments banned them from the computer
for several weeks." from
http://members.fortunecity.com/billgatesbio/bill_gates_m.htm
This is the man
that wants to sell us "Trustworthy Computing"? Please, no.
--- IRC:
irc://irc.fdfnet.net/groklaw
the groklaw channels in IRC are not affiliated with, and not endorsed by,
either GrokLaw.net or PJ.
Authored by: Anonymous on Monday, July 05 2004 @ 02:01 AM EDT |
As someone who has worked on hard real time military aerospace embedded systems
for twenty years (and air vehicle flight control is about as hard real time as
it gets), I have dealt with trustworthy computing issues for quite a while;
however, I suspect that plenty of the people in the Linux community have not
dealt with those same issues. (Some obviously have.) It has only been in the
last several years that you could even think about having a real OS at the core
of a hard real time embedded system, especially one that had to fly on an
aircraft. So the RTOSes that can pass muster for a hard real time embedded
system tend to be pretty lean-and-mean, non-POSIX, etc. I agree with Gene
Spafford: I wouldn't want to fly in an airplane that had Linux as its RTOS; I'd
much prefer Integrity-178B or VxWorks AE.
That said, there is an entire world of computing that doesn't have to be hard
real time, even in the defense world (command and control systems come
immediately to mind), and unfortunately Dan O'Dowd is trying to sow FUD into the
middle of that market. Too bad; his product can compete on its own merits in
the proper marketplace without casting unwarranted aspersions on other worthy
products.
Finally, I wonder if there was more to the Spafford comments? It was pretty
easy to see he had a reasonable position if one has some background in the
defense industry, but for those outside, some of the quotes looked inflammatory.
The original reporter could have helped a lot by providing better context.
Authored by: Anonymous on Monday, July 05 2004 @ 04:17 AM EDT |
>Professor Irvine contrasts Linux with "'high-assurance' operating
systems with the smarts to prove that subverting code doesn't
exist"
Linux isn't an operating system. It's a kernel. Talking about
the security of "the Linux operating system" is about as useful as talking about
the performance of "the AMD PC" or for that matter "Microsoft Windows". Without
specifying exactly what distribution or combinations of components
you're talking about, it's meaningless.
I know that Mr Yodaiken isn't saying
that, but that he quotes others without correcting them doesn't give me a lot of
confidence that he knows what he's talking about. Clearly I'm not his
target audience, but even so, a token attempt to stick to the facts rather than
simply spinning in the other direction would be appreciated.
Authored by: blacklight on Monday, July 05 2004 @ 06:31 AM EDT |
The military and the defense establishment have their own priorities. The fact
is that computers developed so fast that the military was forced to revise its
procurement procedures, for example. At one point, Sun workstation exports from
the US to the former Soviet Union were a strictly controlled item. The NSA and
its defense allies were instrumental in outlawing cryptographic research in the
United States, with the result that, needless to say, it is thriving offshore.
Counterproductive? You bet.
Much as I respect the intelligence and effectiveness of the military officers I
have known, I am very happy to be a civilian and to have the freedom to be
effective as a civilian. I believe that the Open Source model will in general
far outclass in speed and effectiveness anything that the military and their
contractors can develop on their own under their own procedures. When it comes
down to betting billions of dollars against brains, I'll put my money on brains
any day of the week: the brain is still the original, unmatched, evolutionary
weapon of mass destruction - and neither the military nor its contractors have a
monopoly on it.
Authored by: Anonymous on Monday, July 05 2004 @ 07:54 AM EDT |
To briefly add some thoughts to this issue:
The June ACM Queue ran a number of interesting articles on software security.
http://www.acmqueue.org/
- One of the more interesting remarks: why not generate more secure code via a
compiler? Just as compilers nowadays do optimisation of code? This may help
discover certain 'programming oversights' (buffer overflows, missing checks for
boundary conditions or empty variables, errors in releasing memory, etc.).
- Another issue addressed: the role of trust. Ordinary users usually need to be
sys-admin on their home machines (e.g. when you're running Win XP; the lower
setting is unworkable for our (simple) home system!), which also means that,
e.g., your web-browser runs in sys-admin mode. Better trust management, and
sand-boxing of lower-level-trusted applications, may also prevent the scale and
proliferation of security issues. This, unfortunately, would require a major
redesign of, e.g., Win XP. Linux and MacOS fare better, fortunately.
And based on some thinking while driving:
A new programming paradigm may be needed to cope with designing and implementing
software systems which have to be 'secure' (according to some specification)
while their environment (the real world) changes so much and so fast... A
service-oriented, layered approach may be a more manageable solution.
Although the open source concept is promising, I know I do not have the time
(nor the detailed expertise) to check through, e.g., Linux code portions to
check for security issues. How to guarantee that the right 'eyeballs' are
focussed? And not just on code in isolation, but on the combination
(aggregation) of software components and modules which make up modern software
systems? How to reward people's efforts?
And that's enough thinking on this topic,
best regards to all Groklaw volunteers and readers,
Niek Wijngaards
PS: This is quoting from memory, I may be completely and utterly wrong, all
opinions are mine. Oh, and IANAL.
Authored by: Anonymous on Monday, July 05 2004 @ 08:12 AM EDT |
I will just point out a series of articles by Dan O'Dowd, CEO of Green
Hills Software, Inc. The name of the series is "Linux Security Controversy":
- FAA Safety-critical Certified Operating Systems Deliver The Reliability and Security Required by Defense Systems; Linux Does Not
- "Many Eyes" - No Assurance Against Many Spies
- Linux Security: Unfit for Retrofit
When it comes to security, it is very important not to be "religious" about the
different systems. Even as a long time Linux user (since before version 1.0 of
the kernel), I have no problems accepting the fact that there are situations
where Linux is not the best choice. And when all is said and done, that is what
they are saying...
--
Mike Andersen
mike@src.no
Authored by: Anonymous on Monday, July 05 2004 @ 11:38 AM EDT |
More total MS imperialist BS
Authored by: Anonymous on Monday, July 05 2004 @ 02:11 PM EDT |
The New York Times wrote an article today (Monday 7/5) that seemed on the
surface to imply Democrat John Kerry was supporting open source software.
Unfortunately, after reading the article, you see that Kerry's campaign, like
most webpages, uses LAMP.
This was the most misleading bit of fluff and might actually turn some
Republicans against open source.
I am sure members of both parties use LAMP for their websites, and we should
avoid supporting one party over the other until one adds Linux / FOSS /
anti-software patents to their party platform.
We should also look for instances of Microsoft supplying free software in
violation of the campaign contribution statutes.
Authored by: Anonymous on Monday, July 05 2004 @ 02:41 PM EDT |
I am an embedded system designer. The real world is not a PC, resources are
limited. Responsible designers actually look at what they put into a system. If
a section of code is not necessary, or is obfuscated, it doesn't get
included. That's why open source solutions like Linux are so attractive: they
can be stripped to just the bare minimum required to support the application.
If you really wanted to slip something into a system, you wouldn't necessarily
do it in the source code anyway. Slipping it in through the compiler, linker
etc. would be a lot more stealthy. Unless the project is small, most people do
not take the time to examine the assembly or machine code emitted. For that
matter, even simple things like makefiles are too often taken for granted. So,
if your entire tool chain is not certified and locked tight, you are kidding
yourself anyway.
Besides, system designers are more interested in and concerned about the
stochastic behavior of the OS (MS products suck in this regard). It seems only
PCs running MS products get a "pass" when they crash for no apparent reason.
Authored by: Anonymous on Monday, July 05 2004 @ 02:57 PM EDT |
It is not for the faint-of-heart.
Ref:
http://www.ghs.com/linux/unfit.html
Just a few quotes from "Part III - Linux Security: Unfit for
Retrofit"
"...But Linux's security problems can't be fixed, they are
systemic and long standing. There is no way to fix Linux to bring it up to the
level of security that is required for national defense systems, a level that is
already available in proprietary operating system..."
"Proprietary
operating system" - remember that phrase, it'll come up again...
"...If
secrecy isn't important to security, then why does Linus Torvalds keep the means
of accessing the core Linux development tree a secret from all but a few people?
Because if he published the details of his defenses, some jerk would break in
and screw up the Linux development
effort..."
huh?
"...Thousands of vulnerabilities escape
source level scrutiny of Linux every year..."
"thousands"? "every
year"? What?
"...It is a fundamental tenet of the open source movement
that Linux source code will always be published for everyone to see. It will
always be easy for our enemies to study Linux at their leisure in order to find
ways to exploit the many vulnerabilities that exist in every version of
Linux..."
"many vulnerabilites"..."in every version".. Oh. That's
right. You just proved that, above.
"..."Security through obscurity" is
a derisive slogan invented by the open source community to describe the practice
of hiding the source code of sloppy software to prevent attackers from finding
the vulnerabilities. Many open source advocates claim that anyone who doesn't
publish their source code does so only to hide the terrible quality of their
code, but this is just mindless prejudice..."
"Mindless prejudice"?
Let's see, read any recent examples of that? Is this some sort of
self-referential recursion? Are we falling into an infinite loop,
here?
"...Some people have argued that defense contractors could avoid
the largest security vulnerabilities of Linux (i.e. that much of Linux has been
developed offshore by unknown personnel and that the source code must be made
public exposing its capabilities and vulnerabilities to
attackers)..."
Yeah, dammit, it's all the fault of those great
unwashed colored masses that just happen to inhabit the rest of this planet. Too
bad the human race isn't limited to white, anglo-saxon males. (hmm.. That would
cause a technical problem though, wouldn't it?).
"...No self respecting
first-rate open source programmer would take a job building "proprietary"
software for the military..."
*puff* *puff* *puff*
Hang on: We're
building to a climax (orgasm?) of self-serving rhetoric:
"...Linux is
huge, it is not real-time, it is not well documented, and it has huge security
vulnerabilities. Defense contractors gain nothing but trouble by starting with
Linux. They should start with a small, fast, royalty-free, proven reliable, well
documented, secure real-time operating system developed under tight controls
that has been deployed in hundreds of military systems already: our INTEGRITY
real-time operating system..."
Oh! Gee! Surprise,
surprise...
Not only is this clown not a particularly credible shill, he's
not even a particularly good salesman.
Robert Enderle, step aside: you've
met your match, and his name is O'Dowd.
t_t_b --- Release the missing Exhibits!
Authored by: reuben on Monday, July 05 2004 @ 05:03 PM EDT |
Hi folks,
I just noticed that some significant things have
disappeared from SCO's public FTP site at ftp.sco.com. The directory /pub/LTP/SRPMS contained source
packages for Caldera's "Linux Technology Preview" (review
at Linux Today), based on an early Linux-2.4 kernel. It is now (July
5) empty. I most recently downloaded the relevant packages on June 5, but I
checked that they were present in the middle of June at least. Can anybody else
confirm that the packages were present more recently than June 5 or disappeared
earlier than July 5?
Interesting
packages:
- linux-2.4.0-0t3f2.src.rpm - an early 2.4 kernel. SCO
claimed to have ceased distributing these many months ago. I have a local copy
from June 5.
- kernel-addon-2.4.0-2.src.rpm - contains kernel
patches, including LiS-2.9beta4.tgz. SCO claimed never to have distributed LiS
with any Linux product. I have a local copy of this from June 5.
- Versions of binutils, glibc, and sysvinit, which were mentioned in Exhibit
28-G of the Declaration of Todd Shaughnessy (pdf) as no longer being
distributed by SCO. I don't have local copies of these packages from the SCO
ftp site.
Perhaps others have archived the whole directory; the
files in it prove that almost all of the information in Exhibit 28-G is false.
I hope and assume that IBM's lawyers already know all about it. Also, no need
to worry - SCO is still distributing Linux kernel sources by public FTP, just
not that particular one any more. Given their history I expect it's impossible
for SCO to really know what they have distributed and to whom, but I am
surprised about just how careless they are even now.
Interestingly,
there is at least one case where SCO is apparently distributing a Linux kernel
with no source. Anybody owning copyright to kernel code who would like to know
where to find this is welcome to contact me by clicking on my username, with
"Groklaw" in the subject line.
Authored by: Wesley_Parish on Monday, July 05 2004 @ 09:49 PM EDT |
Squashing Bugs at
the Source should perhaps provide an antidote for this sort of strong
medicine.
It makes sense - if security holes are bugs writ large,
and there are now tools in the Free/Libre Open Source Software community's
toolbox to squash bugs, then that argument of O'Dowd's in the EE Times article
is so much methane
afflatus. --- finagement: The Vampire's veins and Pacific torturers
stretching back through his own season. Well, cutting like a child on one of
these states of view, I duck
Authored by: RealProgrammer on Tuesday, July 06 2004 @ 12:23 PM EDT |
I don't want a secure operating system. If people believe their operating
system is "secure", they will be susceptible to attack both from
unknown flaws in the operating system and from other areas of their information
systems that they assume are protected by their "secure" operating
system, but are in fact vulnerable.
Strictly speaking, operating systems are not "secure". People have
security; operating systems are "trusted".
That's not mere pedantic semanticism on my part. Security is an emotion: we
feel secure, we have a security blanket. Computers and software do not have
emotions (at least, not yet).
In the military we say an area is "secure" not if it is known
absolutely to be free of danger, but if we believe, within the acceptable level
of effort, that the dangers in it are in the acceptable range and contained in
the area so secured. After an area is secured, we post guards to monitor the
area and alert us to a change.
In information security we define the acceptable level of risk and the
acceptable level of effort required to manage that risk. Any system subject to
use is vulnerable to misuse. With that in mind, we can say a system is
"secure" if the dangers it may pose to us are in an acceptable range
given currently practical methods of ascertaining risk to the things we want to
protect. What we are really saying is that we feel secure, or that at the time
we checked it out, that area of our information system was relatively free of
danger.
It's not possible to achieve complete trust in an operating system.
Specification documents, trusted languages, etc., are all exercises in
tail-chasing. Are the specifications and test suites perfect? Are they
provably flawless for all inputs? Of course not. They are at best provably
flawless for the environments we expect them to encounter. We don't know what
we don't know.
Our task, then, is to minimize our risks and to contain the damage caused by
failure. We must do this while maximizing the system's usefulness and keeping
within acceptable levels of effort. Sometimes we achieve a win-win, in which
security, robustness, and ease of use are all enhanced by some method that also
costs less. Sometimes we have to sacrifice one goal to achieve another.
It's generally accepted by security professionals that we have to put up
multiple layers of defense and keep an active watch on our defenses. An
operating system is just one layer. We err when we label an OS
"secure", because we cause ourselves and others to relax our
vigilance.
---
(I'm not a lawyer, but I know right from wrong)