OSWebSec: Fundamentals

From Soma-notes

RSCS September 18, 2012

Our CEO Mr. C had a dream the other day – I think we've been telling everyone to do things wrong. Everyone takes our advice, and then they have broken systems.

We need people to rethink what to do about it.

Start from the beginning and see if they come up with anything.

Mr. C's representative is Mr. A – from marketing – he can tell which products have succeeded and which have failed. Otherwise they won't pay us.

We risk losing customers if we secure things.

We have to have a good story about how we are going to secure things.

Two guys, Saltzer and Schroeder – securing information is easy unless you never want to share it with anyone. People like sharing things. If you were to tell the CEOs from the other companies – they like getting their YouTube and cat pictures – they like sharing. I don't think we can pull them off of the internet. Design principles – least privilege – some operating systems follow it more closely than others. The dominant operating system hasn't implemented it the same way.

I want to open those files – I want to see those design documents. It means there's an extra step: asking for permission, just for tasks that are more sensitive, like modifying sensitive information. Least privilege – give people just the things they need for their job. Give the CEO the root password? I like privileges. All the credit card companies say I should want privileges – this is not good marketing. Why do they care?

Phishing attacks

DOS

How did these guys tell you how to keep the website up?

SSL – but it doesn't work? Is SSL part of what these guys are telling us to use? They had a handshaking mechanism – security between the login terminal and the system. A secure channel – a primitive version of SSL. Transport layer security – we should use something similar to SSL to secure our connections.
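The modern descendant of that secure-channel idea is TLS, and today's libraries do the whole handshake for you. A minimal sketch in Python (the hostname `example.com` is a placeholder, not from the notes):

```python
import socket
import ssl

def open_tls(hostname: str, port: int = 443) -> ssl.SSLSocket:
    """Open a TLS connection; the ssl module performs the handshake
    and sets up the encrypted, authenticated channel."""
    context = ssl.create_default_context()  # certificate verification on by default
    sock = socket.create_connection((hostname, port))
    return context.wrap_socket(sock, server_hostname=hostname)

# The default context fails safe: it refuses unverified certificates
# and checks that the certificate matches the hostname.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

Calling `open_tls("example.com")` and then `tls.version()` would report the negotiated protocol, e.g. "TLSv1.3".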

Other ideas to secure: Design principles:

economy of mechanism
least privilege
open design
psychological acceptability
work factor
fail-safe defaults
complete mediation
separation of privilege
least common mechanism


a theme – seems so un-fun!

We've been doing this for 40 years – this has potential – we have to tell them again – say it a different way.

Implementations of design principles:

Access Control Lists – least privilege, complete mediation, fail-safe defaults, separation of privilege
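A toy sketch (the users, files, and function names are invented for illustration) of how one ACL mechanism can embody several principles at once: no entry means no access (fail-safe defaults), every operation goes through one check (complete mediation), and each user gets only the rights they need (least privilege):

```python
# (user, file) -> set of permitted operations
acl = {
    ("alice", "payroll.db"): {"read"},
    ("bob", "payroll.db"): {"read", "write"},
}

def check(user: str, obj: str, op: str) -> bool:
    """Mediate every access; the default is deny (fail-safe)."""
    return op in acl.get((user, obj), set())

print(check("alice", "payroll.db", "read"))   # True: granted explicitly
print(check("alice", "payroll.db", "write"))  # False: least privilege
print(check("carol", "payroll.db", "read"))   # False: no entry, fail-safe default
```

The key design choice is that absence of an entry denies access, rather than requiring an explicit "deny" rule.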

Some of the new stuff has holes too – attackers got in through the security mechanisms. What if we told them to start over? How many problems are caused by legacy code?

Various mechanisms – how do we instantiate this? Just have to build this and then it is okay. Verification? Formal proof is in the state-of-the-art section, but this has its issues as well. What am I telling them they are trying to stop? What are the big security problems my customers talk about?

Unauthorized access
Lose customer information / trade secrets / financial information
Lose money / embezzlement
Data loss
Uptime

ACL – policies for saying people shouldn't write cheques to themselves, to buy themselves a new pair of shoes –

limiting the scope of damage instead of stopping the damage from happening.

Audit – someone should look at the audit trail. Who looks at the audit trail for your phone? I don't think that helps.

The various attacks that systems fall to are all about getting past the barriers, getting past these mechanisms – having them doesn't necessarily mean that we are secure.

Do you see the point of the exercise? I was hoping the class would get less bleak. The Saltzer and Schroeder paper is a classic piece of literature.

They were the folks who consolidated it – this is the pattern everywhere – "we knew how to build secure systems in the 70s, but people didn't listen to us. We figured it out 30 years ago; why didn't you listen to us then?"

They have some advice – read through it – all sounds completely reasonable. Capabilities are the cool mechanisms that we keep talking about – look into the systems – omg that's so annoying. We have all these security mechanisms they describe. They can in principle work. But they are not necessarily tied to problems we have. These design principles are really about – something has broken somewhere in the system – so let's try to limit the damage that they can do from there. Someone's broken into your house, let's keep them in the main hallway.

It's not just about compartmentalizing the threats. Any compartment you have might slow the attacker down a bit, but then there's an alarm that sounds. You know something's wrong; people come along to look at the problem. There is a response. Let me limit the containers. There is a person sitting there watching the thing like a hawk, noticing anything bad that may happen. These are principles that people came up with in the 60s and wrote about in the 70s. You literally had an army of people per computer. Computers were more expensive back then.

Reference monitor concept – you should look at what their story is, what they say, and also what they don't say. Anytime you do security there is a threat model – you always try to secure against specific threats. There is implicitly certain kinds of threats that they are thinking about.

What threats are they thinking about?

Spies – who are the ones funding this? The military – they are thinking about computers that hold secrets. What do you want to do? Make sure the secrets don't go out. That's why they are talking about covert channels – maybe some untrusted program is running in the context of the secure information. If it can access the disk in a certain pattern, another program will notice (covert channels often involve resource sharing – communication where there isn't supposed to be communication). Think of two prisoners in separate cells who are not supposed to communicate – they knock on the wall; make it soundproof, and they slow down the guard by talking to them. If you slow them down by 10 seconds, it means something – that's the kind of thing: little signals that can convey small pieces of information. Is it worth the effort? Other covert channels can be quite large. Disk timing leaks all kinds of information from other parts of the system. Measuring power consumption. Some channels are more significant than others. Find some computation and make sure it's not communicating with someone else.
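The prisoner analogy can be sketched as a toy timing channel (entirely invented for illustration): the "sender" encodes a 1 by doing busy work and a 0 by returning immediately, and the "receiver" recovers the bit by timing how long the shared activity takes. Real channels use disk access patterns, cache contention, or power draw instead of a direct call.

```python
import time

SLOW = 0.02    # seconds of busy work that signals a 1
THRESH = 0.01  # receiver's decision threshold

def send_bit(bit: int) -> None:
    """Encode a bit as observable delay: busy-wait for a 1, return at once for a 0."""
    if bit:
        end = time.monotonic() + SLOW
        while time.monotonic() < end:  # contend for the shared resource (CPU time)
            pass

def recv_bit(sender, bit: int) -> int:
    """Recover the bit by timing the sender's activity."""
    start = time.monotonic()
    sender(bit)  # stands in for observing shared, timeable work
    return 1 if time.monotonic() - start > THRESH else 0

message = [1, 0, 1, 1, 0]
received = [recv_bit(send_bit, b) for b in message]
print(received)  # the message leaks out, one slow bit at a time
```

Notice the bandwidth problem the notes mention: each bit costs tens of milliseconds here, which is why attackers weigh whether a given channel is worth the effort.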

All talking about secrets. The stuff we are talking about is a different kind of secrets. I want to keep a database secure. I don't want customer information to leak. But the lesson of covert channels is that information has a way of getting out – especially if you want to be interacting and sharing, if you are constantly using it. A database online, constantly being accessed – that data can get out. They recognize that. Can we come up with mechanisms – simple mechanisms? Implementation complexity – we're talking about really slow, very small computers – they don't have a lot of memory or computational time to come up with things. Simple policies separating components so they don't blab. Availability – uptime? Where is that in here? If anything, all of these make things go a lot slower. They want perfect mechanisms. This is a military threat model – security has inherited that today. But their world is about not wanting to share information. Now our model is share, share, share. Now it's share but don't share. Digital rights management – I'm going to give this to you, but I still want it to kind of be secret; I only want this information to be used in certain ways. Not that these ideas are bad, per se. Some element of the data – but why? They are adapted from non-computer-based security.

Key component – people are implementing the mechanisms. When you introduce people – computers were too slow, too small; you were happy when the computer just ran. This is from a world where, taking the world of human security, people are assumed to really be a part of the system. This will hopefully give them some time to notice that there are problems. Most important – increasing the work factor: making it hard enough that attackers have to do many things. Each mechanism they have to break is potentially something that can be noticed. People are there to notice when the compartments break. The humans are there to make all this work – humans verify open design; complete mediation – observe all the parts of the interaction.