<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://homeostasis.scs.carleton.ca/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Nicholas+Laws</id>
	<title>Soma-notes - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://homeostasis.scs.carleton.ca/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Nicholas+Laws"/>
	<link rel="alternate" type="text/html" href="https://homeostasis.scs.carleton.ca/wiki/index.php/Special:Contributions/Nicholas_Laws"/>
	<updated>2026-05-02T09:54:24Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.42.1</generator>
	<entry>
		<id>https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20927</id>
		<title>SystemsSec 2016W Lecture 23</title>
		<link rel="alternate" type="text/html" href="https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20927"/>
		<updated>2016-04-06T04:30:19Z</updated>

		<summary type="html">&lt;p&gt;Nicholas Laws: /* Paper: Android Permissions Remystified */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Topics and Readings==&lt;br /&gt;
*Boxify&lt;br /&gt;
**Michael Backes et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/backes Boxify: Full-fledged App Sandboxing for Stock Android]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
*Android Permissions&lt;br /&gt;
**Primal Wijesekera et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/wijesekera Android Permissions Remystified: A Field Study on Contextual Integrity]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
&lt;br /&gt;
==Notes==&lt;br /&gt;
&lt;br /&gt;
===Midterm Discussion===&lt;br /&gt;
  • Midterms almost all marked&lt;br /&gt;
  • Midterms will be returned on Thursday (April 7th); on average people did badly, and we will discuss them in Thursday’s class&lt;br /&gt;
  • Question 1 was answered best overall; Anil had trouble believing people had actually used the system when they failed to supply enough detail&lt;br /&gt;
  • Question 2: most people did not address all aspects of the question, or argued for things that simply were not true.&lt;br /&gt;
      o	Ex. Very few OSes are verified, but lots of people claimed they were.&lt;br /&gt;
  • Question 3 also had several problems; he was extremely lenient about what qualified as a system (nowhere did the question say it had to be a computer system)&lt;br /&gt;
  • Example System: A Man carrying a suitcase full of cash&lt;br /&gt;
      o	Threat #1: Someone will steal the case&lt;br /&gt;
        -  Defense:  Get a bodyguard&lt;br /&gt;
            • Vulnerability: Guard could be bribed or could abandon you&lt;br /&gt;
      o	Threat #2: Hyperinflation reduces value of case contents to nothing&lt;br /&gt;
        -  Defense: Banks/Mints&lt;br /&gt;
            • Vulnerability: Currency minting plates get stolen&lt;br /&gt;
  • General Comment: FOLLOW THE FULL INSTRUCTIONS, BE SPECIFIC.&lt;br /&gt;
  • Concerns about time pressure have Anil considering 4 questions for the final&lt;br /&gt;
  • Anil tried to figure out exactly what problems people had with the midterm&lt;br /&gt;
  • People cited time pressure, not knowing what to study, and having issues writing “essays”&lt;br /&gt;
  • Anil pointed out that for Q2 in particular, the textbook covered the elements of a secure OS at the end of every chapter, with specific examples.&lt;br /&gt;
  • Anil reinforced the idea that PEOPLE NEED TO READ THE QUESTION, AND ANSWER EVERY COMPONENT.&lt;br /&gt;
  • What are we supposed to get out of the papers?&lt;br /&gt;
    o Learn the patterns in the papers&lt;br /&gt;
    o If a paper says “x” is secure, you should be asking: what is the threat model, where is the proof that the attack/defense works as described, and how do you know?&lt;br /&gt;
    o Trying to learn the proper way to think about security related problems&lt;br /&gt;
  • Example question related to papers (something similar to this is likely to be present on the exam):&lt;br /&gt;
    o For an attack paper, discuss x, y, z.&lt;br /&gt;
    o For a defense paper, discuss x,y,z.&lt;br /&gt;
  • People complained about vagueness and lack of structure; Anil reminded everyone that after you graduate, very few things are well defined or structured.&lt;br /&gt;
  • Ask yourself what you have learned, and use it to answer the exam questions.&lt;br /&gt;
  • Your success in learning is shown best via critical thinking.&lt;br /&gt;
  • PLEASE READ THE QUESTIONS FULLY!!!&lt;br /&gt;
    o And review your answer to make sure you address every single thing he asked.&lt;br /&gt;
&lt;br /&gt;
===Paper: Boxify===&lt;br /&gt;
&lt;br /&gt;
  • Sandboxes applications&lt;br /&gt;
  • It builds a reference monitor for individual applications&lt;br /&gt;
    o This makes up for issues in the base Android monitor&lt;br /&gt;
  • Paper discusses OS and Application Modifications:&lt;br /&gt;
    o OS Modifications:&lt;br /&gt;
      - Requires flashing the OS, not very accessible or easy for users&lt;br /&gt;
    o Application Modifications:&lt;br /&gt;
      - No boundary between reference monitor and application at base&lt;br /&gt;
        • Full mediation and tamper proofing not possible&lt;br /&gt;
  • Uses a new(ish) Android mechanism to load an application in a fully isolated process, then incrementally exposes functionality/permissions through a reference monitor in a different process&lt;br /&gt;
    o This grants (mostly) full mediation (misses Kernel Interface) and a decent amount of tamper proofing&lt;br /&gt;
  • Android is based on system calls and intents&lt;br /&gt;
  • Boxify shows several hallmarks of secure OS architecture, e.g. a reference monitor implementing a security policy&lt;br /&gt;
  • Trusting only a small portion of the system to be secure, vs. trusting the entire system to be secure&lt;br /&gt;
  • Is the chosen strategy a good one?&lt;br /&gt;
    o Assuming Boxify is the very first app you install, sure.&lt;br /&gt;
  • Boxify re-implements the entire Android permission model&lt;br /&gt;
  • Ideally a reference monitor has little to no functionality&lt;br /&gt;
  • Boxify “fails” safely: if it is forced to close, the instances in the sandbox “starve” and die, since the only mechanism to interact with them (the Boxify reference monitor) has been closed.&lt;br /&gt;
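The reference-monitor pattern described above can be sketched as follows. This is a minimal conceptual illustration only, not Boxify’s actual implementation; the Policy/ReferenceMonitor names and the “gps” service are made up for the example.&lt;br /&gt;

```python
# Sketch: a reference monitor as a proxy that mediates every request from a
# sandboxed app against a policy. All names here are illustrative.

class Policy:
    def __init__(self, granted):
        # Set of (app, permission) pairs that are allowed.
        self.granted = set(granted)

    def allows(self, app, permission):
        return (app, permission) in self.granted

class ReferenceMonitor:
    """Sits between isolated app processes and the trusted back-end services."""
    def __init__(self, policy, services):
        self.policy = policy
        self.services = services  # service name -> callable

    def request(self, app, permission, service, *args):
        # Full mediation: every call funnels through this single check.
        if not self.policy.allows(app, permission):
            raise PermissionError(f"{app} denied {permission}")
        return self.services[service](*args)

policy = Policy({("maps", "location")})
monitor = ReferenceMonitor(policy, {"gps": lambda: (45.4, -75.7)})

print(monitor.request("maps", "location", "gps"))  # allowed, returns coordinates
# monitor.request("game", "location", "gps")       # would raise PermissionError
```

The point of the pattern is the single choke point: the sandboxed app holds no privileges of its own, so if the monitor process dies, the app has no remaining way to reach the system (the “fail safe” behaviour noted above).&lt;br /&gt;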
&lt;br /&gt;
===Paper: Android Permissions Remystified===&lt;br /&gt;
  • Applications access private information more than the user expects&lt;br /&gt;
  • Apps should stay within context; e.g. a map application can have the user’s location, but only when the user is actively using it, not when it is in the background.&lt;br /&gt;
  • Bad practice to just ask the user for permission once on install/first run; contexts change.&lt;br /&gt;
  • Some students found it surprising that 30% of the people from the user survey didn’t care how their info was used/misused&lt;br /&gt;
  • One student expressed the view that user studies were not trustworthy and that they refused to look at them.&lt;br /&gt;
  • Why are user studies important?&lt;br /&gt;
    o Security Engineers keep building tools users cannot use (or cannot use properly)&lt;br /&gt;
    o They are a “club” to hit people with when they think they know the user’s needs better than the user does&lt;br /&gt;
    o Studies used to prove that “X” is a good/bad idea&lt;br /&gt;
    o Used to backup claims / results&lt;br /&gt;
  • Do application developers want users to properly understand how permissions work?&lt;br /&gt;
    o No. They make more ad revenue in the current system.&lt;br /&gt;
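The contextual idea above (a permission grant should depend on the situation, not only on the app) can be sketched as a toy policy function. This is illustrative only; the app/permission names and the foreground flag are assumptions for the example, not the paper’s implementation.&lt;br /&gt;

```python
# Sketch of context-dependent permission checking: the decision depends on
# both who is asking and the context in which they ask.

def decide(app, permission, context):
    # A map app may read location only while the user is actively using it.
    rules = {
        ("maps", "location"): lambda ctx: ctx["foreground"],
    }
    rule = rules.get((app, permission))
    # Unknown (app, permission) pairs are denied by default.
    return bool(rule and rule(context))

print(decide("maps", "location", {"foreground": True}))   # True
print(decide("maps", "location", {"foreground": False}))  # False: background denied
```

Contrast this with install-time prompts, where the same request is answered once and the answer never changes even when the context does.&lt;br /&gt;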
&lt;br /&gt;
===Anil: &amp;quot;Where the research is&amp;quot;===&lt;br /&gt;
  • Basing a security system on reference monitors or raw cryptography is silly&lt;br /&gt;
    o The concept of reference monitors is broken&lt;br /&gt;
    o Cryptography is “fragile”&lt;br /&gt;
  • Idea of diverse implementations so that one threat or attack does not compromise all systems&lt;br /&gt;
    o Proof of Concept: Living Systems&lt;br /&gt;
  • Computers CANNOT blindly trust anyone.&lt;br /&gt;
  • Security is a trade off with functionality&lt;/div&gt;</summary>
		<author><name>Nicholas Laws</name></author>
	</entry>
	<entry>
		<id>https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20923</id>
		<title>SystemsSec 2016W Lecture 23</title>
		<link rel="alternate" type="text/html" href="https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20923"/>
		<updated>2016-04-06T04:27:10Z</updated>

		<summary type="html">&lt;p&gt;Nicholas Laws: /* Midterm Discussion */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Topics and Readings==&lt;br /&gt;
*Boxify&lt;br /&gt;
**Michael Backes et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/backes Boxify: Full-fledged App Sandboxing for Stock Android]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
*Android Permissions&lt;br /&gt;
**Primal Wijesekera et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/wijesekera Android Permissions Remystified: A Field Study on Contextual Integrity]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
&lt;br /&gt;
==Notes==&lt;br /&gt;
&lt;br /&gt;
===Midterm Discussion===&lt;br /&gt;
  • Midterms almost all marked&lt;br /&gt;
  • Midterms will be returned on Thursday (April 7th), on average people did badly, we will discuss them in Thursday’s class&lt;br /&gt;
  • Question 1 was answered best overall, Anil had issues believing people had actually used the system before when they failed to supply enough detail&lt;br /&gt;
  • Question 2: most people just did not address all aspects of the question, or argued for things that just were not true.&lt;br /&gt;
      o	Ex. Very few OSes are verified, but lots of people claimed they were.&lt;br /&gt;
  • Question 3 also had several problems; he was extremely lenient about what qualified as a system (nowhere did the question say it had to be a computer system)&lt;br /&gt;
  • Example System: A Man carrying a suitcase full of cash&lt;br /&gt;
      o	Threat #1: Someone will steal the case&lt;br /&gt;
          Defense:  Get a bodyguard&lt;br /&gt;
            • Vulnerability: Guard could be bribed or could abandon you&lt;br /&gt;
      o	Threat #2: Hyperinflation reduces value of case contents to nothing&lt;br /&gt;
          Defense: Banks/Mints&lt;br /&gt;
            • Vulnerability: Currency minting plates get stolen&lt;br /&gt;
  • General Comment: FOLLOW THE FULL INSTRUCTIONS, BE SPECIFIC.&lt;br /&gt;
  • Concerns of time pressure leading to Anil thinking of 4 questions for the final&lt;br /&gt;
  • Anil tried to figure out exactly what problems people had with midterm&lt;br /&gt;
  • People cited time pressure, not knowing what to study, and having issues writing “essays”&lt;br /&gt;
  • Anil pointed out that for Q2 in particular, the textbook covered the elements of a secure OS at the end of every chapter, with specific examples.&lt;br /&gt;
  • Anil reinforced the idea that PEOPLE NEED TO READ THE QUESTION, AND ANSWER EVERY COMPONENT.&lt;br /&gt;
  • What are we supposed to get out of the papers?&lt;br /&gt;
    o Learn the patterns in the papers&lt;br /&gt;
    o If a paper says “x” is secure, you should be asking what the threat model is, where is the proof the attack/defense works like it is explained, how do you know?&lt;br /&gt;
    o Trying to learn the proper way to think about security related problems&lt;br /&gt;
  • Example question related to papers (something similar to this is likely to be present on the exam):&lt;br /&gt;
    o For an attack paper, discuss x, y, z.&lt;br /&gt;
    o For a defense paper, discuss x,y,z.&lt;br /&gt;
  • People complained about vagueness and lack of structure, Anil reminded everyone that after you graduate, very few things are well defined or structured.&lt;br /&gt;
  • Ask yourself what you have learned, and use it to answer the exam questions.&lt;br /&gt;
  • Your success in learning is shown best via critical thinking.&lt;br /&gt;
  • PLEASE READ THE QUESTIONS FULLY!!!&lt;br /&gt;
    o And review your answer to make sure you address every single thing he asked.&lt;br /&gt;
&lt;br /&gt;
===Paper: Boxify===&lt;br /&gt;
&lt;br /&gt;
  • Sandboxes applications&lt;br /&gt;
  • It builds a reference monitor for individual applications&lt;br /&gt;
    o This makes up for issues in the base Android monitor&lt;br /&gt;
  • Paper discusses OS and Application Modifications:&lt;br /&gt;
    o OS Modifications:&lt;br /&gt;
      - Requires flashing the OS, not very accessible or easy for users&lt;br /&gt;
    o Application Modifications:&lt;br /&gt;
      - No boundary between reference monitor and application at base&lt;br /&gt;
        • Full mediation and tamper proofing not possible&lt;br /&gt;
  • Uses a new(ish) Android mechanism for loading an application in a fully isolated process, then slowly implementing functionality/permissions using a reference monitor in a different process&lt;br /&gt;
    o This grants (mostly) full mediation (misses Kernel Interface) and a decent amount of tamper proofing&lt;br /&gt;
  • Android is based on system calls and intents&lt;br /&gt;
  • Boxify shows several hallmarks of secure OS architecture: a reference monitor implementing a security policy&lt;br /&gt;
  • Trusting only a small portion of the system to be secure, vs. trusting the entire system to be secure&lt;br /&gt;
  • Is the chosen strategy a good one?&lt;br /&gt;
    o Assuming Boxify is the very first app you install, sure.&lt;br /&gt;
  • Boxify re-implements the entire Android permission model&lt;br /&gt;
  • Ideally a reference monitor has little to no functionality&lt;br /&gt;
  • Boxify “fails” safely: if it is forced to close, the instances in the sandbox “starve” and die, since the only mechanism to interact with them (the Boxify reference monitor) has been closed.&lt;br /&gt;
&lt;br /&gt;
===Paper: Android Permissions Remystified===&lt;br /&gt;
  - Placeholder&lt;br /&gt;
•	Applications access private information more than the user expects&lt;br /&gt;
•	Apps should stay within context; e.g., a map application can have the user’s location, but only when the user is actively using it, not when it is in the background.&lt;br /&gt;
•	Bad practice to just ask user for permission once on install/first run, contexts change.&lt;br /&gt;
•	Some students found it surprising that 30% of the people from the user survey didn’t care how their info was used/misused&lt;br /&gt;
•	One student expressed the view that user studies were not trustworthy and that they refused to look at them.&lt;br /&gt;
•	Why are user studies important?&lt;br /&gt;
o	Security Engineers keep building tools users cannot use (or cannot use properly)&lt;br /&gt;
o	They are a “club” to hit people with when they think they know the user’s needs better than the user does&lt;br /&gt;
o	Studies used to prove that “X” is a good/bad idea&lt;br /&gt;
o	Used to back up claims / results&lt;br /&gt;
•	Do application developers want users to properly understand how permissions work?&lt;br /&gt;
o	No. They make more ad revenue in the current system.&lt;br /&gt;
&lt;br /&gt;
===Anil: &amp;quot;Where the research is&amp;quot;===&lt;br /&gt;
  - Placeholder&lt;/div&gt;</summary>
		<author><name>Nicholas Laws</name></author>
	</entry>
	<entry>
		<id>https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20922</id>
		<title>SystemsSec 2016W Lecture 23</title>
		<link rel="alternate" type="text/html" href="https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20922"/>
		<updated>2016-04-06T04:25:46Z</updated>

		<summary type="html">&lt;p&gt;Nicholas Laws: /* Paper: Android Permissions Remystified */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Topics and Readings==&lt;br /&gt;
*Boxify&lt;br /&gt;
**Michael Backes et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/backes Boxify: Full-fledged App Sandboxing for Stock Android]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
*Android Permissions&lt;br /&gt;
**Primal Wijesekera et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/wijesekera Android Permissions Remystified: A Field Study on Contextual Integrity]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
&lt;br /&gt;
==Notes==&lt;br /&gt;
&lt;br /&gt;
===Midterm Discussion===&lt;br /&gt;
  • Midterms almost all marked&lt;br /&gt;
  • Midterms will be returned on Thursday (April 7th), on average people did badly, we will discuss them in Thursday’s class&lt;br /&gt;
  • Question 1 was answered best overall, Anil had issues believing people had actually used the system before when they failed to supply enough detail&lt;br /&gt;
  • Question 2: most people just did not address all aspects of the question, or argued for things that just were not true.&lt;br /&gt;
      o	Ex. Very few OSes are verified, but lots of people claimed they were.&lt;br /&gt;
  • Question 3 also had several problems; he was extremely lenient about what qualified as a system (nowhere did the question say it had to be a computer system)&lt;br /&gt;
  • Example System: A Man carrying a suitcase full of cash&lt;br /&gt;
      o	Threat #1: Someone will steal the case&lt;br /&gt;
          Defense:  Get a bodyguard&lt;br /&gt;
            • Vulnerability: Guard could be bribed or could abandon you&lt;br /&gt;
      o	Threat #2: Hyperinflation reduces value of case contents to nothing&lt;br /&gt;
          Defense: Banks/Mints&lt;br /&gt;
            • Vulnerability: Currency minting plates get stolen&lt;br /&gt;
  • General Comment: FOLLOW THE FULL INSTRUCTIONS, BE SPECIFIC.&lt;br /&gt;
  • Concerns of time pressure leading to Anil thinking of 4 questions for the final&lt;br /&gt;
&lt;br /&gt;
===Paper: Boxify===&lt;br /&gt;
&lt;br /&gt;
  • Sandboxes applications&lt;br /&gt;
  • It builds a reference monitor for individual applications&lt;br /&gt;
    o This makes up for issues in the base Android monitor&lt;br /&gt;
  • Paper discusses OS and Application Modifications:&lt;br /&gt;
    o OS Modifications:&lt;br /&gt;
      - Requires flashing the OS, not very accessible or easy for users&lt;br /&gt;
    o Application Modifications:&lt;br /&gt;
      - No boundary between reference monitor and application at base&lt;br /&gt;
        • Full mediation and tamper proofing not possible&lt;br /&gt;
  • Uses a new(ish) Android mechanism for loading an application in a fully isolated process, then slowly implementing functionality/permissions using a reference monitor in a different process&lt;br /&gt;
    o This grants (mostly) full mediation (misses Kernel Interface) and a decent amount of tamper proofing&lt;br /&gt;
  • Android is based on system calls and intents&lt;br /&gt;
  • Boxify shows several hallmarks of secure OS architecture: a reference monitor implementing a security policy&lt;br /&gt;
  • Trusting only a small portion of the system to be secure, vs. trusting the entire system to be secure&lt;br /&gt;
  • Is the chosen strategy a good one?&lt;br /&gt;
    o Assuming Boxify is the very first app you install, sure.&lt;br /&gt;
  • Boxify re-implements the entire Android permission model&lt;br /&gt;
  • Ideally a reference monitor has little to no functionality&lt;br /&gt;
  • Boxify “fails” safely: if it is forced to close, the instances in the sandbox “starve” and die, since the only mechanism to interact with them (the Boxify reference monitor) has been closed.&lt;br /&gt;
&lt;br /&gt;
===Paper: Android Permissions Remystified===&lt;br /&gt;
  - Placeholder&lt;br /&gt;
•	Applications access private information more than the user expects&lt;br /&gt;
•	Apps should stay within context; e.g., a map application can have the user’s location, but only when the user is actively using it, not when it is in the background.&lt;br /&gt;
•	Bad practice to just ask user for permission once on install/first run, contexts change.&lt;br /&gt;
•	Some students found it surprising that 30% of the people from the user survey didn’t care how their info was used/misused&lt;br /&gt;
•	One student expressed the view that user studies were not trustworthy and that they refused to look at them.&lt;br /&gt;
•	Why are user studies important?&lt;br /&gt;
o	Security Engineers keep building tools users cannot use (or cannot use properly)&lt;br /&gt;
o	They are a “club” to hit people with when they think they know the user’s needs better than the user does&lt;br /&gt;
o	Studies used to prove that “X” is a good/bad idea&lt;br /&gt;
o	Used to back up claims / results&lt;br /&gt;
•	Do application developers want users to properly understand how permissions work?&lt;br /&gt;
o	No. They make more ad revenue in the current system.&lt;br /&gt;
&lt;br /&gt;
===Anil: &amp;quot;Where the research is&amp;quot;===&lt;br /&gt;
  - Placeholder&lt;/div&gt;</summary>
		<author><name>Nicholas Laws</name></author>
	</entry>
	<entry>
		<id>https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20921</id>
		<title>SystemsSec 2016W Lecture 23</title>
		<link rel="alternate" type="text/html" href="https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20921"/>
		<updated>2016-04-06T04:25:20Z</updated>

		<summary type="html">&lt;p&gt;Nicholas Laws: /* Paper: Boxify */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Topics and Readings==&lt;br /&gt;
*Boxify&lt;br /&gt;
**Michael Backes et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/backes Boxify: Full-fledged App Sandboxing for Stock Android]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
*Android Permissions&lt;br /&gt;
**Primal Wijesekera et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/wijesekera Android Permissions Remystified: A Field Study on Contextual Integrity]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
&lt;br /&gt;
==Notes==&lt;br /&gt;
&lt;br /&gt;
===Midterm Discussion===&lt;br /&gt;
  • Midterms almost all marked&lt;br /&gt;
  • Midterms will be returned on Thursday (April 7th), on average people did badly, we will discuss them in Thursday’s class&lt;br /&gt;
  • Question 1 was answered best overall, Anil had issues believing people had actually used the system before when they failed to supply enough detail&lt;br /&gt;
  • Question 2: most people just did not address all aspects of the question, or argued for things that just were not true.&lt;br /&gt;
      o	Ex. Very few OSes are verified, but lots of people claimed they were.&lt;br /&gt;
  • Question 3 also had several problems; he was extremely lenient about what qualified as a system (nowhere did the question say it had to be a computer system)&lt;br /&gt;
  • Example System: A Man carrying a suitcase full of cash&lt;br /&gt;
      o	Threat #1: Someone will steal the case&lt;br /&gt;
          Defense:  Get a bodyguard&lt;br /&gt;
            • Vulnerability: Guard could be bribed or could abandon you&lt;br /&gt;
      o	Threat #2: Hyperinflation reduces value of case contents to nothing&lt;br /&gt;
          Defense: Banks/Mints&lt;br /&gt;
            • Vulnerability: Currency minting plates get stolen&lt;br /&gt;
  • General Comment: FOLLOW THE FULL INSTRUCTIONS, BE SPECIFIC.&lt;br /&gt;
  • Concerns of time pressure leading to Anil thinking of 4 questions for the final&lt;br /&gt;
&lt;br /&gt;
===Paper: Boxify===&lt;br /&gt;
&lt;br /&gt;
  • Sandboxes applications&lt;br /&gt;
  • It builds a reference monitor for individual applications&lt;br /&gt;
    o This makes up for issues in the base Android monitor&lt;br /&gt;
  • Paper discusses OS and Application Modifications:&lt;br /&gt;
    o OS Modifications:&lt;br /&gt;
      - Requires flashing the OS, not very accessible or easy for users&lt;br /&gt;
    o Application Modifications:&lt;br /&gt;
      - No boundary between reference monitor and application at base&lt;br /&gt;
        • Full mediation and tamper proofing not possible&lt;br /&gt;
  • Uses a new(ish) Android mechanism for loading an application in a fully isolated process, then slowly implementing functionality/permissions using a reference monitor in a different process&lt;br /&gt;
    o This grants (mostly) full mediation (misses Kernel Interface) and a decent amount of tamper proofing&lt;br /&gt;
  • Android is based on system calls and intents&lt;br /&gt;
  • Boxify shows several hallmarks of secure OS architecture: a reference monitor implementing a security policy&lt;br /&gt;
  • Trusting only a small portion of the system to be secure, vs. trusting the entire system to be secure&lt;br /&gt;
  • Is the chosen strategy a good one?&lt;br /&gt;
    o Assuming Boxify is the very first app you install, sure.&lt;br /&gt;
  • Boxify re-implements the entire Android permission model&lt;br /&gt;
  • Ideally a reference monitor has little to no functionality&lt;br /&gt;
  • Boxify “fails” safely: if it is forced to close, the instances in the sandbox “starve” and die, since the only mechanism to interact with them (the Boxify reference monitor) has been closed.&lt;br /&gt;
&lt;br /&gt;
===Paper: Android Permissions Remystified===&lt;br /&gt;
  - Placeholder&lt;br /&gt;
&lt;br /&gt;
===Anil: &amp;quot;Where the research is&amp;quot;===&lt;br /&gt;
  - Placeholder&lt;/div&gt;</summary>
		<author><name>Nicholas Laws</name></author>
	</entry>
	<entry>
		<id>https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20920</id>
		<title>SystemsSec 2016W Lecture 23</title>
		<link rel="alternate" type="text/html" href="https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20920"/>
		<updated>2016-04-06T04:22:06Z</updated>

		<summary type="html">&lt;p&gt;Nicholas Laws: /* Midterm Discussion */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Topics and Readings==&lt;br /&gt;
*Boxify&lt;br /&gt;
**Michael Backes et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/backes Boxify: Full-fledged App Sandboxing for Stock Android]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
*Android Permissions&lt;br /&gt;
**Primal Wijesekera et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/wijesekera Android Permissions Remystified: A Field Study on Contextual Integrity]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
&lt;br /&gt;
==Notes==&lt;br /&gt;
&lt;br /&gt;
===Midterm Discussion===&lt;br /&gt;
  • Midterms almost all marked&lt;br /&gt;
  • Midterms will be returned on Thursday (April 7th), on average people did badly, we will discuss them in Thursday’s class&lt;br /&gt;
  • Question 1 was answered best overall, Anil had issues believing people had actually used the system before when they failed to supply enough detail&lt;br /&gt;
  • Question 2: most people just did not address all aspects of the question, or argued for things that just were not true.&lt;br /&gt;
      o	Ex. Very few OSes are verified, but lots of people claimed they were.&lt;br /&gt;
  • Question 3 also had several problems; he was extremely lenient about what qualified as a system (nowhere did the question say it had to be a computer system)&lt;br /&gt;
  • Example System: A Man carrying a suitcase full of cash&lt;br /&gt;
      o	Threat #1: Someone will steal the case&lt;br /&gt;
          Defense:  Get a bodyguard&lt;br /&gt;
            • Vulnerability: Guard could be bribed or could abandon you&lt;br /&gt;
      o	Threat #2: Hyperinflation reduces value of case contents to nothing&lt;br /&gt;
          Defense: Banks/Mints&lt;br /&gt;
            • Vulnerability: Currency minting plates get stolen&lt;br /&gt;
  • General Comment: FOLLOW THE FULL INSTRUCTIONS, BE SPECIFIC.&lt;br /&gt;
  • Concerns of time pressure leading to Anil thinking of 4 questions for the final&lt;br /&gt;
&lt;br /&gt;
===Paper: Boxify===&lt;br /&gt;
  - Placeholder&lt;br /&gt;
&lt;br /&gt;
===Paper: Android Permissions Remystified===&lt;br /&gt;
  - Placeholder&lt;br /&gt;
&lt;br /&gt;
===Anil: &amp;quot;Where the research is&amp;quot;===&lt;br /&gt;
  - Placeholder&lt;/div&gt;</summary>
		<author><name>Nicholas Laws</name></author>
	</entry>
	<entry>
		<id>https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20917</id>
		<title>SystemsSec 2016W Lecture 23</title>
		<link rel="alternate" type="text/html" href="https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20917"/>
		<updated>2016-04-05T22:20:25Z</updated>

		<summary type="html">&lt;p&gt;Nicholas Laws: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Topics and Readings==&lt;br /&gt;
*Boxify&lt;br /&gt;
**Michael Backes et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/backes Boxify: Full-fledged App Sandboxing for Stock Android]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
*Android Permissions&lt;br /&gt;
**Primal Wijesekera et al., &#039;&#039;[https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/wijesekera Android Permissions Remystified: A Field Study on Contextual Integrity]&#039;&#039; (USENIX Security 2015)&lt;br /&gt;
&lt;br /&gt;
==Notes==&lt;br /&gt;
&lt;br /&gt;
===Midterm Discussion===&lt;br /&gt;
  - Placeholder&lt;br /&gt;
&lt;br /&gt;
===Paper: Boxify===&lt;br /&gt;
  - Placeholder&lt;br /&gt;
&lt;br /&gt;
===Paper: Android Permissions Remystified===&lt;br /&gt;
  - Placeholder&lt;br /&gt;
&lt;br /&gt;
===Anil: &amp;quot;Where the research is&amp;quot;===&lt;br /&gt;
  - Placeholder&lt;/div&gt;</summary>
		<author><name>Nicholas Laws</name></author>
	</entry>
	<entry>
		<id>https://homeostasis.scs.carleton.ca/wiki/index.php?title=Computer_Systems_Security_(Winter_2016)&amp;diff=20916</id>
		<title>Computer Systems Security (Winter 2016)</title>
		<link rel="alternate" type="text/html" href="https://homeostasis.scs.carleton.ca/wiki/index.php?title=Computer_Systems_Security_(Winter_2016)&amp;diff=20916"/>
		<updated>2016-04-05T22:09:30Z</updated>

		<summary type="html">&lt;p&gt;Nicholas Laws: /* Lectures and Exams */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Course Outline==&lt;br /&gt;
&lt;br /&gt;
[[Computer Systems Security: Winter 2016 Course Outline|Here]] is the course outline.&lt;br /&gt;
&lt;br /&gt;
==Hacking Opportunities==&lt;br /&gt;
&lt;br /&gt;
The [[SystemsSec 2016W Hacking Opportunities|Hacking Opportunities]] page lists potential hacking opportunities that you can attempt for your hacking journal.  If you attempt but do not successfully accomplish one of them, be sure to document what you tried.  As you learn more, you may come back to them and try again.&lt;br /&gt;
&lt;br /&gt;
==Resources==&lt;br /&gt;
&lt;br /&gt;
===Readings===&lt;br /&gt;
&lt;br /&gt;
* For the first part of the course we will be reading selections from Trent Jaeger&#039;s [http://www.morganclaypool.com/doi/abs/10.2200/S00126ED1V01Y200808SPT001 Operating Systems Security] textbook.  You can download the PDF [http://www.morganclaypool.com.proxy.library.carleton.ca/doi/abs/10.2200/S00126ED1V01Y200808SPT001 through Carleton&#039;s library].  In the reading assignments this text will be referred to as &amp;quot;Jaeger&amp;quot;.&lt;br /&gt;
* An excellent but dated text on browser security is Michal Zalewski&#039;s [https://code.google.com/p/browsersec/wiki/Main Browser Security Handbook].&lt;br /&gt;
&lt;br /&gt;
===Other Courses===&lt;br /&gt;
&lt;br /&gt;
* Dan Boneh ran an excellent course at Stanford in Spring 2015 on [https://crypto.stanford.edu/cs155/ Computer and Network Security].  This course has many interesting readings that we will not be covering.  Also, the assignments are very good sources for hacking opportunities.&lt;br /&gt;
* The assignments from the Winter 2015 run of COMP 4108 [https://ccsl.carleton.ca/~dmccarney/COMP4108/ are available].  They are a reasonable start for several hacking opportunities.&lt;br /&gt;
&lt;br /&gt;
==Lectures and Exams==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;table style=&amp;quot;width: 100%;&amp;quot; border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;4&amp;quot; cellspacing=&amp;quot;0&amp;quot;&amp;gt;&lt;br /&gt;
  &amp;lt;tr valign=&amp;quot;top&amp;quot;&amp;gt;&lt;br /&gt;
    &amp;lt;th&amp;gt;&lt;br /&gt;
    &amp;lt;p align=&amp;quot;left&amp;quot;&amp;gt;Date&amp;lt;/p&amp;gt;&lt;br /&gt;
    &amp;lt;/th&amp;gt;&lt;br /&gt;
    &amp;lt;th&amp;gt;&lt;br /&gt;
    &amp;lt;p align=&amp;quot;left&amp;quot;&amp;gt;Topic&amp;lt;/p&amp;gt;&lt;br /&gt;
    &amp;lt;/th&amp;gt;&lt;br /&gt;
    &amp;lt;th&amp;gt;&lt;br /&gt;
    &amp;lt;p align=&amp;quot;left&amp;quot;&amp;gt;Readings&amp;lt;/p&amp;gt;&lt;br /&gt;
    &amp;lt;/th&amp;gt;&lt;br /&gt;
  &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Jan. 7&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 1|Introduction]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Jaeger, Chapter 1 (Introduction)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Jan. 12&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 2|Access Control, Security Hacking 101]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Jaeger, Chapter 2 (Access Control Fundamentals)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Jan. 14&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 3|Multics, UNIX, and Windows]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Jaeger, Chapter 3 (Multics) and Chapter 4 (UNIX &amp;amp; Windows) &amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Jan. 19&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 4|Secure OSs, theory and practice]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Jaeger, Chapter 6 (Security Kernels) and Chapter 7 (Securing Commercial Operating Systems)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Jan. 21&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 5|LSM, SELinux, &amp;amp; Capabilities]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Jaeger, Chapter 9 (LSM &amp;amp; SELinux) and Chapter 10 (Secure Capability Systems)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Jan. 26&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 6|Secure Virtual Machines, Systems Assurance]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Jaeger, Chapter 11 (Secure Virtual Machine Systems) and Chapter 12 (System Assurance)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Jan. 28&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 7|Lecture 7]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Feb. 2&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 8|Lecture 8]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Feb. 4&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 9|Defensive Security Technologies / Hacking Opportunities]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Feb. 9&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 10|Security Research, Hashes, and Secure Protocols]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Feb. 11&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 11|Modeling a potential attack/ Midterm FAQ]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Feb. 23&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 12|Midterm Review]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Feb. 25&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Midterm (in class)&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 1&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 13|Buffer Overflow/Memory Corruption Attacks]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Aleph One (aka Elias Levy), [http://www.phrack.com/issues/49/14.html#article Smashing The Stack For Fun And Profit] (Phrack 49, 1996)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 3&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 14|Buffer Overflow/Memory Corruption Defenses]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&lt;br /&gt;
    &amp;lt;p&amp;gt;Wikipedia, [https://en.wikipedia.org/wiki/Buffer_overflow_protection Buffer Overflow Protection]&amp;lt;br&amp;gt;&lt;br /&gt;
       Crispin Cowan et al., [https://www.usenix.org/legacy/publications/library/proceedings/sec98/cowan.html StackGuard: Automatic Adaptive Detection and Prevention of Buffer-Overflow Attacks] (USENIX Security, 1998)&amp;lt;/p&amp;gt;&lt;br /&gt;
    &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 8&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 15|Bypassing ASLR and Buffer Overflow Exploits using return-into-libc]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Hovav Shacham et al., [http://dx.doi.org/10.1145/1030083.1030124 On the effectiveness of address-space randomization] (ACM CCS, 2004) [http://dl.acm.org.proxy.library.carleton.ca/ft_gateway.cfm?id=1030124&amp;amp;ftid=285463&amp;amp;dwn=1&amp;amp;CFID=588127386&amp;amp;CFTOKEN=74533951 (proxy)]&amp;lt;br&amp;gt;&lt;br /&gt;
           Hovav Shacham, [http://dx.doi.org/10.1145/1315245.1315313 The geometry of innocent flesh on the bone: return-into-libc without function calls (on the x86)] (ACM CCS 2007) [http://dl.acm.org.proxy.library.carleton.ca/ft_gateway.cfm?id=1315313&amp;amp;ftid=476749&amp;amp;dwn=1&amp;amp;CFID=588127386&amp;amp;CFTOKEN=74533951 (proxy)]&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 10&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 16|Lecture 16]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Bellovin and Cheswick, [http://dx.doi.org/10.1109/35.312843 Network Firewalls] (IEEE Communications Magazine, 1994) [http://ieeexplore.ieee.org.proxy.library.carleton.ca/stamp/stamp.jsp?tp=&amp;amp;arnumber=312843 (proxy)]&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 15&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 17|Lecture 17]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Dingledine, Mathewson, and Syverson, [https://www.usenix.org/legacy/events/sec04/tech/dingledine.html Tor: The Second-Generation Onion Router] (USENIX Security 2004)&amp;lt;br&amp;gt;Albert Kwon et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/kwon Circuit Fingerprinting Attacks: Passive Deanonymization of Tor Hidden Services] (USENIX Security 2015)&amp;lt;br&amp;gt;(background) [https://www.torproject.org/about/overview.html.en Tor: Overview]&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 17&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 18|Lecture 18]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Blase Ur et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/ur Measuring Real-World Accuracies and Biases in Modeling Password Guessability] (USENIX Security 2015)&amp;lt;br&amp;gt;&lt;br /&gt;
Nikolaos Karapanos et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/karapanos Sound-Proof: Usable Two-Factor Authentication Based on Ambient Sound] (USENIX Security 2015)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 22&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 19|Lecture 19]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Giancarlo Pellegrino et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/pellegrino In the Compression Hornet’s Nest: A Security Study of Data Compression in Network Services] (USENIX Security 2015)&amp;lt;br&amp;gt;Ramya Jayaram Masti et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/masti Thermal Covert Channels on Multi-core Platforms] (USENIX Security 2015)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 24&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 20|DDoS and Pinning]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Seyed K. Fayaz et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/fayaz Bohatei: Flexible and Elastic DDoS Defense] (USENIX Security 2015)&amp;lt;br&amp;gt;Marten Oltrogge and Yasemin Acar, [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/oltrogge To Pin or Not to Pin—Helping App Developers Bullet Proof Their TLS Connections] (USENIX Security 2015)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 29&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 21|Lecture 21]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;David A. Ramos and Dawson Engler, [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/ramos Under-Constrained Symbolic Execution: Correctness Checking for Real Code] (USENIX Security 2015)&amp;lt;br&amp;gt;Nav Jagpal et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/jagpal Trends and Lessons from Three Years Fighting Malicious Extensions] (USENIX Security 2015)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Mar. 31&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 22|Cookie Integrity and XSSI]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Xiaofeng Zheng et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/zheng Cookies Lack Integrity: Real-World Implications] (USENIX Security 2015)&amp;lt;br&amp;gt;Sebastian Lekies et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/lekies The Unexpected Dangers of Dynamic JavaScript] (USENIX Security 2015)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Apr. 5&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 23|Boxify and Android Permissions]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;Michael Backes et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/backes Boxify: Full-fledged App Sandboxing for Stock Android] (USENIX Security 2015)&amp;lt;br&amp;gt;Primal Wijesekera et al., [https://www.usenix.org/conference/usenixsecurity15/technical-sessions/presentation/wijesekera Android Permissions Remystified: A Field Study on Contextual Integrity] (USENIX Security 2015)&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Apr. 7&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;[[SystemsSec 2016W Lecture 24|Lecture 24]]&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Apr. 18, 10 AM-12 PM&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Last-Minute Study Session&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
    &amp;lt;tr&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Apr. 19, 9 AM&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
      &amp;lt;td&amp;gt;&lt;br /&gt;
      &amp;lt;p&amp;gt;Final Exam&lt;br /&gt;
      &amp;lt;/p&amp;gt;&lt;br /&gt;
      &amp;lt;/td&amp;gt;&lt;br /&gt;
    &amp;lt;td&amp;gt;&amp;lt;p&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/td&amp;gt;&amp;lt;/tr&amp;gt;&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Lecture Notes Guidelines==&lt;br /&gt;
&lt;br /&gt;
Part of your participation mark is doing notes for at least one of the lectures.  Here are the guidelines for those notes.&lt;br /&gt;
&lt;br /&gt;
The class TA, Borke (BorkeObadaObieh at cmail.carleton.ca), will be handling course notes.  Please contact her to schedule your class to take notes.&lt;br /&gt;
&lt;br /&gt;
Borke or Anil will set you up with an account on this wiki.  You&#039;ll enter your initial draft notes here and then work with Borke to make sure they are of sufficient quality.  This may require a few rounds of revisions; however, if you follow the guidelines below it shouldn&#039;t be too bad.&lt;br /&gt;
&lt;br /&gt;
You should plan on organizing your notes as follows:&lt;br /&gt;
* Organize them in at least the following sections: Topics &amp;amp; Readings and Notes.&lt;br /&gt;
* The Topics &amp;amp; Readings section lists the main topics covered in the class, e.g. &amp;quot;buffer overflows&amp;quot;.  Please use an unordered bulleted list (using *&#039;s in wiki markup).  In this section also list readings relevant to the lecture that were mentioned in class.&lt;br /&gt;
* Put your notes in the Notes section.&lt;br /&gt;
&lt;br /&gt;
Use (nested) lists if appropriate for the notes; however, please have some text that isn&#039;t bulleted.  Please try to make the notes even if you did not attend lecture; however, you don&#039;t need to cover every small bit of information that was covered.  In particular the notes do not need to include digressions into topics only tangentially related to the course.  Complete sentences are welcome but not required.&lt;br /&gt;
&lt;br /&gt;
==Security Reading Analysis Guidelines==&lt;br /&gt;
&lt;br /&gt;
A security reading analysis is a detailed analysis of a security research paper.  In it you analyze the key arguments of the paper and give your informed opinion.&lt;br /&gt;
&lt;br /&gt;
Most security papers can be classified as attack or defense papers.  You should analyze them differently.&lt;br /&gt;
&lt;br /&gt;
For attack papers:&lt;br /&gt;
* What systems are vulnerable to the attack?&lt;br /&gt;
* What is the nature of the vulnerability?&lt;br /&gt;
* What is the exploit?  In particular, what is its technical core?&lt;br /&gt;
* How reproducible is the exploit?&lt;br /&gt;
* Are there likely to be many similar exploits, in the targeted system or other systems?&lt;br /&gt;
* How difficult will it be to mitigate/fix the vulnerability in targeted systems?&lt;br /&gt;
&lt;br /&gt;
For defense papers:&lt;br /&gt;
* What is the security problem the paper addresses?  In what kind of threat model(s) does the problem exist?&lt;br /&gt;
* How significant is the problem?  Specifically, to what degree do existing solutions not work sufficiently well?&lt;br /&gt;
* What is the defense?  How does it work?&lt;br /&gt;
* To what degree will the defense potentially solve the targeted security problem?  In particular, how difficult will it be for attackers to adapt to this defense?&lt;br /&gt;
* What are the challenges facing deployment of the defense?  Are they likely to be overcome?&lt;br /&gt;
&lt;br /&gt;
For both kinds of papers, you should give your reaction by addressing questions like the following:&lt;br /&gt;
* Did you like the paper?&lt;br /&gt;
* Was it easy to understand, or was it hard to read?&lt;br /&gt;
* Did you learn much from the paper?&lt;br /&gt;
* How surprised were you by the result?&lt;br /&gt;
&lt;br /&gt;
Your analysis should not cover the above questions separately (this would tend to make for a very wordy analysis); instead, use these questions as a guide in writing a short essay (1-2 pages) on the paper in question.&lt;br /&gt;
&lt;br /&gt;
Each analysis will be graded out of 10 as follows:&lt;br /&gt;
* U: 3 for demonstrating understanding of the content (preferably without summarizing)&lt;br /&gt;
* T: 3 for technical analysis (does it work)&lt;br /&gt;
* C: 3 for contextual analysis (does it matter)&lt;br /&gt;
* V: 1 for your viewpoint&lt;/div&gt;</summary>
		<author><name>Nicholas Laws</name></author>
	</entry>
	<entry>
		<id>https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20915</id>
		<title>SystemsSec 2016W Lecture 23</title>
		<link rel="alternate" type="text/html" href="https://homeostasis.scs.carleton.ca/wiki/index.php?title=SystemsSec_2016W_Lecture_23&amp;diff=20915"/>
		<updated>2016-04-05T22:08:52Z</updated>

		<summary type="html">&lt;p&gt;Nicholas Laws: Created page with &amp;quot;placeholder&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;placeholder&lt;/div&gt;</summary>
		<author><name>Nicholas Laws</name></author>
	</entry>
</feed>