EvoSec 2025W Lecture 21

From Soma-notes

Readings

Discussion Questions

  • Which predictions were the most plausible? Which were the most implausible?
  • What relevant advancements/developments (technical or otherwise) were missed in the timeline that could change the envisioned trajectory?
  • What can we do to avoid a blockcloud apocalypse?

Notes

Lecture 21
----------

G1
 - plausible: online misinformation tool, dead internet theory
 - implausible: cryptocurrency takeover, AI as the silver lining?
 - advancements missed: LLMs, breaking crypto
 - need to trust each other more, decentralize trust, IRL interactions

G2
 - implausible: separate IoT cloud services, online misinformation tools
    (stuff to manipulate algorithms, generate content)
 - plausible: everything on the cloud, shift from storage to computation, apps
 - not clear that large govts would use blockchain
 - missed LLMs
 - expertise gap between law/policy and AI/modern computer tech
 - avoid the blockcloud apocalypse through policy, but maybe it's not avoidable (go live in the woods)
 - plausible that AI will do more legal stuff

G3
 - plausible: cloud providers becoming govts, blockchain shift
 - implausible: AI worms attacking blockchains
 - real trust still relies on people, not just systems
 - missed advancements: improvements in security tech, improvements to
   privacy-preserving tech, federated learning
 - no easy technological solution to these problems; the solution will be
   on the policy side, it will be social


How plausible are widespread worms?
 - depends on social, not technical, factors

How to avoid it
 - take evolution & trust seriously
 - educate policy makers
   - tell better stories about security
   - we need to tell ourselves better stories

- computers allow us to process, communicate, and distribute information
  - this allows social connections to be scaled
    (and bureaucracy to be scaled)


There's a reason computers were first adopted by large businesses & govts
 - they had the need to process information to run their bureaucracies

what is a bureaucracy?
 - infrastructure for organizing the activities of people

the constraints of distributed computation also apply to groups of people

With the introduction of computers, we've made processes for organizing society
that are more computer-focused (and org-focused) than socially focused
 (satisfying the needs of people)

online dating
 - it is the way it is because that's what maximizes engagement
   - platforms need lots of users having lots of engagement
   - there's no incentive for short interactions that lead to satisfying outcomes

it isn't just dating
 - scalable technical systems are built to maximize economic opportunity for
   the companies that run them
    - not necessarily to the advantage of individuals or society

Consider search
 - over what data? the right data, or all data?
   - is it worth curating data?
   - what data is worth indexing?

Why does Google index AI-generated slop?
 - what would it take to avoid doing this?

crawling the web blindly made sense when the web was mostly legit pages
 - but if it is full of crap, that isn't a good strategy

But WHY is the web filled with crap?
 - because that content can be monetized
 - Google created this problem!

If there were more friction in having ads on pages or in getting pages indexed,
we wouldn't have the AI-generated web page problem
 - producing such pages at scale would be too difficult to be worth it

newspaper ads were never so easy to place
 - even classifieds

when a third party can disintermediate trust, and that third party has few incentives
to make sure the parties act in trustworthy ways... you get a lot of anti-social behavior