WebFund 2016W Lecture 24


Video

The video for the lecture given on April 7, 2016 is now available.

Notes

In Class

Lecture 24
----------

Modern web pages are a mess because they draw content
(and code) from many sources
* Including that content is easy: you just add a few lines,
  typically a script tag that loads some JavaScript

Is there any barrier between that external code
and your client-side code?
* not really
* there are technologies for separating content
* JavaScript provides some separation (e.g. lexical scoping)
* But everyone shares the DOM

So if you control any of the content in a page,
you can potentially change any of it
* Imagine censoring comments


If you want others to incorporate your stuff into their
pages, what do you offer?

Maybe offer some sort of API...but how?

Maybe just give a link to my animated GIFs?
 - but a link to which one?
 - random every time?

Why not just give them JSON + JavaScript?
* Just give JavaScript, it will load the JSON on its own
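
A minimal sketch of what that could look like, assuming a made-up widget.js
and a made-up /gifs.json endpoint that returns a JSON array of image URLs:

  // widget.js -- the page that wants your stuff adds only one line:
  //   <script src="https://example.com/widget.js"></script>
  // The script then loads the JSON on its own and injects content into the host page.
  (function () {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'https://example.com/gifs.json');   // hypothetical endpoint
    xhr.onload = function () {
      var gifs = JSON.parse(xhr.responseText);          // e.g. ["a.gif", "b.gif", ...]
      var img = document.createElement('img');
      img.src = gifs[Math.floor(Math.random() * gifs.length)];  // random every time
      document.body.appendChild(img);                   // it shares the host page's DOM
    };
    xhr.send();
  })();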

So when anyone talks about "Web Services"
* Make an API out of HTTP requests (GET, POST, ...)

Might be accessed by
* a web page from the developer
* 3rd party web page
* mobile apps
* any other program connected to the Internet

It doesn't matter to the server.
THIS is the key benefit of a "RESTful" API
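
A rough sketch of such an API (assuming Node with Express; the /gifs route
and its data are made up):

  // server.js -- a tiny "web service": the API is nothing but HTTP requests
  var express = require('express');
  var app = express();

  var gifs = ['a.gif', 'b.gif', 'c.gif'];   // stand-in data

  // GET /gifs returns JSON; the server neither knows nor cares whether the
  // caller is our own page, a 3rd-party page, a mobile app, or curl
  app.get('/gifs', function (req, res) {
    res.json(gifs);
  });

  app.listen(3000);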

Used to be 2 ways to do a "Web Service":
* REST (e.g., use regular HTTP GET and POST)
* SOAP


(at least) Two schools of thought for code talking to
code:
* make everything look like a function call
* treat network communication as its own, different thing

SOAP is just a way to do function calls over HTTP

How do you do a function call over HTTP?
 - prepare arguments
 - send arguments
 - invoke remote function
 - retrieve result
 - store result
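
A hand-rolled sketch of those steps from Node, using plain JSON over HTTP
(not SOAP itself; the /rpc endpoint and the remote add function are hypothetical):

  // Calling a remote function over HTTP, one step at a time
  var http = require('http');

  var args = JSON.stringify({ fn: 'add', a: 2, b: 3 });      // prepare arguments
  var req = http.request({
    host: 'api.example.com', path: '/rpc', method: 'POST',   // send arguments and
    headers: { 'Content-Type': 'application/json',           //   invoke the remote function
               'Content-Length': Buffer.byteLength(args) }
  }, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });     // retrieve the result
    res.on('end', function () {
      var result = JSON.parse(body).result;                  // store the result
      console.log('2 + 3 =', result);
    });
  });
  req.write(args);
  req.end();

  // JSON covers the simple types here; richer, nested data structures
  // are where SOAP reached for XML Schemas instead.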

What about
* types?
  - have to communicate that
* but think how complicated data structures can get
* Solution: XML Schemas for data

SOAP is an example of RPC: remote procedure calls
but over HTTP
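
Roughly what a single SOAP call looks like on the wire (a sketch; the GetQuote
operation and its namespace are invented for illustration):

  // A SOAP call is an HTTP POST whose body is an XML "envelope"
  // describing the function call and its schema-typed arguments.
  var envelope =
    '<?xml version="1.0"?>' +
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">' +
    '  <soap:Body>' +
    '    <GetQuote xmlns="http://example.com/stocks">' +   // made-up operation
    '      <symbol>ABC</symbol>' +
    '    </GetQuote>' +
    '  </soap:Body>' +
    '</soap:Envelope>';
  // POSTed with Content-Type: text/xml and a SOAPAction header;
  // a REST-style version of the same idea is just:  GET /stocks/ABC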

RPC turns local code into network code, but
local code isn't ready to face the network
 * security
 * efficiency

So why SOAP?
 * the old ways of doing RPC were blocked by firewalls

There's a "winpopup" service in Windows
 * pops up a window
 * as requested by a remote host
 * built for a friendly local network, not the open Internet;
   exactly the sort of service firewalls block

REST came along when "web natives" (programmers)
realized they didn't need all that infrastructure

Easy to see new things as "mostly old things"
But really you need to see it as its own thing
to really master it

What is the web?

* Publishing
* e-commerce (mail order catalogs)
* Information access (fine-grained)
* Multimedia
  - realplayer => youtube
  - CDs and videos
* VoIP (telephony), telegrams


* Social media
  - gossip

* "Uberization"
  - use a mobile app to connect people for business
 
Some of the progress in web technologies today is to
make "platforms"

Platforms are really technology looking for a purpose

Weird thing about the web is it is a "platform" that
nobody controls
 * Good: allows innovation
 * Bad: evolutionary cruft

Programmers HATE evolutionary cruft
 - it doesn't make sense!
 - it's a mess!


Many, many investors don't like the web as a platform
 - they want to control the platform

iOS as a platform
 * apple gets 30%

Google takes a cut from the Play store as well

Platforms are a play at a monopoly in a space

Two directions the web could go: it gets captured and fades into
a legacy platform, or it stays the open platform that supersedes
the rest

Student Notes

Logistics

  • Final exam on the 12th, 2 hours long, with similar format to the midterm
  • The solution for Assignment #6 will be posted on Sunday; the assignment can still be submitted until the end of Saturday
  • A study session will be held on the coming Monday, the 11th, in Steacie Building 103
    • No set agenda
    • The focus is on exam preparation
    • Attend, but study beforehand and come with questions
    • If you just sit and listen, you would be better off studying on your own for 2 hours
    • If good questions are asked, the session will run from around 10 to 12

End of the Line

  • What did I leave out, what's up ahead?
  • Tuesday we talked about scalability, but today we'll talk about what you're likely to encounter now and what to expect in the near future.
  • Let's pick a modern website, and see how it compares to websites we've built.
    • There's a reader comment section. What's happening?
      • The comments are loaded in the background, so the page doesn't have to load them all at the start.
      • Looking at the page source, it seems to have comments in it; no, actually it's a list of stories, which I didn't expect to already be in the DOM.
      • Do we load pages all at once or in pieces? There are a lot of pieces and parts on the page, some images, JavaScript, GIFs, JSON.
    • A lot of the pieces being requested have nothing to do with the main site. Will your webpage ever look like this? Does ArsTechnica even create and publish all this content?
      • Where does this complexity come from? There are icons, banner ads, and sponsored clickbait stories designed to gather click impressions.
      • It takes almost no work to include these: just a few JavaScript scripts to include and let run, grabbing someone else's content.
      • e.g. code for share buttons comes from Twitter and other social media sites.
      • You can combine content from other people & create a big mess, because it's easy.
      • The embedded code does whatever it wants, and it's inherently not separate from your own content as they share the DOM.
      • Compromised code originating from ad servers can do almost anything. Suppose people are revealing Google's secrets in the comments: instead of blocking connections, the ad code can simply check the DOM for "google" or "secrets" in the main content and erase those comments or cover them with ads (a sketch follows this list).
    • The Web is not intended to be this mishmash of content
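
A sketch of how little that kind of censorship would take (the .comment class and
the keywords are made up; the point is that any included script shares the DOM
with everything else on the page):

  // censor.js -- hypothetical script served from a compromised ad network
  // Because it runs in the same DOM as the page's own content,
  // it can rewrite comments written by anyone.
  var comments = document.querySelectorAll('.comment');
  Array.prototype.forEach.call(comments, function (comment) {
    var text = comment.textContent.toLowerCase();
    if (text.indexOf('google') !== -1 && text.indexOf('secret') !== -1) {
      comment.textContent = '[sponsored content]';   // quietly replace the comment
    }
  });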

Functionality

  • What if you want your stuff incorporated into other people's pages - how do they make use of your functionality?
    • APIs are interfaces to code; we haven't covered them in the library sense in this class, because on the web we aren't running code directly on one machine
  • You could provide just images and give them links, but if something dynamic is needed, with parameters and controls like a Twitter feed, then it needs to be data-driven
    • That means JSON + JavaScript loaded in the browser to do background GETs/POSTs, which we've already seen
  • "Web services" is a buzzword for an API made of HTTP requests: GETs and POSTs as appropriate, issued by browsers or any other client, with the response interpreted however the far end of the wire sees fit.
    • Facebook's Connect API is used by phone apps to authenticate people and by desktop browsers when people log in to Facebook. How do those two look different? They don't: it's the same API, a login specific to Facebook users. Delivering functionality over HTTP means the remote service's API can serve everyone on the web
  • REST (Representational State Transfer) came after SOAP (the Simple Object Access Protocol); both are ways to access functionality over HTTP.
  • Older inter-process communication was done through function calls over specific network protocols, which is an abomination: arguments need to be determined, formatted, sent, and retrieved; there is packaging and unpackaging involved, and data structures to parse.
    • SOAP's solution was to use XML, a successor to SGML and a language for describing data rather than documents (which is what HTML does). It's a complicated way to send a dictionary plus a grammar book with every message.
      • Remote procedure calls over the network have been around a long time, but they're a dumb idea given the lack of trust in unknown remote code; security and efficiency are both problems on the web.
      • SOAP's RPC-over-HTTP gets through the firewalls that blocked earlier RPC services, such as the Windows popup messaging functionality, which was designed for local networks with convenience rather than security in mind.
      • Programs wanting remote access used HTTP because it kept things simple. REST stopped trying to describe the whole universe inside schemas and just uses GETs and POSTs; we've already built websites on the same networked architecture.
  • Humans try to relate new concepts to the familiar, describing the first cars as horseless carriages.
    • The internet is due for change and we don't know who's going to start it, what direction it's going, and what form it's going to take on in the future.
      • Displaying documents on the web supplants published books, e-commerce replaces mail-order catalogues, multimedia consolidates cassettes and CDs, and telephony lets us communicate better than telegrams did.
      • However, social media has no pre-internet equivalent to imitate except gossip. It is a new format that evolved from webpages enabling comments, which resemble academic annotations.
      • What's next? Newer uses of the web include "Uberization": using a mobile app to connect people with businesses.
      • Uber and Facebook have a purpose. Is the "Internet of Things", with smart web-capable fridges and other appliances, useful? So far it's more of a buzzword that nobody has made serious use of, unlike driverless cars, which can be conceptualized and implemented with a clear purpose.
    • A lot of the effort in web technologies today goes into creating "platforms": things for other services to be built on
      • What really is a platform? Technology looking for a purpose and/or users
      • The web is becoming the universal platform that nobody can control.
    • The free nature of the web enables innovation and evolution beyond its designed standards; however, it is also rife with legacy technology. Some of it is senseless, vestigial cruft that is no longer maintained but is still used out of convenience. Programmers are tempted to ignore it or to scrap it and start over, but it is dangerous hubris to assume you know how other people are using the code.

Future Platforms

  • Investors in companies often dislike the web, since they have no control over it.
  • Apple takes 30% of everything sold through iOS and iTunes, and Google and Microsoft do the same with their respective app stores.
    • Microsoft is transitioning from an OS-oriented company to a service-oriented company (business + Azure cloud services)
  • The web is an uncontrolled environment among platforms that others have captured as near-monopolies.
    • It will either be captured and go stale as a legacy platform, or supersede all the others and become the platform.
    • Developers can and should choose their own technologies, becoming involved in deciding which direction they want the web to go.
    • Why become involved in the development of code? Open-source development has grown in prominence, with Linux-based OSes taking market share from Windows; Linux has co-evolved with the web, through the work of many programmers, into a better foundation for building new technologies.
    • We don't know what's coming next (augmented reality, perhaps), but choosing the right platforms is the only way to build infrastructure for the next generation. The web is going to stick around, so good luck studying.