Reflecting on my reading today which I summarized (probably poorly) to you, something really strikes me.

Replacing JavaScript in browsers with John Ankarström's proposals would probably lead to huge performance benefits.

Not only would it allow us to use a more efficient AST for HTML, but the dirtying passes WebKit does would become more like invariants!

I just had to write a braindump on this topic. Much of it's probably getting ahead of myself, but wow! I'm now wanting a browser implemented as I described it there!

Also had to suggest that a post-JS web doesn't necessarily mean we'd need to give up collaborative editing.

pad.tildeverse.org/code/#/2/co

Should I try summarizing thoughts here?

Wow, john.ankarstrom.se/english/tex… was an inspiring read, and intercoolerjs.org/2016/01/18/r… as well.

Not exactly sure what it means in concrete terms, but making dynamic HTML more declarative is definitely something I can get behind.

@alcinnz

Not sure about the auto-submitted form.

Given the principle that all network connections are initiated after a conscious human interaction (e.g. a click), you need a totally different UI for controls like, say, a normal textarea versus a textarea that interacts with the server on each keypress.

@Shamar On the contrary I'd have to argue that interaction with a normal textarea can be *more* informed than with a button or link.

Because with those I can only communicate security information on hover before inconveniencing surfers and leading to permission blindness. But with textareas I can communicate that information both on hover and on focus.

@Shamar So I'm more worried about autodownloading updates to keep the page live. That more directly goes against the principles I laid out, and there's a lot of good and bad that comes from today's equivalent.

The best way I know of to separate the good from the bad there is to have a permission that I go out of my way to encourage surfers not to bother granting.

@alcinnz

I'm not sure I understand what you mean.

I was trying to point out that anything that happens automatically, like auto upload or autodownload of data, is a security issue.

Now, while asking permissions is fine, we have to avoid the risk that people develop a reflex to blindly grant them. And when they grant them, we should add and balance additional measures to:
1. keep the user aware of the risks
2. reduce potential exploits to the bare minimum
I'd argue 1 is more important than 2.

@alcinnz

Ultimately I think that support for "#Web Applications" should be left to dedicated software specifically designed for them.

The attempt to mix a #HyperText browser with a runtime for distributed application UIs is what led us here.

We shouldn't try to do the same but better.
We should give people a better tool to navigate HyperText securely.

An application like CryptPad (that I like) is out of scope. We could have locally bound elements like a markdown text area 1/

@alcinnz

.. and a preview.

We can also have nice fancy native widgets like a WYSIWYG editor or a sound mixer.

We can also establish clear communication protocols with external applications, such as a file-upload form input that sends a certain URL to an external editor and collects the modified version when it closes, like Git does with the configured editor on commit.
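
To make that concrete, here's a rough sketch (nothing specified anywhere yet) of what a browser might do internally for such a form input, mirroring how Git hands a temp file to $EDITOR on commit. The function name, the temp-file suffix, and the use of the EDITOR environment variable are all assumptions for illustration.

```typescript
import { spawnSync } from "child_process";
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Hypothetical browser-internal routine, not page script.
async function editUrlWithExternalEditor(url: string): Promise<Buffer> {
  // Fetch the current version of the resource the form points at.
  const response = await fetch(url);
  const original = Buffer.from(await response.arrayBuffer());

  // Hand it to the user's configured editor as a temporary file,
  // just as Git does with $EDITOR on commit.
  const tmpFile = path.join(os.tmpdir(), `form-edit-${Date.now()}.txt`);
  fs.writeFileSync(tmpFile, original);
  const editor = process.env.EDITOR ?? "vi";
  spawnSync(editor, [tmpFile], { stdio: "inherit" }); // blocks until the editor closes

  // Collect the modified version on closure; the browser would then attach
  // this as the form's upload payload, sent only after a conscious user action.
  const edited = fs.readFileSync(tmpFile);
  fs.unlinkSync(tmpFile);
  return edited;
}
```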

But the need/opportunity to connect to an external service should be minimized and ALWAYS a conscious decision.

@Shamar True. I think what I learnt from writing that is that while we may end up standardizing the various components of something like CryptPad (as they'd be useful elsewhere), they'd probably not hook together correctly for that application. And that's fine; as you said, it shouldn't be our focus.

And I assure you that figuring out how to keep surfers in conscious control of any networking without leading to permission blindness is always at the forefront of my mind in writing these.

@Shamar It'd probably be useful to document in more detail what the scope of these specs should be. And it'd be interesting to define a standardization process out of this principles document.

@alcinnz

I think we should first set up a document with a few minimal, simple principles.

We could call it "Principles for a Better Web" or something like that, to express that we are looking for a synthesis between technical issues and ethical ones.

How to write them down?
I don't know... but it should be very few dense sentences carefully crafted to tackle the orthogonal aspects of the vision we have.

Then we could expand such a document with wireframe examples.

@Shamar On the matter of permissions, I'm tempted to describe my techniques.

Ankarström suggests that when a web surfer indicates they might want to perform such an action (e.g. on hover and/or focus), they are told of anything concerning it might do, like networking. Then when they tell their computer to do the action (e.g. on click), it goes ahead. This is my preference.
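
A minimal sketch of that interaction pattern, written as if it were the browser engine's own wiring rather than page JavaScript; the function name and the tooltip wording are invented for illustration.

```typescript
// Disclose on intent signals (hover/focus), act only on explicit activation.
function wireConsciousControl(control: HTMLElement, targetUrl: string): void {
  const disclose = () => {
    // Surface what the action *would* do before it is taken,
    // e.g. as a tooltip or in the status bar.
    control.title = `Activating this will connect to ${new URL(targetUrl).host}`;
  };
  control.addEventListener("mouseenter", disclose);
  control.addEventListener("focus", disclose);

  control.addEventListener("click", () => {
    // Only now, after the conscious interaction, does any networking happen.
    void fetch(targetUrl);
  });
}
```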

Though I am considering a second technique I think @Wolf480pl caught onto quickly...

@Shamar @Wolf480pl I like to think of it as adding a button to the browser chrome upon viewing certain pages for web surfers to seek out if they need it, but otherwise pages remain perfectly functional.

For instance, if someone does manage to recreate CryptPad on our specs, it'd have to work perfectly fine without permissions. You'd just have to manually pull and push changes by clicking certain buttons on the page. And if that's inconvenient, then you'd look to see if you can grant a permission.

@Wolf480pl @alcinnz

The link-hover tooltip is a nice idea, but it should only trigger if the link would really connect to the network: if the link is local to the page, or just flips the visibility of a div, or goes to an already-cached page that hasn't expired, no tooltip should annoy the user.

As for CryptPad: really not the goal, but we could have an AES form target that encrypts the content of the form locally and lets the user download the result locally. But a native widget
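
For what it's worth, a sketch of what such an AES form target could do under the hood, assuming the browser implements it natively via Web Crypto and never touches the network. The function name, the output file name, and the PBKDF2/AES-GCM parameter choices are all assumptions for illustration.

```typescript
// Hypothetical browser-native behaviour for an "AES form target":
// encrypt locally, then offer the result as a download.
async function encryptAndDownload(plaintext: string, passphrase: string): Promise<void> {
  const enc = new TextEncoder();

  // Derive an AES-GCM key from the user's passphrase with PBKDF2.
  const salt = crypto.getRandomValues(new Uint8Array(16));
  const baseKey = await crypto.subtle.importKey(
    "raw", enc.encode(passphrase), "PBKDF2", false, ["deriveKey"]);
  const key = await crypto.subtle.deriveKey(
    { name: "PBKDF2", salt, iterations: 100_000, hash: "SHA-256" },
    baseKey, { name: "AES-GCM", length: 256 }, false, ["encrypt"]);

  // Encrypt the form's content entirely on the local machine.
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv }, key, enc.encode(plaintext));

  // Let the user download salt + IV + ciphertext as one file.
  const blob = new Blob([salt, iv, new Uint8Array(ciphertext)]);
  const link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = "encrypted.bin";
  link.click();
}
```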

@Shamar @alcinnz yeah but you need to poll the server for updates, and that requires a permission

@Wolf480pl @alcinnz

Shit, how did I miss that? :-D

In which case this shouldn't be done in the same application that lets you read a blog.

This kind of application shouldn't be runnable in a #Web #browser. OR it should have an Update button to explicitly update the contents.

@Shamar what @alcinnz is proposing is that you have a "fetch updates" button as part of the website that fetches the latest content (this is consistent with what we've defined so far) AND some way for the website to say "could you please auto-click this button every 5 seconds, if the user agrees?"

But the browser doesn't immediately ask the user for permission. Instead, it denies it by default, and has some list of "permissions you could grant to this website if you wanted".

@Shamar @alcinnz
Now, the user hits the update button 10 times, gets tired of it, looks at the permission list, and notices that the website offers auto-clicking of that button every 5 seconds. If the user wants auto-refreshes, the user grants that permission.
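
A sketch of that default-deny model as browser-side bookkeeping; the class and method names here are invented for illustration, not any existing API. The key design point is that declaring the request does nothing until the user opts in from the browser chrome.

```typescript
interface AutoClickRequest {
  buttonLabel: string;      // e.g. "Fetch updates"
  intervalSeconds: number;  // e.g. 5
  granted: boolean;         // denied by default
}

class PagePermissions {
  private requests: AutoClickRequest[] = [];
  private timers: ReturnType<typeof setInterval>[] = [];

  // The page declares its wish; the browser records it and does nothing else.
  register(buttonLabel: string, intervalSeconds: number): void {
    this.requests.push({ buttonLabel, intervalSeconds, granted: false });
  }

  // What the browser chrome shows as "permissions you could grant".
  listGrantable(): string[] {
    return this.requests
      .filter(r => !r.granted)
      .map(r => `Auto-click "${r.buttonLabel}" every ${r.intervalSeconds}s`);
  }

  // Called only after the user explicitly opts in from the browser chrome.
  grant(buttonLabel: string, clickButton: (label: string) => void): void {
    const request = this.requests.find(r => r.buttonLabel === buttonLabel);
    if (!request || request.granted) return;
    request.granted = true;
    this.timers.push(
      setInterval(() => clickButton(buttonLabel), request.intervalSeconds * 1000));
  }
}
```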

Thanks @Wolf480pl I didn't understand his proposal.

Seems a good idea, @alcinnz

Are permissions preserved between sessions?

@Shamar
>I didn't understand his proposal.

Of course you didn't.
Because you didn't read carefully.

@Shamar @Wolf480pl Certainly if I were implementing the browser it'd persist the permissions. That'd probably lead to more care when granting them.

As for a perhaps better example where this standard could be useful, think of an online election map. That'd be valuable information to have online, and the people reading it would want it to constantly show the latest tallies.

@alcinnz @Shamar
btw. shouldn't we support something better than polling?
Some way for the server to actively notify the browser of some event?
One-way websockets, maybe?
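
Something like a receive-only WebSocket could fit, assuming the user has already granted the live-updates permission; a sketch, with the endpoint URL made up for illustration.

```typescript
// The browser only listens on the socket and never sends application data,
// so the server can push events but gets nothing back: one-way by convention.
function subscribeToUpdates(onEvent: (data: string) => void): WebSocket {
  const socket = new WebSocket("wss://example.org/live-results");
  socket.addEventListener("message", event => onEvent(String(event.data)));
  // Deliberately never calling socket.send() keeps the channel one-way.
  return socket;
}
```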

@alcinnz Do you have any link to John Ankarström's proposals? I'm REALLY interested.

@alcinnz Thanks!
I'll take a really deep look at your notes and what he said. I'm really interested in this.

@alcinnz great!
I can't really focus now but during these days I'll investigate a little on this.

I'm also preparing a talk about the web and how broken it is, also using @Shamar's bug reports and Brannon Dorsey's studies on browser-as-a-botnet (link below).

I think all this is really important and interesting. We need to keep a critical eye at a time when devs seem to be JS-only brogrammers with coffee addiction, glasses, and beards.

medium.com/@brannondorsey/brow

@ekaitz_zarraga @alcinnz

Please let me know when and if there's a streaming or recording!

I'd really like to see and share it.
