Don't get me wrong, computers can absolutely help us regain our environmental efficiency. They just *aren't*.

Not as long as we're:
* constantly syncing everything to the cloud,
* expecting same-hour delivery,
* funding our clickbait via surveillance advertising,
* buying a new phone every year,
* using AIs because they're cool rather than useful,
* running bloated software & webpages,
* buying into "big data"
* etc

Computing is environmentally cheap, but it rapidly adds up!

Maybe it's just because this is where my interest lies, but after reading a few takes yesterday on how to Fix The Web, I really think a major issue is the discovery heuristics we use. Their incomprehensibility and shallowness promote the bad and bury the good.

There's PLENTY of good links! Otherwise I'd be wanting to tear The Web down rather than just JS...

I created Odysseus to explore some partial solutions, but I'm keen to see others address the problem from a different angle! Links? Advice?

Kind reminder that the internet was made to be decentralized

This short film from 1965 makes me very nostalgic for home :( I've travelled that entire route by train, but in the opposite direction! πŸš‚

CW: racism against Indigenous and Chinese people. I skipped most of it at the beginning:

/ht @emma

I've decided to establish a Guppe group ( @fyre_exyt ) to start building a team around Fyre Exyt, a project I've been envisioning since I quit FB in 2006 (?) to create a detailed guide to escaping the datafarms. Ideally it would take the form of a website or app that would walk users through the process. Maybe also guides for installing apps, choosing hosts, and creating accounts on replacement services.


What if web pages had to pay me every time their JavaScript ran on my computer and put my CPU at 100%

I think that would be fair

@grainloom I'm not a huge Algorithms expert, but I basically acquired fluency with pointers over many years of working with data structures like "BSTs with parent pointers". So I mean "foundational" from a pedagogical perspective. How are we going to teach pointers after everybody in the world is using Rust?

Perhaps unrestricted pointers should go the way of slide rules πŸ™‚ After all, Mu disallows pointer arithmetic. Perhaps it's all subjective.

@awalvie @neauoire

Corporate and government surveillance are complementary, and ultimately both are manifestations of power and monopoly. One form has an economic locus, one a political locus. Either may be abused, and each feeds on and amplifies the other.

Government surveillance has vastly less capacity without commercial services, particularly those tracking:

Strong signalling via payments: commerce, credit and debit payments, ecommerce, Amazon, Visa and Mastercard
Location: mobile devices, tollpaying systems, facial recognition / biosignatures, license plate scanners
Interests: marketing databases, demographics, AI-based content analysis, magazine and newspaper subscriptions, television and video watch history, video rentals (among the very few privacy carve-outs, thanks to Judge Bork), and online reading / views / interactions data.
Social relationships: USPS postal covers, telecoms call records (dating to the 1980s), location data (correlated with other individuals), email, social media friends and interactions

Corporations rely on powers of governments including:

Law, including writing legislation
The buying, selling, renting, borrowing, installation, or removal of legislators, executives, judges, and regulators
Regulatory capture
Law enforcement

The line that a government can kill or jail you whilst a corporation cannot is a fable long belied by history. Corporations (or straight-up criminal gangs, where differentiable) can and do act directly, or bend the government where necessary. Adam Smith's notes on commercially operated garrisons, the history of the labour and environmental movements, oil and railroad companies, the chemical industry, and more, provide numerous examples.

Surveillance, censorship, propaganda, and manipulation are tools, features, and benefits accruing to the controlling actor of any information monopoly, regardless of how that actor is organised or aligned. Not all abuse the power, but as Lord Acton notes, the temptation is strong.

#surveillance #SurveillanceCapitalism #SurveillanceState #power #monopoly

Now I've published the last section to

After a break I think I'll discuss Haskell development tools, namely HSpec, file-embed, Cabal, Haddock, Hackage, & Hoogle.

And then GHC itself!

More writings about how software works at

I'd like to pretend that I boycotted Black Friday as some form of protest against the rampant capitalism and consumerism, but if I'm being honest... all the deals this year were shit.

Finally, tabs are becoming useful: it's now possible to open links in a new tab by right-clicking.

My new hobby: getting rid of as many #dotfiles from my home dir as possible, by either moving them to XDG base dirs (~/.config, etc.) or just removing them (e.g., conffiles for apps I haven't used in 10+ years). I'm doing 10 at a time. I should be done by 2030.

On top of the parser combinators built upon those Get/Look operators (where branches may be merged), it implements a rudimentary pull-based Haskell lexer.

The Prelude uses that parser-combinator framework and lexer to implement the Read typeclass, letting you parse its types with a call to `read(s)Prec`. Don't worry: GHC will offer to generate an implementation for you, and you can stick to the simpler `String -> [(a,String)]` API.

`read`/`readMaybe`/`readEither`/etc. all call `readPrec`.
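A small sketch of that pipeline in action, using a toy type of my own (Point isn't from this thread) with a derived Read instance:

```haskell
import Text.Read (readMaybe)

-- A toy type; GHC derives readsPrec/readPrec for us.
data Point = Point Int Int deriving (Show, Read, Eq)

main :: IO ()
main = do
  print (read "Point 1 2" :: Point)            -- Point 1 2
  print (readMaybe "Point 1 2" :: Maybe Point) -- Just (Point 1 2)
  print (readMaybe "nonsense" :: Maybe Point)  -- Nothing
  -- The classic String -> [(a, String)] API is still there:
  print (reads "5 rest" :: [(Int, String)])    -- [(5," rest")]
```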

6/6 Fin!

The Prelude provides a `read :: Read a => String -> a` function, making it trivial to parse any of its types. It's not trivial for the Prelude though: behind the scenes it defines its own parsing framework to implement this.

At its simplest it defines a synonym for `String -> [(a,String)]`, returning all possible parses. Or it may hide the character iterator from the parsing code by having it yield Get, Look, or possible Results before its ultimate Fail or Final.
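A minimal sketch of that simplest form, with a hand-rolled parser in the `String -> [(a,String)]` style (my own toy example, not the Prelude's actual code):

```haskell
import Data.Char (isDigit)

-- All possible parses, each paired with the remaining input.
-- The Prelude calls this synonym ReadS.
type Parses a = String -> [(a, String)]

-- Parse a run of leading digits into an Int.
digits :: Parses Int
digits s = case span isDigit s of
  ("", _)    -> []                  -- no parse
  (ds, rest) -> [(read ds, rest)]   -- one successful parse

main :: IO ()
main = do
  print (digits "42abc")  -- [(42,"abc")]
  print (digits "abc")    -- []
```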


While GHC is bundled with `containers`, which implements a size-balanced binary search tree, the Prelude just uses a linked list of key-value pairs (Eq k => [(k,v)]) by providing the function:

lookup _key []         = Nothing
lookup key ((x,y):xys)
  | key == x  = Just y
  | otherwise = lookup key xys

(P.S. "otherwise" is part of the Prelude, not the parser)
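A quick usage sketch of that association-list lookup:

```haskell
main :: IO ()
main = do
  let phoneBook = [("alice", 1234), ("bob", 5678)]
  -- Walks the list until the key matches, or hits [].
  print (lookup "bob" phoneBook)    -- Just 5678
  print (lookup "carol" phoneBook)  -- Nothing
```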

I've previously mentioned arbitrary precision Integers implemented using LibGMP or pure Haskell. These are imported into the Prelude.


@djsundog Sure there are limits to the parallelization due to the delays and meta-stability: one still has to make pipelines, etc. But CPUs are simply inefficient: they're made to perform a function only once in a while, not continuously in a loop.

The Prelude defines linked lists, which it uses for everything, via the code: data [] a = [] | a : [a]
GHC is *really good* at optimizing the processing of these, but not always the storage. That's what Text & ByteString are for!

A Haskell String is a linked list of Unicode codepoints (`[Char]`) with hardly any APIs of its own. Char is defined amongst the other numeric types.

Text encodings are stripped away in the I/O APIs, providing buffering, typeclasses, etc. around C's <stdio.h>.
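Since a String really is that linked list of Chars, it can be taken apart with the same `:` and `[]` constructors as any other list; a small sketch of my own:

```haskell
import Data.Char (toUpper)

-- Recurse over the (:) cells of a String exactly as over any list.
shout :: String -> String
shout []       = "!"
shout (c : cs) = toUpper c : shout cs

main :: IO ()
main = putStrLn (shout "hello")  -- HELLO!
```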


how expensive would it be to get an FPGA dev board with enough resources to emulate any CPU released to date... :blobthonkang:​

I'll start with numbers, because they're the most separate!

They could *in theory* be implemented in pure Haskell using "Church Numerals", but that would be incredibly wasteful! So instead it uses "primitive" functions & types GHC's backend will compile into specific opcodes.

Around these primitives, boxed types are defined to allow for laziness. The Prelude declares the math operators, & typeclasses for operator overloading. GHC's frontend has hardly any concept of numbers; it's all in the Prelude!
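For illustration, here's what the wastefully pure Church-numeral encoding looks like (my own sketch; GHC's real numerics are the primitive types described above):

```haskell
{-# LANGUAGE RankNTypes #-}

-- A Church numeral applies a function n times:
-- no machine integers anywhere, just closures.
type Church = forall a. (a -> a) -> a -> a

zero :: Church
zero _ x = x

suc :: Church -> Church
suc n f x = f (n f x)

-- Count the applications to recover a machine Int.
toInt :: Church -> Int
toInt n = n (+ 1) 0

main :: IO ()
main = print (toInt (suc (suc (suc zero))))  -- 3
```

Every arithmetic step is a chain of closure calls, which is exactly why the Prelude leans on primitive opcodes instead.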

Over the past week(s?) I've been describing the Haskell dependencies I pulled in to write Rhapsode. (Though my C dependencies are the more distinctive aspect, see )

So now let me ask: How does this compare to the standard Prelude? A.k.a. the code I don't need to import? A.k.a. the naive (mostly) pure-Haskell approach?

Specifically I've discussed:
* Mappings
* Networking
* Numeric lists
* Text
* Parsing
* & parsing numbers


What's in a Parser Combinator?

A topic I've been describing a lot recently... And I might do so once more today!


For people who care about, support, or build Free, Libre, and Open Source Software (FLOSS).