Picking up the electronic music theme some more, Sean Booth from Autechre has done a couple of AMA sessions on Twitch recently which have covered some interesting stuff relating to our thing.
If you don’t know Autechre, they’ve been recording music and releasing albums since the early 90s. Their sound is pretty varied, sometimes you can dance to it, sometimes you can’t, sometimes it sounds harsh and difficult, other times lush and gorgeous. They’ve sold a surprising number of records, and they play to big crowds on their arena tours. They’re the sort of act that some of my friends discuss with reverence like classic jazz, whilst my other friends just take a load of acid and disappear into the experience.
This is relevant because since the late 90s most of their music has been composed as algorithms (frequently using software called MaxMSP). The notes and rhythms are set up as data sequences with different inputs and variables, that then play out and evolve in unexpected ways. Just like your Facebook feed, but if 10,000 people were dancing to it rather than getting angry that trans people exist.
For most of their existence Autechre have been presented as threatening cyber-nerd hackers, with photos of Rob and Sean lurking around various East London landmarks, being defensive and cagey about their lives and music-making processes. But now it's 2022 and all of a sudden we can spend some time with Sean wearing some photochromic dad glasses, sat outside his traditional Scandinavian home chatting happily about nerd shit.
I guess the section that really spoke to me was when he describes how he structures projects. Each song is effectively a self-contained algorithm (MaxMSP patch), with its own set of behaviours, inputs and data. Because they behave in unexpected ways every performance of the algorithm is unique - the same algorithm might create a totally different piece of music when recorded for an album compared to performing for thousands of sweaty ravers. With this in mind, they actually create different versions of each algorithm for the studio and for live performance. The studio versions have more recursive behaviours (the generated sounds, melodies and beats are then reused as data inputs to create new sounds, melodies and beats) whereas the live versions depend much more on human intervention from himself and Rob on stage.
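Not MaxMSP, obviously, but here's a crude Python sketch of that studio/live split, just to make the idea concrete. The "studio" version feeds its own output back in as data for the next generation, while the "live" version exposes a parameter for human intervention instead. Every function and number here is invented for illustration.

```python
import random

def studio_version(seed_pattern, steps=4):
    """Recursive behaviour: each generated pattern becomes
    the data input that produces the next one."""
    pattern = list(seed_pattern)
    history = [pattern]
    for _ in range(steps):
        # Derive the next pattern from the previous output:
        # rotate it, then mutate one step at random.
        pattern = pattern[1:] + pattern[:1]
        i = random.randrange(len(pattern))
        pattern[i] = (pattern[i] + random.choice([-1, 1])) % 8
        history.append(list(pattern))
    return history

def live_version(seed_pattern, intervention=0.0):
    """Live behaviour: the output depends on a human-controlled
    knob rather than on its own history."""
    return [(note + round(intervention * 4)) % 8 for note in seed_pattern]

runs = studio_version([0, 3, 5, 3])
print(len(runs))  # 5 patterns: the seed plus 4 recursive generations
```

Run the studio version unattended for hours and mine the history for good sections; run the live version with a hand on the `intervention` knob.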
The studio versions are great for research: you can record them playing all day, isolate the good sections, pull apart how they work and learn from the process. It doesn't really matter if it doesn't produce anything good for days on end as long as you learn from it. But as Sean explains, when performing live you've got a responsibility to your audience, and it's important to be able to read the mood of the crowd and nudge the algorithm into giving everyone a good time.
(I guess a super crude analogy would be comparing watching your dog enjoy itself at the beach to performing a routine with it at a dog show.)
It’s interesting to me because these are processes that have evolved from what works after decades of using this technology in high-pressure environments with immediate vocal feedback. Sure, most commercial software has a larger userbase than a festival audience, but the technologists and product managers on those commercial teams are also pretty insulated from the real-world effects of their software, whereas performers like Autechre really can’t ignore if their audience think their algorithms suck.
Even more interesting is the importance they place on having research versions of an algorithm for prototyping and experimentation in a controlled environment, in addition to more conservative, responsible versions for live use. As these technologies become a core part of all aspects of contemporary life it would be nice to think they were being created more responsibly than rave music is, but… somehow I doubt it.
Last summer FACT did an Against The Clock film with Rian Treanor and Mark Fell. But I’ve had a busy year which I guess is why I only found it recently.
If you haven't seen it before, Against The Clock is a YouTube series where FACT challenge a musician to improvise a piece of music in 10 minutes, on camera. Despite the videos predominantly consisting of a person glaring at a laptop screen, they're a fascinating body of work exploring how different artists use essentially the same set of tools in dramatically different ways to express themselves.
But the Rian Treanor and Mark Fell film differs from that format. With multiple groups of collaborators they use custom-built software to improvise a punishing rave track, remotely over the internet. The groups include a bunch of teenagers in a community art space, Fell sitting on a dale in the Peak District, a pair of young electronica musicians in London, and Treanor's grandmother performing from her own home. And as the different groups collaborate on the music, it's also being blasted out live in a Sheffield club with synchronised lighting and effects.
Honestly, it's fucking great and you should give it a watch when you have a spare 15 min.
It’s also a really inspiring view of what a good metaverse experience could be.
Like, I'm sure we all understand by this point that the metaverse is a marketing term to get investors excited about a bunch of immersive realtime web technologies. It's sort of bullshit, but we've also all been using this stuff over the pandemic, and presumably we can all see how it's going to be a bigger part of our lives in the future. We might as well talk about what we want from it and see if we can shape it.
And Treanor and Fell’s image of that future looks so much more fun than the vanilla corporate-core coming out of the big tech companies so far.
Back to Mark Fell, Urbanomic just published a book of his collected writing. It’s called Structure and Synthesis: The Anatomy of Practice, and although it’s about making experimental electronic music I found it resonated a lot with my experiences working in technology.
At the risk of oversimplifying, his main argument across the book is that innovative uses of technology only come from actually using the stuff and learning its constraints and quirks through hands-on experience.
Having worked in tech for a while now, it's just blatantly true. And Fell fleshes out the argument with a bunch of historical examples, philosophical context, discussion of the consequences this has for commissioning and funding new work, and the role of theory and strategy in post-rationalising and claiming credit for innovative new works rather than doing anything to create them.
All uses of a technology alter and define that technology. Any tool is subject to redefinition through its uses, and is dependent on its placement within wider social and cultural contexts; for example, my Dad’s use of a screwdriver to open a tin of paint, or my friend’s use of a shoelace to commit suicide.
He also goes on to explore the class, race and gender biases that lead to theoretical, strategic practice being more highly valued than hands-on technical experience, despite the hands-on work being the activity that pushes things forward.
And I guess that's what's missing from the big metaverse concepts we've seen so far; creative play.
It’s a great book, and very readable (ie, you can skim over the philosophy bits and it still makes sense, even while feeding a five month old baby).
You'll learn how to work together as a group to explore computer patterns, electronic rhythms and unusual digital sounds using accessible music devices. The session will offer a fun, creative and relaxed space to explore playing and experimenting with sound. It's open to families, and children must be accompanied by an adult for the full session (apparently it will be suitable for children aged 3 and over).
Sounds fucking brilliant, I wish I could make it. You should go.
Nike just announced some interesting new sneakers. Last year I wrote a bit about Adidas and Yeezy's efforts in circular production for sneakers, ie how we can reduce the environmental cost of consumable goods. Since then a few sections of my Instagram discovered a shoe Nike put out called the Considered BB back in 2005.
The Considered BB were an early attempt at a sustainable version of Nike's Dunk basketball shoe, made with an inside-out moccasin construction (no glue, which made them easy to repair and recycle) and all materials sourced within 200 miles of the factory they were made in.
I had a pair and they were great. Comfy and so hardwearing they lasted me for almost a decade.
As you can imagine there's been a fair amount of speculation and conspiracy theorising as to why the world's biggest footwear manufacturer abandoned a sustainable product line that was so ahead of its time. It all gets a bit Who Killed The Electric Car, but anyway now we have these new ones:
They’re called the ISPA Link Axis, which is an awful name but you’ll probably never have to say it out loud so who cares I guess. They’re made of a small number of single-material parts that don’t use any glue, which means they can be easily disassembled for recycling.
One of the interesting aspects of the design is the modularity. Obviously it’s intended to make recycling easier, but I assume it also means you could replace individual parts of the shoe without having to replace the whole thing. Like replacing parts when they wear out. But presumably you could also switch to different style uppers to suit different outfits, or switch out a soft everyday sole for a grippy hiking friendly one, or stuff like that.
As Nike point out, it only really means anything if these product techniques are scaled across the whole range, so it’ll be fun seeing if and how they manage to implement these principles across other types of footwear.
Crate engines are replacement car engines sold separately from the rest of the car, so this is exciting for lots of reasons! It suddenly opens up whole swathes of older vehicles to becoming electric cars. Plus the Ford e-crate engine is high-powered (it powers the new electric Mustang) and only costs $4000, which makes it both fun and affordable. And converting your existing car to electric is even better for the environment than replacing it with a new electric car.
I mean, I just love this stuff.
But beyond that, it's really interesting to see how the open-source mindset we're accustomed to in software is starting to change industrial product design. Big companies are starting to think about their products as objects with a lifespan that continues after they leave the store, with users whose needs might change over time.
These are modular, componentised objects designed to be iterated. Good for the user (who can repair and improve the things they own), good for designers and manufacturers (loads of new interesting spaces open up) and good for the planet.
Haven't been writing much recently. There's been a lot happening in the world, and not much of it to do with things like design and whatever.
Anyway, last week Yeezy announced their new red Foam Runner sneakers. As the blogs quickly pointed out, Kanye hasn’t released an all-red sneaker since his final (very rare, very expensive) product release with Nike - nicknamed the “Red Octobers”.
As I tweeted, a lot can happen in seven years.
And then, because I love a little plot twist, followed up with this comparison between the shoes:
Red Octobers (2014):
labor-intensive to make
complex international logistics required
Red Foam Runners (2021):
experimental futuristic aesthetic
single algae-eva blend material
manufactured in the USA
And yeah, despite all the other shit Kanye has done over the last few years, there are some really interesting and progressive design decisions that have gone into the latest shoe. The experimental material choices hugely reduce its environmental impact, manufacturing them in the USA means their sales actually contribute back into the economy of their primary market (rather than just piping money out of the region) and, again, doing so reduces the environmental impact of shipping both raw materials and finished product around the planet.
All of these strategies are enabled by the form of the shoe, allowing it to be moulded from a single material, and reducing the manufacturing complexity so that it can actually be produced by less… ‘advanced’ western manufacturers.
And then the way the shoes embrace that material boldness rather than trying to hide it beneath a retro aesthetic? Fuckin A, man.
Adidas (who manufacture Yeezy sneakers) have been doing other interesting work in this space, especially around the environmental consequences of sportswear consumption. One thing that makes a product hard to recycle is when it is constructed from lots of different constituent parts, made from different materials. Each part needs to be separated, before each material is recycled in a different way. Depending on how the product has been constructed, even just separating the parts from each other cleanly can be impossible.
As a direct response to this problem, a couple years ago Adidas started experimenting with running shoes where every constituent part of the shoe is constructed from the same raw plastic material. Some of the plastic is spun into yarns that can be woven as fabric, more of the plastic is moulded into foam rubber for the cushioning, and so on. Applying this material research to every part of the shoe, when it reaches the end of its lifespan it can be recycled without needing to separate it into parts.
Then, the recycled plastic from the shoe can simply be used to make more shoes.
The process is known as circular production. The shoes are still at the beta stage, but Adidas sold hundreds of them, collected every pair at the end of their lifespan and succeeded in recycling them, so it sounds promising. Here’s a Wallpaper article about them from back when they were announced a couple years ago.
In a similar way to good service design, it's all stuff that's almost too obvious to even bother including in a strategy deck, but way too complicated to be achievable once you get to product implementation and marketing. It's the result of a huge amount of technology research and iteration, tightly focused around a single, simple idea. They sell a pair of running shoes, and then when those shoes are worn out, they want to sell a replacement pair of running shoes. With the least possible environmental impact.
Over the years I’ve posted lots of stuff on here about how hard it is to make things simple (even making it one of the UK Government Design Principles) but outside of graphic and communication design there aren’t actually that many examples of teams really following through on it at scale. Some of the Ikea and Muji furniture ranges? Love them, but it’s all mature technologies and whatever - not exactly exciting. This shit? This is the future.
If you like to keep track of these things, it’s interesting that these Adidas/Yeezy products - despite being rigidly simple from a product perspective - don’t have particularly minimal aesthetics. If anything they fit into a brutalist canon, being transparent about their construction methods and the artificiality of their materials.
There must be something in the air. While I was writing this, both Ella and Jen sent me new articles from relatively mainstream UK publications about the benefits of simplifying, subtracting and refining as a creative process. Love to see it! But more than anything I’ve had this passage from A. G. Cook’s tribute to Sophie floating around in my head:
Part of my education consisted of just watching her build and re-build her music. I remember hearing a version of HARD that sounded great to me, but she felt it wasn’t gelling. Instead of tweaking or ‘fixing’ it in any way, she simply started again, remaking every sound, every drum, every synth part from scratch. I thought she had lost it at first, but I realised that she saw each component with such clarity that it was simply easier for her to remake everything than to force parts that didn’t truly fit together.
Since we published the Command Line Interface Guidelines we’ve had a few requests on Discord and Twitter from people asking us to open source the Hugo theme so they can use it for their own projects.
Unfortunately the font licensing made this tricky. Plus I’m always uneasy about taking designs intended for a specific audience and just applying them to something else without consideration. So instead I forked CLIG and created a generic unbranded version of the Hugo theme.
It's called Hugo Long Read, and as you'd expect it's a simple open-source Hugo theme for lengthy single-page websites. There is no support for multi-page websites, and the navigation you can see in the screenshots below is a table of contents generated from the page headers. So if you are looking for a more traditional website theme with support for navigation and index pages, this theme isn't for you!
But, like the Command Line Interface Guidelines it has a bunch of viewport specific CSS typography rules to optimise long-form text for different devices. The design uses three beautiful typefaces – IBM Plex Serif, IBM Plex Mono and TeX Gyre Heros – that are all provided under Open Font License by their creators specifically for open source projects like this.
And of course it still has dark mode, if you’re into that sort of thing.
A few weeks ago I posted on here about my work with Replicate. As they were building it, Ben and Andreas were looking around for advice on best practice for designing command-line interactions. In contrast to best practice advice for more visible interfaces – which have had extensive books, articles and principles since Apple’s Human Interface Guidelines in 1987 – they couldn’t find much about CLIs. So after finding a few other people who were thinking about this area, Ben, Aanand, Carl and Eva wrote a guide for designing command-line interfaces.
It evolved into a very long web page. On one of our Zoom calls about Replicate Ben mentioned it and asked if I wanted to get involved. Having done a little work with non-graphical interfaces at GDS, I think they're both really interesting and relatively undocumented, so obviously I said yes.
Due to the long, text-heavy nature of the page it's optimised for readability across different device sizes. Most of the design complexity comes from a stack of CSS font-size rules optimising the typography for differing screen dimensions. We took an aesthetic lead from traditional unix man pages, while in dark mode the page takes on more of a terminal flavour.
Despite the guidelines themselves being pretty monochrome and stark, I threw together a more colourful social media card to help the link stand out on the timeline.
We ended up shipping The Command Line Interface Guidelines last Friday after someone stumbled on the preview version out in the wild and tweeted about it lol. To everyone's surprise it got fifteen thousand views in the first few hours, and spent three days on the front page of Hacker News!
One of the many weird things about this year has been having extended breaks from spending time in the sort of businesses I would otherwise visit on a daily basis. I’m thinking mostly of hipster coffee shops and bars. Coming back to these spaces and seeing them with fresh eyes, it’s almost funny how similar and yet considered the decor choices are.
I know there’s been plenty written about the instagramification of interior design, but having lived in east London and Brighton for almost two decades I guess it didn’t really mean much to me, because these places are largely responsible for the style rather than copying it. The proliferation has mostly been intended to make people like me (and you) feel at home wherever in the world we might be. But spending time away from them has sort of broken the spell.
This year I've had a little photographic project where whenever one of these interior design features catches my eye, I find a commercial stock photograph of a similar business with the same interior design feature and use Photoshop's algorithmic fill to replace every unique feature of the photograph with generic ones.
I’ve made a little photo book of some of the images, it’s called Generic Generated Gentrifiers. It’s a simple 14 pages, 21x28cm, black and white giclee on heavy gloss paper. At some point I’ll set up a shopify for stuff like this, but send me a DM if you want one (I’m thinking like 7 pounds including UK postage).
Over the last couple of months I’ve been helping Andreas and Ben with their new project, Replicate.
If you weren’t working in software before widespread, free-to-access version control it’s hard to express how differently things worked then. So much of the extreme programming stuff we take for granted like remote collaboration, iterative design, test driven development, a/b testing, public contribution to open-source, were almost impossible due to how time-consuming and unwieldy it was to explore, analyse and experiment with code over time. Software was chaos, and having only worked in it for a few years before Git took off, I’ll never take version control for granted!
When Andreas and Ben chatted to me about their research, and explained that version control isn't freely available for machine learning I was surprised – my whole understanding of the industry needed recalibrating for a minute. You can sort of use Git, but making commits for every part of the decision tree is clunky. And there are some super-expensive proprietary enterprise packages available, but there isn't a standard like Git which everyone can access. And without everybody being able to access the tool, there's no way to encourage everyone to adopt the modern development practices the tool enables. Instead we end up with these AI black box systems, indistinguishable from magic.
As an aside – I'm not a tech ethics specialist but it's definitely been part of my responsibilities in previous roles to keep on top of the debates and controversies around machine learning and AI. I've read a lot of the books, blog posts and criticisms of these technologies, been to a few meetups and stuff. In all of that, I've never heard anyone mention that these technologies don't have the basic tooling we would expect in order for them to operate in the way we expect modern, transparent software products to behave. Instead it's all conspiracy theories or demands for speculative features that can't be built. IMO there's a broader lesson here about the basic technical literacy of the criticism and debates around our industry that I'm going to be unpicking for a while.
Back to the point, Replicate launched into public beta yesterday. It’s a simple, lightweight and open source command line tool that records your machine learning experiments. That makes it easier to poke around in your experiments in order to understand what they actually did. You can run it locally or on cloud services, it’s compatible with pretty much anything, and you can analyse the results in a notebook.
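I won't try to reproduce Replicate's actual API here, but the core idea of recording experiments is tiny, and a hypothetical sketch of it fits in a few lines: write the parameters and results of every run somewhere append-only, so you can poke around and compare them later. Every name in this snippet is made up for illustration.

```python
import json
import time
from pathlib import Path

LOG = Path("experiments.jsonl")  # hypothetical append-only experiment log

def record_experiment(params, metrics):
    """Append one experiment's inputs and outputs as a JSON line."""
    entry = {"time": time.time(), "params": params, "metrics": metrics}
    with LOG.open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

def best_run(metric):
    """Scan the log and return the run with the highest value of `metric`."""
    runs = [json.loads(line) for line in LOG.read_text().splitlines()]
    return max(runs, key=lambda r: r["metrics"][metric])

record_experiment({"lr": 0.01, "layers": 3}, {"accuracy": 0.91})
record_experiment({"lr": 0.001, "layers": 5}, {"accuracy": 0.94})
print(best_run("accuracy")["params"]["lr"])  # 0.001
```

That's the whole trick: once every run is recorded, "what did this model actually train on?" becomes a query rather than a mystery.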
H&M are trialling a new recycling system called Looop in one of its Drottninggatan stores in Stockholm. It cleans and shreds old clothing into its base fibers before spinning these into new yarn, then making new clothes from the yarn. Looks amazing.
Particularly love the potential for brand mischief. It’s only a matter of time before Demna starts rolling them out and charging his customers $250 to recycle any old tshirt into an authentic Balenciaga one.
One of the weirdest parts of learning CSS is wrapping your head around the cascade. We abstract and use frameworks and all sorts of stuff trying to control it, to scope our styles and make our webpages look exactly how we want them to. But what about doing the opposite? Instead of treating the cascade as a challenge that needs to be overcome, it could be fun to engage with it as a creative programming tool.
It's something I've been interested in for a while. The relative scaling typography system I created for the responsive BBC News website evolved from those ideas, using environment variables (the viewport dimensions) to cascade sizing variables through the interface. Pretty limited, but it was fun.
Didn't really think about it after 2012 when I started working in government. It's not remotely appropriate in a public information context, and tbh government has plenty of big problems to keep my brain occupied. Anyway, last month I picked up the Lars Muller reprint of Karl Gerstner's Designing Programmes and it started me thinking about some of this stuff again.
Since 2012 we've gained a lot of new CSS properties whose surface we're barely scratching. vh and vw units have codified relative-scaling font sizes as a browser default, while CSS grid, variables and widespread SVG support open up huge possibilities. So I spent an afternoon last week having a quick play with something. Exploring the grain, as Frank would call it.
OK, so what the fuck is this?
It's a grid of screenshots I took of a basic programmatic art direction exercise. The programme consists of two parts:
A complete system of modular layout and style rules
A process of deciding which rules to apply to your HTML elements
Because it’s just a quick proof of concept this uses random numbers generated on pageload as inputs. Again, fun, but probably not super useful beyond, idk, generating a couple thousand advert variants or social media images (or maybe creating a run of unique posters or tshirt graphics?). But I think it could be really cool to use environment variables (device info, time of day, weather) or content information (seriousness, availability of assets like photographs or video) as inputs instead.
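Stepping outside the browser for a second, here's a rough Python sketch of the same two-part programme: a fixed system of modular style rules, plus a process (here just a seeded random choice, standing in for a "pageload") that decides which rule applies and spits out a unique stylesheet each time. All the property names and values are invented for the example.

```python
import random

# Part 1: a complete system of modular layout and style rules.
RULES = {
    "--cols":       ["2", "3", "4", "6"],
    "--accent":     ["#ff2a2a", "#0044ff", "#ffd500"],
    "--type-scale": ["1.2", "1.333", "1.5"],
    "--rotate":     ["0deg", "-2deg", "3deg"],
}

def generate_stylesheet(seed):
    """Part 2: a process deciding which rules to apply, driven by an input.
    Swap the seed for device info, time of day or weather and every
    visitor gets their own art direction."""
    rng = random.Random(seed)
    chosen = {prop: rng.choice(values) for prop, values in RULES.items()}
    decls = "\n".join(f"  {prop}: {value};" for prop, value in chosen.items())
    return ":root {\n" + decls + "\n}"

print(generate_stylesheet(seed=42))
```

Because the output is deterministic per input, the same seed always produces the same design – which is exactly what you'd want if the inputs were content or environment data rather than random numbers.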
And also! I'm using a single color set and some simple transforms to manipulate the type and crudely bend it to the grid, but with time and expertise you could do some fucking wonderful things with generated color palettes and filters, and with expressive variable fonts.
Why would I ever want to do this?
My Facebook feed has totally different content to yours. Plenty of newspaper homepages adapt themselves to your reading habits too. The political adverts I see on my devices contain different promises to the ones you see on your device. And yet you wouldn’t know that to look at them — the content looks identical. It all looks very Dieter Rams, but at the same time obscuring the differences in content doesn’t feel very honest or understandable. Maybe there’s an opportunity to use programmatic editorial design to reflect the values of our programmatically-generated content.
Obviously this is not appropriate for UI design.
Obviously, you won’t try this on important, standardised interactions like logins and card payment processes.
We all know that would be bad.
And yet. So much of the story of CSS so far, from Zen Garden and the Holy Grail through to Bootstrap and React has focused on taming the browser to a few standardised interactions and layouts, to which you can apply or remove "branding" without much consequence. Great for standardised interactions, app interfaces and plenty of other things, but a lot of digital culture doesn't need (or benefit from) being optimised so much. Sometimes you just want to enjoy reading a thing, you know?
If my New York Times homepage is all lifestyle and celebrity news, but yours is completely finance and politics — should they really look the same? It's easier to make the website that way, and the brand should be consistent yes, but print processes have adapted over the years to allow art directors to make the reading experience in the printed newspaper reflect the content. Maybe it would be cool if my NYT homepage looked like the magazine section with big photos and expressive type, while yours was all densely-packed columns to reflect your heavy intellectual interests.
One program library with different inputs.
There's a lot of prior art for designers thinking this way. Grid layouts are like a century old, Designing Programmes (the book I mentioned at the beginning of this post) was first published in 1964, Bruce Mau was creating generative logos when you were still using Myspace and most of your favourite videogames use these programmatic ideas to generate their soundtracks.
Maybe there’s something in it. If there is, it probably shouldn’t be as aggressive as this example. But it’s really fun whatever.