The web in decay is the web by design

Reading Time: 25 minutes

I watched a city burn down before my eyes on a flickering feed. Each video perfectly designed to twist and sicken me, relentless and damning despite the distance between me and Marawi. On an early Sunday evening in 2020, I felt the cries of my country over a cataclysm that covered the sky in black, and every day, watched from the sidelines as new and resurfaced videos of death and destruction passed through screens and filters. A death ad infinitum, preserved forever as spectacle for the world. Technology preserves these moments, a gift and curse of eternity laid in human hands.

The internet revolution brought us both the curse of permanence and an inevitable brainwashing. With the size of the internet doubling every five years and feeds favoring the most masochistic, soul-crushing content (the kind that put doomscrolling in our vernacular), we live in an age of despair. The worldwide internet exists only as a promise; a medium that has seemingly rendered distance negligible (a hyperlink theoretically brings you nearly anywhere in a single click, as detailed in the Critical Atlas of the Internet) still has us contained to the closest spectacle, if we aren’t gated already. We know so much, yet so little.
Somewhere in the western world, the decision-makers dismiss the internet’s plight, blind, or even untouched, to the media and messages that tell these stories. When Twitter, Facebook, Amazon, Shopify, and dozens of other companies took action to curb domestic terrorism in the United States, it was not only a response to five (and counting) lost lives. In reality, it was a drawn-out realization of the millions of lives thrown away by the internet age––many of whom exist only as numbers, names left in scanned lists to decay. Footnotes to life, where the living are pummeled around until their being is ignored.
The most painful part of this is the forgetting. This may be entirely paradoxical, but this is the internet I breathe and live in––and if the internet has given me anything, it is the knowledge that I am exceedingly unspecial and my experiences never come alone. The most mundane things become permanent on the world wide web: The Million Dollar Homepage and all its decayed pixels and spam redirects, screenshots of a tweet that angered a Kaworu Nagisa profile picture left to follow OP through their next four usernames, an image shared and rehosted over and over. Yet our links are rotting, and the information we treasure is lost. Little web experiments I made over my high school years were purged from s2005016.students.dlszobel.edu.ph without warning; text files and notes sit on disjointed services that toss away data; overly-cited .edu class pages and death notes disappear.

The average life expectancy of a webpage is a hundred days, and this average is likely far lower for materials that live on social media. Today, half of the hyperlinks cited in United States Supreme Court decisions are broken; for scientific journals, the figure climbs to roughly 66%. Two decades ago, a website might have stayed up for two years and seven months.


The Attention Game

For years, startups have recognized this information overload and made many a revelation about the best way to get their existence noticed––a ticking bomb. This is where the “move fast and break things” mentality thrives, right next to the “execution is everything” mantra. Indeed, we live in a complex age where the richest men are victors because of luck and timing––far more incalculable than anything else, with persistence as the only consistent constant.
The idea here is that defeating the beast of link rot inevitably wins on attention and status. (You win against materials that simply no longer exist, of course.) Attention is a zero-sum game. Content has never been so abundant and impossible to dig through––we would sooner die than experience even a moment’s worth of the media uploaded onto the net. What one wins, then, another is denied. Humanity’s limited cognitive capacity has naturally built up constraints against a once-seemingly infinite world where distance and proximity meant nearly nothing. Instantaneous delivery of information from one point to another is near impossible when we exist in digital echo chambers; you assume you would care about the Philippine drug war, the reality faced by Uyghur Muslims, or even the worldwide solidarity and outrage of the Black Lives Matter movement. Yet they appear as passing materials, items waved in front of you (if ever) that dissipate as quickly.

It’s a shame that many of the world’s most important problems are also seemingly vapid; that man is inherently imperfect at the act of prioritization and absolutely terrible at judging importance.

Or: everything is important and worth fighting for––but nothing is life-changing. Accepting these two seemingly conflicting truisms is key to overcoming the untamable beast of a web we’ve grown into. Just as prioritization demands internal value alongside urgency in time, we come to accept that even the most mundane, original pieces of content on the internet are worth saving. I don’t mean spam bots (or maybe they count too, if only to understand their behavior?) and what’s lying around in my Akismet filter, but any item one puts out on the internet should be ours to save. Even more so if they expect it to be. Our data alone is invaluable, far too cheap for how easily we give it away now. Our memories become priceless, and the internet’s promise is the closest thing we have.
Listen to me. I’m twenty and have known nothing before the internet, and people younger than me grew up even more hyperdigitalized (I came in when the emerging world was getting laptops; my younger brother came in when iPads were an academic expectation and thus everywhere), with everything they’ve ever said existing here and now. I’m scraping together photos from senior year of high school––something that barely happened three years ago. I feel like these experiences sometimes exist as no more than a vestige. Yes––they happened over Skype, the shitty Facebook Messenger, Xanga, my first website’s cbox.ws––but my personal data on the individual level means nothing. I can’t retrieve it, and I’ve tried. Forum messages wiped, I desperately try to recover the last pieces of my writing still available on the Wayback Machine, the photos of my friends I edited and put together before deleting them in a flash of fury, deleted accounts wiping away time. The algorithm will never know that the things I most enjoy, I refuse to interact with. Human behavior is jagged and cannot be contained.

It might even be a minor form of gaslighting. The attention economy handpicks what survives, and no one remembers how easily loss comes. It was a mistake when Twitter announced that inactive accounts would be wiped to make their usernames available, the product managers behind it completely forgetting the accounts of the deceased that no one has any access to. The average user won’t be able to get the assistance needed to enter their sibling’s account, and the security implications of that are still debated anyhow. When I was a rabid teenager, I completed the radical act of clearing off my Facebook friends list because I felt alone and pissed at the world, or something like that. The only person I kept on my list was my grandfather, who had passed, his account barely at a few hundred friends, years before the company introduced its memorialization feature. My mom used to pass her phone to me, asking me to send some texts for her. I’d see that conversation with her dad still propped up on the list, hundreds upon hundreds of undelivered messages––even after his passing.

Everything saved will be wiped away. We overestimate the importance of our words. The things that matter most to us are likely not what’s preserved by another. Self-preservation in the age of internet decay must then become the priority. The average individual, like you and me, will reach in, nothing coming out. This is the lie of the internet, then.


The Rise of Undead Content

Low-resolution, overcompressed images littered with artifacts remain constant news feed items despite technological evolution begging us to drop them. “Needs more JPEG” has entered both our standard ironic vocabulary and our visual language, so much so that a new realm of meta-meme (see: deep fried memes) has spawned, using artifacting as another layer of commentary. Many a bored CS kid’s side project spawns tools that compete over who can compress images more shittily than the other.

The low-resolution internet will continue to exist and evolve its own language while flat, curved monitors become a regularity in households. This is by design. It’s harder than ever to download and save the content we see online, no matter how well-crafted or beautiful it originally is. (It’s even harder to parody things nowadays. Thankfully, YouTube Poop is difficult to miss.) Instead, we resort to third-party or system tools to capture things: asymmetric screenshots, Twitter video downloaders run by bots that store things for 24 hours, YouTube-to-MP3 converters passing around the same source code and CSS, identical to a decade ago, until they get shut down and revive with a swapped domain name a month later, iPhone screen recordings. If reposting is an option at all, it’s harder to embed information and commentary: representing the content without a share changes things completely, algorithmically. Sharing menus are complex and the previews look worse. The goal of all content on the internet is to directly correspond with someone. For many, this correspondence happens intuitively on the feed, where they see themselves as the first viewer; for others, it best takes effect on messaging platforms and in emails––where reposting and resharing content (to networks they may not even be on) is futile. We continue to design platforms that make sharing and consuming content ridiculously difficult, supposedly to pull users in, when we know this will always be met with resistance and may even cause friction in adoption. Such a natural action becomes a frustrating hurdle by design.

Tumblr post criticizing the stereotypical portrayal of gay men in media.

When content takes the form of a repost, reupload, re[something], we’ll of course see more artifacting, and likely no link to the original piece (sometimes unnecessary––unless there are comments or replies). I commonly see collections of Tumblr posts and threads on my Twitter or Facebook feeds as simple screenshots like this.
Locating the source content is impossible. hetcisphobia as a user is gone (we could attempt a name search and trace their previous usernames, but that’s not likely to work), and Tumblr search is so broken that finding this would be impossible even if we had all of its contents. Within its at least 21,883 notes, we lose many more layers of information: notes on Tumblr can mean standard likes and reblogs, but each reblog can also carry its own tags that are generally used for commentary, and notes themselves can be replies (not threaded into the original text) or reblogs with commentary added (threaded into the original text).
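To see how much a flat note count hides, here’s a toy sketch of the shapes a “note” can take (my own model, not Tumblr’s actual schema); a screenshot preserves none of this structure:

```python
from dataclasses import dataclass, field

# A toy model of Tumblr "notes": a flat count like 21,883 collapses
# likes, replies, and two distinct kinds of reblog commentary.
@dataclass
class Like:
    user: str

@dataclass
class Reply:                        # a comment, not threaded into the post
    user: str
    text: str

@dataclass
class Reblog:
    user: str
    tags: list[str] = field(default_factory=list)  # tag commentary lives here
    added_text: str | None = None   # threaded commentary, if any

notes = [
    Like("a"),
    Reply("b", "this."),
    Reblog("c", tags=["#important", "#op is right"]),
    Reblog("d", added_text="adding context to this:"),
]
# The count reports 4; the commentary layers are invisible inside it.
print(len(notes), "notes,", sum(isinstance(n, Reblog) for n in notes), "carrying reblog layers")
```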

A graphic I made on Instagram reposted by a Facebook page, hitting 128,000+ shares

Worse instances of this happen in today’s culture of Instagram activism and social media journalism. Reputable news outlets continue to put up paywalls, and sharing links is discouraged on platforms like Facebook and Instagram. (On Facebook, posting a link with a preview makes your accompanying caption unshareable unless the intended sharer enters a specific menu; Instagram only gives you one profile link, links aren’t clickable on posts, and you need at least ten thousand followers to include swipeable links in your stories.) In response, graphic designers condense information into swipeable squares and story formats, trying to fit all the key points onto the image.
In this example, I failed at adding sufficient context to this call for justice in Fabel Pineda’s case. The second accompanying graphic, which the Facebook page didn’t share, also deals with police brutality numbers in the Philippines compared to the United States. When images are reposted, the essence of materials like this goes to waste: there are no calls to justice or action items left; it becomes material for anger or desensitization at worst––material for sensationalism on social media at best.

During the 2020 United States presidential election, Republicans began circulating claims about the fraudulence of absentee ballots, which tend to lean left (since, you know, sane people would rather not vote in person in the middle of a pandemic)––Democrats must be manipulating votes, letting dead voters cast ballots! Screen recording videos like the above circulated on Twitter. They looked convincing and unmanipulated: there was a census record, address, obituary, and full name of a voter who should have passed long ago; inputting these details into the ballot checker, however, showed that they had turned something in! The screen recording must have been proof––but of course, the claims were unsubstantiated, attributed to clerical error, and ignorant of the fact that any real manipulation would probably be more subtle than this.

We do know, however, that the screen recording is, for now, a window of sincerity and truth in an easily editable internet. The recorded swapping between applications to get to his chat makes it look more genuine, even if you didn’t follow the same steps on the website he shared (and I’d gauge that many believers did not have to replicate those steps to believe it). This is the same reason taking a photo of your computer screen after an “Inspect Element” edit adds another realm of believability, even if there’s really no difference in the layers of manipulation done. Touches of human interaction with digital interfaces breathe life and realism into our engagement with cyberspace: Instagram story pen markers over screenshots and collages make things more engaging, skewed croppings as if we’re slicing and presenting specific parts of content to our friends. Representing material as an intentional capture rather than a mere link tells us of new behaviors, a gesture of “you don’t have to leave our conversation” or “see what I want you to see from this article.”


While a more genuine sense of realism thrives when humans touch and represent digital content, we also lose context. The issue with digital journalism and the prospect of civic engagement comes to this: the nature of mass-scale reporting, or any type of reporting that deals with facts, is impossible to navigate given the way humans communicate and deal in information. We love anecdotes, stories, personal retellings, and encounters. Exaggerations done right are a love language; facts do not move us. The more viral and worthy of telling something is, the more it has been reposted and divorced from its original context; someone on the internet will always say it louder, and this retelling is a necessary evil to reach the largest audience. Inherently, this poses problems for reporting that attempts to link to posts and trace public discourse, instead turning very real words and commentary (even if they come under a veil of anonymity) into quotation games and blanket statements.

Technology gives us countless platforms and tools to make our voices heard. Words that blanket the earth, accessible in a click, no publisher needed on the most famous platforms and reels––yet we regress into the same flaws of the broadcast and the broadsheet.

Content disappears faster than ever. This is the truer reason why reposting is barely viable. We have these traces of information and sayings, but without source links, they remain of questionable authenticity. Tumblr URLs and posters die; anonymous names and erratic usernames leave us unknowing of where they went next. Other platforms are less forgiving: Removeddit can preserve most Reddit threads (including the poster usernames of deleted comments) if they’re not removed immediately, but Chrome extensions exist to nuke your entire history better. Privacy settings shut off people’s trails completely. Twitter used to show the usernames of private accounts that people were replying to, but now they’re obscured. For others, this existence is nefarious and ghastly, as getting quote-retweeted by a private account is untraceable; all you see is the little numerical indicator on your tweet rising, with nowhere else to look. The Reddit thread that infamously misidentified the Boston Marathon bomber, a worldwide wreck, was cited endlessly by journalists praising the work of anonymous internet sleuths––until the real suspects were named and their unfortunate target revealed. Edited, manipulated screenshots have the same authenticity as the deleted real thing unless the investigator looks closely for traces of human intervention, Photoshopping, or sheer impossibility of origin. While many comments on the /r/WTF thread were deleted, others remain as a grim marker of how the internet must be preserved for its wrongs as much as for its truths.


Of the age of no icons and link rot

Link rot is escalating as we move into an internet age that respects the policing and moderation of content, moving to avoid atrocities like the 2013 Reddit incident.

John DeFoe took a look into his fourteen-year-old abandoned blog, a format any LiveJournal, Blogger, or Tumblr kid should be familiar with. A little like our social media feeds today, but blogs have a more tangible, spatial component to them: you occupy your own part of the web. While DeFoe’s domain registration had long expired, Google had been pretty good about preserving Blogger content. He dove into his posts, seeking the fate of his hyperlinks.
He found that only 18% of external hyperlinks still resolve properly. While links to viral videos were generally dead (DeFoe’s blog was a part-time viral-video curation site), he could easily retrieve them and find another copy. Chocolate Rain, the Rickroll, Nigahiga’s early videos… these are just the tip of the iceberg of the internet’s cultural gifts that stand the test of time.

For personal content exchanged between friends and family, this is a different issue.

Theoretically, large tech companies have access to everything, really. We’re warned that nothing deleted is ever really deleted. Attempting to delete a social media account is pretty difficult, frequently reversible or padded with weeks of waiting before the deed can be done. This comes as no surprise when data has surpassed oil as the world’s most valuable resource.

Yet even in criminal investigations, state investigators frequently conflict and collide with tech companies that have been designed to track every trail and keystroke, predicting our behaviors, wants, and needs, knowing how often we stay in one location or another, visualizing our waking hours and desires. Data is “out of jurisdiction”, data is a tremendous problem, data must also be protected from ill-intentioned law enforcement agencies. It’s an enormous, painful grey line that we constantly tread.

When we look at our own personal situations and relationships with technology, we realize our profound unimportance and simplicity. There’s little chance that a tech company will surrender information to us or point us somewhere; all it takes is a copyright strike, a forgotten password, a deleted Photobucket… everything saved will be wiped away, truly. Recovering your childhood Neopets account is a grand heist in itself, articles about recovering your old Myspace account get more attention than present-day Myspace itself, and unless you remember your 38th private Twitter username change –– good luck finding it at all.

When I was twelve, I hosted and administrated a Minecraft server for my friends. It was one of the go-to servers for the school, and I was ridiculously proud; LogMeIn Hamachi presence and all, because I couldn’t port-forward our IP address. One Saturday, we gathered to create a server “teaser” video set to the entirety of Matchbox Twenty’s “How Far We’ve Come”––a masterpiece, really. It’s gone.

It must have been struck down over a copyright issue, and navigating my twenty-plus Google Accounts brought me no luck. It’s the most heartbreaking thing. I remember putting it together painstakingly on a cracked version of Sony Vegas Pro, recorded with a cracked version of HyperCam (Registered, this time), and drowning it out with video effects, gifs, and blending modes. I even timed in subtitles.

It seems minor, but it’s also one of the truest records of memories I have with my friends at age twelve. If it turns up again, I wouldn’t know what to feel.

No one really expected Twitter, and the giant tech companies that followed suit, to deplatform the President of the United States, signaling a new era for voice, platform, and the limitations of cyberjournalism. However, like Ted Cruz’s Twitter likes, Trump’s words remain only as a memory to the general public despite the massive documentation and archival done (since they are records of the President’s words). Words that could anger and incite a world in seconds, without care for geographic boundaries, turn into hollow embeds and dead links overnight.

Attention and public interest also aren’t enough to save content. The use of embeds as citations is common practice in news articles. Embedding a tweet lets us witness all the changing variables of its authorship: the growing and shifting number of replies, likes, and retweets as its impact spreads; the author’s photograph and name; the public response and thought associated with that item. Instead of screengrabbing something without its associations and full metadata, we’re able to give our audiences the power to dig deeper––a powerful thing that no other piece of media can truly replicate. (What if you could reference a movie scene and directly see the screenplay, surrounding elements, referenced materials? What if music journals had pointers to the exact moments in songs where the hook falls apart and leaves a dissatisfied pit in the critic’s stomach? What if the television were interactive and could show you all the progress updates on a report and let you swap to the weather if you needed a quick check? Wait…)
Twitter’s ban of Donald Trump––quite possibly the most dangerous social media user in history so far, for the sheer magnitude of his reach––led to the ultimate case of link rot across the internet. Even with his rhetoric, it was impossible for me to comprehend that this would ever be an issue: he’s a public figure, his tweets are safeguarded and archived, and he’s likely one of, if not the, most quoted and known figures in the waking world. A single statement impacts hundreds of nations.

Even if something survives the internet’s decay, the loss of context may as well render it useless. We can pass down knowledge through anecdotes, as if it were tribal, or salvage screenshots; either end puts us at odds with the constant dichotomy presented by technology. In Are We Human?, Beatriz Colomina articulates this perfectly: we face the magnificence of technology as the most human thing, as we are the only species that creates tools to make tools––reinventing ourselves with artifacts––and also the wrath and dangers of technology in its wiping of us, and perhaps, in its existence and our misjudgment. Tools will remain tools.

Undead content exists at the precipice of being erased, manipulated, forgotten, or untraceable. To the online citizen, it could very well be all of them. This doesn’t stop it from dominating the web. As the internet mimics the real world, content becomes more meaningful when we turn it into abstractions, ideas, and passive remarks. We only scrutinize content that doesn’t serve our agenda or interests. Humans simply don’t have time to wade through what the internet has given us, yet we encounter the web and its conversations with an unparalleled amount of trust. Even when links and stories are long dead, we retell them in ever more comments, paraphrase them terribly on blog posts that will be dissected for SEO farms, and find them once again in their JPEG glory somewhere in our emails. They are inescapable: a matter of the internet shaped by us, for us, glorified by us.


Beyond content

My biggest frustration with today’s internet is that its design and structure innately reward sensationalism while maintaining no standards for preservation. The movement towards curation, over the present day’s influencer domination, reflects a necessary need to free us from information overload by returning the work of the digest to the hands of select curators and the larger community at once.

  • Many review websites aggregate the tastes of individuals into larger lists, letting you traverse collections by individual taste. A Letterboxd account serves as a user’s complete diary and review collection, be it casual watcher or IndieWire critic. Both collective ratings and individual reviews tell the story: when purchasing something on Amazon, you need to look at both the numerical ratings and the individual reviews.
  • Superorganizers writes of Robert Cottrell, the world’s “most read” person, who writes a newsletter that seeks out the five best articles of the day. To accomplish this, Cottrell readies his iPad and pulls hundreds of publications into his personal RSS feed (a practice that is itself dying), article strippers, and the usual front-page readers like Reddit and Hacker News. As he admits, he skims, speeds, and digests content to pull this feat off, missing out on materials that don’t immediately hook you in the first few lines. Many of the greatest pieces of writing will be missed by acts like this, but it at least handpicks more effectively than slow, conscious reading––though as he produces selections of the five “best” reads each day, are these really the ones that instigate, question, and provoke?

Like any curative act, there will always be extremes, and averages are called that for a reason––they’re not very interesting. Ideally, a balance should be struck somewhere between the individual voice and the collective’s want; we must live in a world of no idols while also grasping the entire spectrum. How do we keep the internet, in all its magnitude and grandiosity, from falling into the same pitfalls as the printed word?

Headlines accompanied by pretty illustrative pieces offer us little of what truly matters: the links and networks intertwined with our content. Social media feeds are even worse. Before Twitter introduced its quote tweets panel (which we were still suggesting as late as 2019), the only option to investigate any ensuing conversation from a tweet was to copy its URL, manually strip the UTM code tacked on, and paste it into the Twitter search bar. It took years for Twitter to implement this functional move when billions of tweets had long been making use of the feature. Today, we see the company experimenting with forcing people to encounter the Quote Tweet panel (where the original source is viewable and highlighted) before retweeting any item, while any option to disable replies (such as the most constricting: only letting mentioned users reply, usually done on tweets with no @s) still allows quote tweets.

Today, we see chains of Trump’s deleted tweets as bygone records. Alt tweets and mistakes slip by and become material for the rumor mill. Historians will be digging into the age of a society that lived in these moments, thinking they would be preserved and documented––until they weren’t. The whiplash of realizing that our supposedly near-eternal digital records can vanish is jarring, decentering us from conversation. When we exist in real life, we at least have physical manifestations and signals that let us latch onto memories better, interactions that confirm vivid encounters, and the intuition to turn to paper or the net to write things down. The internet doesn’t have this. We’re navigating a lull of the web where environments are ephemeral until they aren’t, where we have no control over the ever-changing conditions of cyberspace. This is the internet of the dead, of false memories, and of gaslighting. I live my youth in pixels and lose it as quickly as it happened.
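That old ritual is simple enough to sketch. Here’s a minimal version (the URL and its parameters are illustrative): drop everything after the `?`, then paste the bare status link into the Twitter search bar.

```python
from urllib.parse import urlparse, urlunparse

def strip_tracking(url: str) -> str:
    """Drop the query string (?s=20, utm_* tags, and the like) from a
    shared tweet URL, leaving the bare link that Twitter search matches."""
    parts = urlparse(url)
    return urlunparse((parts.scheme, parts.netloc, parts.path, "", "", ""))

# An illustrative share link with tracking baggage attached:
shared = "https://twitter.com/someuser/status/1234567890?s=20&utm_source=share"
print(strip_tracking(shared))
# -> https://twitter.com/someuser/status/1234567890
```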

Preserving the web

Envisioning the future of recordkeeping is terrifying in the internet’s current state. Millions of tweets, posts, and comments are defunct. World-changing broadcasts wither, unrestored. We were promised that the move from analog to digital would let us preserve moments, but that couldn’t seem further from the truth when we struggle to follow even a day’s conversation across an expanse of broken links.

There are many efforts in academia that currently exist to better preserve the web for historicity. You may be familiar with some of them.

  • Perma.cc focuses on helping scholars, journals, courts, and others create permanent records of the web sources they cite. It’s run out of the Harvard Library Innovation Lab.

Tools and initiatives remain just that. Independent efforts to preserve the web are only so effective. It will take massive cultural and practice-based efforts to reform the web into a joint project of preservation––a responsibility that designers, journalists, and humans alike must share. We use the web in detached, disjointed phases where our communication methods do not match our tools; designers have a long way to go in balancing security, product delight, and archival. Mindfulness from every party is the only way to bring back the eternal side of the internet.


By design

The web is currently architected to make sharing content difficult even as it produces tools that make creating original content easier than ever; we need to better investigate both our human relationship with communication and design’s role in constraining free information sharing. Because humanity has never been able to communicate and exchange thought at such a rapid pace, we may need a radical rethinking of the standards and practices we employ when referencing ideas––to archive, restore, remark, and preserve, respecting both provenance and posterity, making full use of the tools we have developed, with intentional awareness of what these materials could mean.

The goals of the human user (all of us) and of the designer––and all the historians, journalists, and communicators within––must converge in the name of preservation and longevity, of reanimating the web into the space it can fully be. Our tools must inch us closer to a future that respects the content we can so easily craft, and we must take extra steps to propel this treatment of content into the mainstream.

The role of designers

On Reddit, text posts follow a very standard format: a 300-character-limit title and a markdown text body. Different communities spin this format around: /r/AskReddit drops the larger text body and restricts the questioner to the title box alone, while people with questionable moral compasses in /r/AmItheAsshole start each thread with the characteristic “AITA” before writing biased paragraphs about real-life messes. Each subreddit has its own cultures, norms, memes, reposts, and drama, carried on by tens of thousands of visitors each day who absorb and partake in their favorite niches.

To better serve each subreddit’s purpose, moderators impose bots, tagging systems, and titling standards that play on this raw text post format. /r/tipofmytongue, where people struggle to name something they do their utmost to describe, is an excellent example of how subreddit moderators, under Reddit’s constraints, find new ways to make their communities work. The interesting part is not only the raw text post format, but how they’ve reshaped the comments system to reward contributors and solvers and get people to the correct answer quickly.

a /r/tipofmytongue post
  • Each post begins with [TOMT] and the format of the ask, in this case a [MOVIE]. This narrows it down pretty quickly, and subreddit frequenters who have established expertise in specific areas (be it web games or early-80s romcoms) can easily track things down.
  • When the ask is solved, OP responds with a “Solved!” and a bot promptly changes the post tag and comments a direct link to the comment with the answer (a minimal sketch of this loop follows below).
  • A point is given to the solver. User flairs (the gray box next to usernames) contain TOMT points, indicating how many times the user has solved someone else’s ask. It becomes a sort of gamification system.
  • Additionally, in response to a flood of questions being asked and abandoned, OPs need to comment on their own post before it’s visible on the subreddit.
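The bot’s core loop is simple enough to sketch. This is my own toy simulation, not the actual TOMT bot’s code––the classes below stand in for Reddit’s real objects:

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    body: str
    parent: "Comment | None" = None   # the comment being replied to

@dataclass
class Post:
    author: str
    flair: str = "TOMT"
    points: dict = field(default_factory=dict)  # solver -> TOMT points

def on_new_comment(post: Post, comment: Comment) -> None:
    # Only the asker saying "Solved!" flips the flair and awards the point.
    if comment.author == post.author and "solved!" in comment.body.lower():
        answer = comment.parent                  # the guess OP replied to
        post.flair = "Solved"
        post.points[answer.author] = post.points.get(answer.author, 0) + 1

# Simulated thread: a correct guess, then OP confirming it.
post = Post(author="op")
guess = Comment(author="film_buff", body="Is it Primer (2004)?")
on_new_comment(post, Comment(author="op", body="Solved!", parent=guess))
print(post.flair, post.points)  # Solved {'film_buff': 1}
```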

It’s not a perfect system (what if several people respond with the right answer and OP awards the latest commenter instead of the first?), but it shows how these communities have designed systems under their constraints.

From /r/AmItheAsshole to gaming communities mashing together megathreads of questions, leaks, and official announcements, other communities do all they can, constantly exchanging practices and bending the limits of subreddit CSS to establish these customs.

  • Doing updates on posts that garner a lot of attention is a pain; there’s no standard for backtracking on links. As accounts disappear and moderators scramble to verify questionable users, offering any sort of progression on subreddits is a nightmare. (It’s a common joke that Reddit search is flaming garbage, and it is.)

To make content more accessible, we need to encourage an era of design that prioritizes longevity and cross-functionalism.

  • It’s too hard to link to content we see on apps, and the number of websites that gate users––requiring them to download or create an account before they can see material––is ridiculous. Look at Pinterest’s obnoxious home gate, or the number of mobile websites that prompt you to download the app to look at one piece of content. Content will thrive on the internet when material can be cross-linked and referenced easily, whether on web or mobile. A counterexample: TikTok thrives (and Vine thrived) because their videos spread everywhere, still maintaining brand identity through the characteristic format and logo on the original material. It’s still easy to share and view them; the word-of-mouth spread and subsequent cultural significance make up for pushes to optimize downloads.
  • Reject platforms that destroy the basic building block of the internet: hyperlinks. To reject hyperlinks is to reject coherent thought, discovery, and exploration. Platforms that restrict us to linking to only other social media sites are the bane of today’s existence. Today, entire companies are founded on monetizing the one-link slot we’re given on platforms like Instagram. Beacons.ai, Linktree, Carrd, and other ridiculously simple site builders are beginning to see use by content creators, corporations, and mass organizations alike. Why did we base the web around hyperlinks then decide to do away with them?
  • A conversation is multi-directional, but social platforms aren’t designed to reflect this. We backtrack, stumble, stutter. (See how Honk treats real-time messaging––with no “Send” button at all.) We navigate several threads of linear conversation at once, on different platforms, with a friend, carrying acquired context that bystanders won’t comprehend.
    When these types of conversations are elevated and become reference material, things get complicated. We need new modes of citation and cross-referencing that don’t take away from these natural flows of conversation (e.g. Twitter’s decision to be gentler with links and @s when navigating text limits). Who will be the first publication to set this standard, or offer a first take at it?
  • Bi-directional links come with a facet of social awareness, treating websites not as linear structures or simple maps, but as deep, interconnected graphs. A link can bring us not only from point A to B or A to C, but A to Z, and any continuous mix of them. Webpages should be treated like nodes, letting conversations and interactions flow between different spaces on the internet (a tiny sketch of this follows the list). This is more reflective of human thought. Notion, Roam Research, and Obsidian are capitalizing on these modes of navigation and thought for knowledge bases.
  • Webrecorder offers a suite of open source tools dedicated to making web archival more accessible. I haven’t fully checked it out yet, but they have a wonderful list of “academic and memory institutions” (and I will now be holding onto the term ‘memory institutions’) that make use of their tools.
  • To aid humans in the act of preserving more idle moments (see: my Minecraft server trailer is not important in historicity, but it’s important to me and my friends, and unfortunately a very big defining moment of my childhood), we need to elevate the content modes and information that we store. Metadata about materials needs human support to be meaningful; materials do not exist in silos and in isolation. If we design interfaces and systems that encourage more mindful, manual input and interaction with metadata, we can inspire better behavior from the humans who nurture and grow these packets of information.
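To make the bi-directional point concrete, here is a minimal sketch (my own toy code, not any product’s API): every outgoing link also registers a backlink on its target, so the graph can be walked in both directions.

```python
from collections import defaultdict

# The web as a graph of nodes rather than one-way arrows: forward links
# and backlinks are kept in sync on every edge.
links = defaultdict(set)      # page -> pages it points to
backlinks = defaultdict(set)  # page -> pages that point to it

def add_link(src: str, dst: str) -> None:
    links[src].add(dst)
    backlinks[dst].add(src)   # the target "knows" it was referenced

add_link("essay.html", "perma.cc/ABCD-1234")
add_link("notes/link-rot.md", "essay.html")

# From the essay we can walk outward *and* ask who cites it:
print(links["essay.html"])      # {'perma.cc/ABCD-1234'}
print(backlinks["essay.html"])  # {'notes/link-rot.md'}
```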

The role of humans

Tools are what we make of them; the web, outstretched to what we can contain. When the tools we design make it easier for us to leverage them, it’s up to us to better capture experiences and moments––they’re ours to consume in the end.

  • We need content standards for linking to and referencing content. While we see what Perma.cc is doing for academia and the Internet Archive’s Wayback Machine for books and webpages, there’s so much information that archival efforts for day-to-day drivel are simply unsustainable––even if they hold individual meaning. If we care about the life we live on the internet, humanity must take steps to take care of its words. Screenshots and links must be given the same level of importance. We must curate items and name them, nursing the information we save on our hard drives with as much context and metadata as possible (one homespun shape of this is sketched after this list).
  • Recognize the gift of context in our interactions: that a response gestated in inquiry, thought, and processing is far more meaningful than an immediate reaction; that trust means questioning and inquiry. When we present an article to a friend, we trust that the entire thing bears material meaning––all its hyperlinks, the publication itself, the overall topic. We delve into it deeper, returning questions and more references. Thinking of someone when encountering something means sharing a snippet, and receiving a world of wealth back.
  • Our culture of creation has never been stronger. Everyone is a producer; everyone has the means to share their talents online. My friends and family are knitters, illustrators, freelancers, writers, bloggers, tweeters, photographers––each craft as valid as the next when they’re all given the same attention span on the high-speed internet. With this, we’ve seen significant movement in the value of giving credit. Everyone is a creator or curator; Reddit will mob posters who don’t source content, and “OC” is as serious as ever. This is much unlike a decade ago, when most people would give original pieces passing glances. Everything is watermarked, defended, sacred; production is a familiar act, and we protect because we know how it feels.

    This move towards giving credit where credit is due says a lot about an interlinked web. We know people will trace back sources, reverse-image-searching their way to original creators. Provenance is questioned less. It’s still difficult to navigate the wealth of information and how rapidly things spread––but we’re better at establishing cultures where content is respected and links are given to find pieces back to home. The easier it gets to give credit, the better we’ll preserve content––and give voice to where it started.

    Aided by designers, Twitter makes this easy for video. People can simply hold and tap on any clip and tweet it, no saving needed. When a video is tweeted, the video’s source is present at the bottom. (See: From Xiao Updates, thank you Venti!)
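What might that nursing look like in practice? A homespun sketch, with illustrative names and no real tool behind it: every screenshot we save gets a sidecar file recording where it came from and why we kept it.

```python
import json
import pathlib
import time

def save_with_context(path: str, source_url: str, note: str) -> None:
    """Write a sidecar .json next to a saved screenshot, so the capture
    carries its own provenance instead of rotting anonymously on disk."""
    meta = {
        "source": source_url,
        "saved_at": time.strftime("%Y-%m-%d"),
        "note": note,
    }
    sidecar = pathlib.Path(path).with_suffix(".json")
    sidecar.write_text(json.dumps(meta, indent=2))

save_with_context(
    "tumblr-thread.png",
    "https://hetcisphobia.tumblr.com/post/123456",  # illustrative, long dead
    "thread on media portrayals; 21k+ notes at capture",
)
```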

Today’s wealth of information, and our inability to properly utilize our technological tools in an age of rapid evolution, may be one of our most human challenges. Picturing the information loss we face if we do not immediately set better standards for preserving the web is terrifying.

Perhaps there is also an alternate future that treasures the web as a cursory thing: a mode of communication as brief as the idle conversations we have day-to-day, leaving the role of preservation to the analog. A passing interface that performs its functional role, then dissipates. Temporality as a product point or feature (disappearing messages, conversations, etc.) will need to come to a conclusion on whether everything is eternal––or nothing is. These two can’t simultaneously exist on the same, centralized web we have today.

The most radical attempts to reshape the human are typically carried out under this guise of reinforcing and protecting the human. Design is a paradoxical gesture that changes the human in order to protect it.

Beatriz Colomina, Are We Human?

It’s unfortunate that the age of disinformation and decay needs a large-scale rethinking of our current engagement with the internet. No prodding or nudges will solve this. We need new tools and media that birth an epoch of mindfulness, permanence, and meaning in our interactions; tools that will reframe our thinking. What is design if not something that forcibly changes our behavior in the act of human preservation?

Parting Questions

  • When will there be a guide to best practices for archiving the web?
  • Will the giants responsible for the platformization of the web make the act of digital archival any easier for us?
  • Is it foolish for platforms like Snapchat or Instagram Stories to brand themselves as “temporary” when temporariness is impossible on our internet?
  • Should the web exist as something organic, malleable, and destructible –– or as an eternal timekeeper?
  • Is link rot more of a technological issue or a human one?
  • Do humans want to know themselves forever?

Last updated January 23

In defense of disorder: on career, creativity, and professionalism

Reading Time: 15 minutes

This post is a draft, and currently being written in public (as you can see). This block of text will be removed when it is finished.

Professionalism is a lie; build what you love; explore everything. In today’s age of creation, anyone who attempts to tell you otherwise is lying. You’ll end up seeking whatever you traded away for the rest of your life.

My friends are starting to think about the job market. These are the most talented, selfless, and energetic people I know, who will drop nearly everything in an instant to respond to something they deeply believe in. At the same time, my circles are deferential and somber; they’ve never been ones to loudly put things on view. Putting together a portfolio is soul-crushing work, looking back––it’s as if nothing is worthy, and there is so much to do.
At Developh (which I started as a high schooler myself), I’m constantly listening to high schoolers dreading college applications, putting their whole selves on pause for a misjudged, glorified persona worthy of the academe. Trying to tell them about worth is futile, since it’s something that can only be learned through lived experience. Warnings are the best we can offer: that the world is a vacuum, that your whole self is incredibly more potent than you can ever imagine, and that like the rest of the life we all will live, this will be a game of luck and chance, skewed against those without privilege.

I grew up in the age of platforms and tooling, of forced loneliness and the internet as a place of respite. Newgrounds to DeviantArt, which became Twitter and TikTok; YouTube a constant yet ever-changing as the top names grew less familiar and I skipped out on phases (namely the creepypasta/indie horror Slenderman/Until Dawn phase). To satisfy my existence, creation was a necessity. The most meaningful community I had was a group of fourteen or so teenagers from all over the world, each other’s affiliates and friends, creating Pokemon fansites that offered icons, graphics, tiny pieces of writing, and maybe a JavaScript game or two. Playing through Pokemon Pearl was less important than the creativity I could wring out of it: thinking about character pairings, fantasy team compositions and gamemodes, indulging in creepypasta and refining lore. My goal then was to meet a content and fansite standard that could get me affiliated with my favorite site, The Cave of Dragonflies. I never did send an affiliation request and get my 88×31 gif button up there, but it was the anchoring quality marker I had––and I am the builder I am today because of it.


So I became a designer because of a fascination with a verbose, mechanical Pokemon fansite creator from Iceland. Then I became a graphic designer because I loved one pairing from the visual novel Dangan Ronpa so much that I committed to creating 365 pieces of content from it. Then I continued making fansites for Minecraft, learning how to make texture packs and hosting the tiny mods I would make as I administrated servers filled with friends of friends, learning how to be a shitty leader. Somewhere in between I learned how to make flash games out of a love for Pacthesis dating sims, picked up editing on a cracked version of Sony Vegas to make anime music videos, made a website to host mathematics reviewers I wrote for school, wrote a shitty novel influenced by a love for Terry Pratchett and Daniel Keyes. Everything a result of things I loved.

Then I realized that the real world was coming faster than I had expected. I did everything I could: signed up for conferences about things I didn’t understand, set up a LinkedIn in junior year of high school, went to events. In college, the same happened with job applications as I tried to craft myself into the ideal mold dictated by Medium articles, absolutely terrified at the prospect of coming to America and leaving jobless––I would be a failure.
A greater disservice would have been becoming something I wasn’t. I tried to recruit for software engineering and miserably failed, realizing that I wasn’t in love with the games of optimization and translating specs, and wanted something that worked better in spaces of ambiguity and scale, with direct interaction with people––something I was already good at, but didn’t believe was valid. I returned to design. Then I had a terrible time recruiting again as the pandemic hit (check out my failure resume), until I realized which spaces truly meant something to me.

I’m going to spend my whole life yearning for truth. I wasted two years in college following unproven paths set by people with far more privilege than me, paths I was doomed on from the very beginning. If I had been honest about the things I loved in high school, I would have gone farther.
And everything I loved, had I followed it, would have been tended further and gloriously. I dropped games because they seemed illogical, then and now; I see college students making millions off games they made in high school, establishing game studios that put to shame the crap built by “founders” recklessly searching for problems. Now, there’s a surfeit of people with the most resources and funding poking around issues that aren’t theirs to solve, infiltrating them in the name of false good.

All I needed to hear when I was younger was that the things I loved were worth something. We live in the age of tooling and creation. There is no shortage of platforms and resources that make it easier for us to build, but there’s also a lack of cultural understanding that turns young people off the things they’re ridiculously good at and love, along with an impending dystopic barrage of people who will tell you that there is a “right” way to build––that there is a “right” path to success.

Building is easier, but creativity is harder

We live in a weird time. Early 2021: a huge rise in the no-code movement and automation tools. It’s the time to build, they say. Individual makers are given more leverage than ever to operate at larger scales with the automation tools and platforms that make tasks easier, and the number of lines of code we have to write to make everyday products has been drastically cut. People put together conferences for thousands, gauge traction for platforms, and build audiences of millions, all without a line of code written.

This happens on social media, on WordPress blogs, on video editors that let people make fancams in minutes. We now have a collaborative, minimal Photoshop that works in the browser (Figma), and a free in-browser Photoshop clone too (Photopea). Cracking software is easier than when you had to dredge through LimeWire and a handful of websites––if you know how to do it. It’s ridiculous, and only going to get better and easier.

At the same time, it’s a very confusing era to be a technologist. Talking to young students, there’s more confusion than ever about getting started with the act of building.

As software has scaled up in complexity and in the number of toolchains, we’ve mistakenly tried to shift processes meant for larger organizations onto individual learning. I don’t mean that Git is unnecessary (I do sit alongside many computer science majors who know how to game office hours and come out with perfect pset scores––but have no idea what a repository is), but the act of learning how to build a website today is a bit ridiculous.
Frank Chimero’s “Everything Easy is Hard Again” details this perfectly from the perspective of a web designer. Like him, I made a Radiohead website (well, Fall Out Boy) at one point and still don’t know how HTML tables work. The issue senior, experienced technologists face is the burden of unlearning and relearning to follow the simultaneously cyclic and chaotic nature of web development––where often they’ll be relearning methods that were deemed bad practice years ago. Beginners, meanwhile, face their own obliviousness and lack of experience.
Now, it’s very rare that you can open up the source code of a website and glean anything meaningful from it. Styling and scripts are minified, content and rendering obfuscated from the eyes of a reader. Creating a website is no longer opening up a .txt file; you download command-line tools and work from the terminal––or you give up on all of that and use a website builder like Squarespace, which severely restricts what you can build, or leaves you to master whatever template is most manipulable.

The current generation of product designers likely started with folks who were tasked with moving things around the company website and liked that more than touching back-end systems, moving into the role with their cracked Photoshop in hand. They remain product designers because they’re excellent at working with systems, understanding people and metrics, and influencing culture––even if their personal visual languages may not align with what trends today. Now there are more designers who are only designers––it’s more acceptable, really. Less potential communication with developers, but folks who can do all the Pinterest styles: the large, blocky, playful illustrations that ran rampant on every homepage for a while (now inching towards the 3D trend), brutalist design plastered on your favorite y2k clothing brand’s Shopify webpage, colorful stickers on black, and tapered plastic-wrap textures.

But so much of this innovation only serves a few, really. Learning and tooling drastically disserve those who truly need them: learners from countries with limited infrastructure or ecosystems. No-code tools seem trendier for the FAANG product manager building a gig on the side, or for anyone who already had access to tools that did this, but worse. Webflow prices are a nightmare for Filipinos––which makes it hard for me to recommend, no matter how flexible and lovely it could theoretically be. These movements often seem lodged to support places where technology is already most pervasive, when we should be creating tooling and spaces for those most in need.
Other initiatives like Codecademy, freeCodeCamp, or Harvard’s CS50 reduce this gap for free. Bootcamps are doing this too, though they can oftentimes be exploitative, with profit-sharing models or exorbitant pricing and lofty promises of six-figure jobs. Still, even the good parts fall into the same trap: curricula that don’t match desire. Unless you know you want to develop websites and go through the same tutorial path every other fCC participant does, you won’t get into it. I know I fell into this field accidentally. The complexity around technology worsens this for young learners. Looking at charts and coding roadmaps is paralyzing; people fight over which programming language you should learn first instead of simply going for whatever and actually getting started.

Issues like tutorial purgatory and newbie paralysis ensue because of this. Although the common practices of my time weren’t perfect either, what I appreciated was the culture and, as Frank Chimero describes, the general legibility of the web. Things were decentralized and documented in detail on individual blogs, or on trusted documentation sites like W3Schools if additional reference was needed. I could find any interesting site I loved and take a peek into its source code, reworking things into haphazard imitations until they were uniquely mine.

If we live in an age of both accessible-but-not-so-accessible tools and structured, sometimes wishy-washy tutorials, what can we do? We compensate with creativity––the core of design practice that seems less and less mentioned the deeper we go into education or practice.
Any maker knows that no ideas left are new; everything that has been done is done. What, then, is more important than creativity for the present? While we navigate the battlefield of tooling, one constant remains: less time is needed for the actual act of mechanically designing or making. We’re adopting new theoretical models and frameworks for handing off designs, articulating changes, and making things meaningful to the right creator at the right stage; advances by companies like Figma, repl.it, and Glitch make it easier than ever to build things.

Image from Dribbble

A good example of why creativity matters is the much-complained-about Dribbblification of design. Everything looks the same. Design patterns and recommendations are literally just imitations of one another, especially on mobile products, even though their interfaces offered us a whole new realm of play. How can things look so stagnant when they’ve existed for far less time than the browser? People joke that developments in design systems have reduced the job of the designer to playing drag-and-drop with pixels, similar to what the Visual Basic editor might have been for programmers over a decade ago.

Finding original thought, meaningful sources of inspiration, and communities that reward creativity instead of subscription to the norm is also more challenging. Process and repetitive methods of thought will come, but creativity is innate and built. Creativity is relentless, but also easy to let slip by.

In Against Creativity, Oli Mould states that, for the nebulous concept it is, creativity is a power, not just an ability. Instead of an inert force, creativity is a proactive task that blends institutional and pre-cognitive knowledge, agency, and desire to create something that does not yet exist. Creativity is then better thought of as an agent of desire: what do you long to create, what do you have the power to create, and what can you create? These should be your guiding questions. We live in an age where life’s biggest uncertainty is not how we can make something, but whether we should––and for all the limited “shoulds” we have, we should create something that matters to us. This is the greatest thing that people want to seize. They want to capitalize on your time, effort, and creativity: not just the mechanical means of putting something together, which is repetitive, replicable, and easy to be outdone in, but the original thought, creation, and work you do that unwittingly sets new trends and movements. You wield this power, shaped uniquely yours. Be a force to be reckoned with.

So then: build what you envision; the tools for it are out there. Resist norms and patterns, but feel free to imitate them while you learn. Remember that building is easier than ever, but creativity is harder than you think.

Professionalism is for dummies

During the job hunt, one of the worst things I did was try to fit the mold of the perfect applicant. I made the same mistake with my college applications––and yet the essay that got me into my present university was my most heartfelt, sentimental one. Sincerity is something you can feel at a glance, I believe. I know it from rereading the words I wrote at seventeen.

When I say this whole act is for dummies, I don’t mean abandoning all etiquette; I mean that genuineness is a rare virtue that matters far more than anything else I know. Tweet about the things that matter to you and the things you love behind the screen; be more than a robotic presence articulating successes––we’re moving past that, if you’ve been keeping up. When our mentors are genuinely goofy or open about their fascinations, it’s easier for us to realize that 1) your entire being doesn’t revolve around your career, and 2) many things in life are just as important as, if not more important than, the work we do at these temporary companies. Seeing seniors talk about their children, obsess over niche musicians, and share meaningful pieces matters.
I had grown up assuming you should hide your entire internet presence while job-seeking, when mine has only ever benefited me––and hiding it was kind of unavoidable anyway. My life is completely traceable with a quick full-name Google search, this blog included, and I remain unapologetic about that. Through it, the things I build, believe in, and share are out there. (Of course, this only works if you’re not an asshole online. Hopefully you aren’t.)

Culturally, Filipino employers are unafraid to add you on Facebook; it’s almost an expectation. People celebrate their career accomplishments while sharing their latest Facebook Gaming stream. We know how to draw lines, and the people I’ve worked with share memes and exchange banter in familiar places. Humanity and empathy matter.
Spending this chapter of my life as a design student in North America, I feel ridiculously blessed and honored to have met so many incredible designers from around the country on Twitter. They’ve donated to fundraisers put up by Developh, they follow both career wins and inner thoughts, and I see their takes on social justice and equity in an increasingly tense climate. Real conversations and moments are coming from people who put their face next to their company, and making career decisions feels a lot less lonely. Trends spin from one place to another, many of them non-design. Humanity and advocacy matter.

We’re unlikely to find a workplace that perfectly aligns with everything we want: our inner advocacies plus a perfect salary––and then some––for the lifestyle we dream of. The few who do are very lucky, and often so vocally lucky that it frankly gets annoying. What matters instead is optimizing for growth and happiness, the actual non-negotiables in our quest to stay alive while doing good work.

What we can do instead is maximize our reach and assert our desires, so that we’re most likely to land in places that give us close to everything we want. This is why people declare what they seek, why we must talk about the things we’re passionate about, invest in them, and know the people we do. The more we talk about a place being our dream space, the more likely the people aligned with it will think of us for it. Dreaming can never happen in a silo.

More than anything, letting go of the expectation to be professional has made me much less narcissistic. Provide, give back. My turning point came when I realized the system was so broken that I had no energy left to put up a facade––and that the most decent people are also tired, and that we’re about to rebuild this entire process anyway.

There’s a distinction between self-centered ambition and shared ambition. It’s impossible to collaborate wholly and meaningfully when you fall into the former; the most meaningful and inspirational things are built by people, together. Especially as young creators/technologists/students/people, we should understand that our peers––even those who don’t self-promote, have no LinkedIn profile, and live life normally instead of decimating it by doomscrolling a career website––are no lesser or better than us.

The biggest lie is thinking that you excel at any one skill, or that you’re special. Early on, you are terrible, and likely a burden to any professional, capable team you join. More likely than not, you will be a waste of money at first. You are an investment.
Your personality, learning ability, and growth mindset are far more meaningful in the career search. This is why creativity and genuineness matter: they’re built up over time and are “unteachable” compared to technical skills, which will keep evolving anyway. Tone down the grandiose descriptors; what matters more is what you wish to become and how you’ve proven you pursue the things you’ve desired before––no matter how unacademic or unprofessional they might seem.

But this doesn’t mean you’re worthless. It means the investment placed in you (plz don’t think I mean salary is bullshit or whatever) is more about how you build, strategize, and grow––abstractions that come untaught and exist at different levels of lived experience. Just as we must treasure creativity, we must try to let go of measuring ourselves by technical markers at such early points in our careers.

All you do is all they dream of

All of this leads to one thought: holy shit, just do what you love. The things you love are worth something.

All these people so badly want to replicate the skills you have.

When you load up a Minecraft world you’ve been building for months with friends, spend hours exchanging fiction pieces with the most bizarre tags on Archive of Our Own, successfully set up the second-largest fan group for (Sandy) Alex G, know how to Photoshop yourself into a shitty selfie of Robert Pattinson, or have played around with ROM hacks even if you only got as far as changing the first dialogue screen of Emerald, you are creating. The commissions you do on the side, turning vague screaming into gorgeous pieces on tight deadlines, the Discord posters you make for fun, the anime edits and fanart animations you produce that seemingly have no professional value––keep doing it. Break the internet and build things even if they’re ridiculous to describe on paper; make things that genuinely fascinate you. Running a fansite that consistently delivers updates on Saoirse Ronan to tens of thousands of followers is a better signal of grit and potential than a fake nonprofit started in high school that will soon die out (even if everyone is doing it). Put together that fanzine and die over logistics and merchandise together; overexert on the things you want to see become beautiful, less on the ones that mean nothing. Impact is also a measure of what you yourself are personally proud of and love. Your body of work starts as early as now. You will remember the things you made out of sheer love far more than the corporate products you’ll be forced into.

Grow with your friends; we are all creators. Take harsh feedback and criticism, or mold a space where you don’t have to. Immerse yourself in the internet’s spectrum of beliefs, which you will be pondering forever from now on. Embrace levity; don’t take things too seriously. You are the most powerful generation in existence, but the world has also barricaded itself against you more densely than ever.
You will be more than fine.

Be creative. Share that creativity with people. The resurgence of bite-size tutorials on TikTok leaves me ridiculously optimistic for the future. The people selling month-long subscriptions to paid professional programs at a fraction of the cost make me chuckle; you can get the full thing yourself for free, or walk into the mall and grab a full Photoshop package for a dollar––once malls are back up, of course.

The most prominent, meaningful thing I learned this year is that love and passion matter. Your interests matter. The things that make people laugh and love and shine––proven by numbers, even unconventional ones––hold true. If I had told my high school self to keep building projects and telling stories that mattered to me, I would be far happier than I am today––and more successful at a craft I’m ending up pouring all my hours into anyway.
The biggest challenge at this stage of your life is resisting the norm and inventing your own. You are cultivated by communities that people are still trying to unbundle and understand: your long Reddit threads and tiny communities have long tended the close connections these older institutions and false learning systems can only dream of. It seems like this shouldn’t be revelatory, but it unfortunately is. Austerity is unnecessary; the pious are the most chaotic. For years I knew what I wanted and discarded it in the name of something more feasible, yet the world has let the most unnecessary of things thrive––what you believe in, someone else believes too. In the name of kindness, laughter, joy, and the arts, an advocacy doesn’t have to map onto the “sustainable development goals”. I believe many of the people who have mattered to you at one point or another carry significance without that constraint.

While walking the superficial route of professionalism, I realized that the people I want to work with––and will be working with––are really the kinds of people who mirror who I was at sixteen. You will fall back into it. It’s only human nature to want to love the life we live.

This is a generation of unyielding talent with so much against it, a natural result of years of making building easier––while cultural thought on what matters and what has value is, and will always be, delayed. Be yourself. Talk about what you really want to do. Personally, I’ll be living in pursuit of building better spaces for the kid I was when I learned all that Flash and code. Only in the work of making creation more ubiquitous will we find people naturally invested in the problems they’re most suited for.


I wish I had kept to what I loved earlier. I wish someone had told me that was a completely viable path.

I redesigned my website last week to feature mangacaps and anime. It’s the most fun I’ve had with a portfolio, one I’ve probably spent dozens of hours on over the past months––and it’s the best it has ever looked.

Test Stream

Reading Time: 2 minutes

Trying something I’ve wanted to do for a long time. Streamed Genshin Impact dailies today; not even my core group of friends has followed yet (RIP). https://www.twitch.tv/hotemogf

  • Maybe I can just stream whenever I play videogames… mostly compelled by how consistently I play Genshin Impact. To put an angle to it: Genshin Impact (especially Abyss runs) with a lore-focused approach to discussion––I am making my first fansite in over a decade dedicated to its lore, after all. Communities revolving around something new, like a podcast and a constant story in stream-of-consciousness form.
  • Indie games! I play a lot of them and would like to record my runs and share them with people. I always recommend games most people have never heard of, and I have Humble Bundles and libraries to sift through and enjoy. Finishing something in one sitting works great for me.
    I want to rerun Papers, Please and Stephen’s Sausage Roll (among many others), and go through the Sokpop collection.
  • My Dwarf Fortress world… (this one I’m afraid of actually being too stupid to stream for)
  • The PvE Minecraft server I run with friends and the banter that ensues. 🙂
  • Design streams of my personal side projects and experimental work, plus some productivity/educational work streams with Developh (Science & Technology, hello)––we’re planning some sort of streamer network to encourage creativity, connection, and the like; potentially discussing trends as we write and prepare projects and resources.

Is there anything you want me to stream?