Allen Pike 2015-07-04T09:49:02-07:00 Allen Pike The Supply-side Blues 2015-06-30T18:00:00-07:00 Allen Pike <p>Brent Simmons recently published a piece <a href="">on the advent of building indie iOS apps for love</a>:</p> <blockquote> <p>The platform is awesome. We love writing iOS apps. It’s fun and massively rewarding in every way except monetarily. As a craft — as a budding art form, perhaps — it’s juicy. […]</p> <p>Write the apps you want to write in your free time and out of love for the platform and for those specific apps. Take risks. Make those apps interesting and different. Don’t play it safe. If you’re not expecting money, you have nothing to lose.</p> </blockquote> <p>Like it or not, indie apps are becoming like indie games and web pages: markets that behave more like art than technology.</p> <p>Programming hasn’t traditionally been thought of as a medium of creative expression. AutoCAD wasn’t written for the love of user experience. Windows’ developers weren’t in it for the art. At least, I hope not. Software has historically been written by Real Businesses to make Real Profit.</p> <p>Rather than thinking about it in terms of art, it’s been popular to consider software development through the lens of engineering. How can we, as a discipline, learn to solve the computing problems of the world in an efficient, reliable, and predictable way? How can we ship fewer defects in less time? How can we mathematically prove that this code is correct, assuming <a href="">a spherical cow in a vacuum</a>?</p> <p>Yet as crucial as business, profit, and spherical bovines are to the field of software, in modern times we have seen the rise of a new kind of indie software creator. These folks create software as an outlet, polish it as an obsession, and release it as a form of expression. These people create indie games, fan pages, interactive art, and now, apps. 
If you’re in the business of selling apps, this is problematic.</p> <h2 id="creative-expression-now-with-in-app-purchase">Creative expression: Now with in-app purchase</h2> <p>Now, having people around the world creating something for fun doesn’t necessarily cause a problem. Hobbyists enjoy the process of making a thing without aspiring to sell it. A hobby guitarist can be heard at their home, and that’s it. An indie guitarist, though, can be heard on YouTube - along with a million other indie guitarists. Any creative or artistic outlet, when coupled with the internet, creates an astronomically large supply - usually higher than any market could support.</p> <p>This issue arises with all forms of art. Even when parents see their children excel creatively, they’re of course proud, but often become wary. Your tuba playing sure is nice, kid, but how about you keep your grades up so you can get a real job one day? Sure, it’s kind of awful to discourage a kid from their wild dream of being a professional tubist, but you know what? They’re <a href="">gonna have a bad time</a>. The cliché knows best: artists starve.</p> <p>Musicians can’t get a record deal, actors move to LA just to wait tables, and I’m pretty sure painters are legally prohibited from making a decent living until they die. Writers, dancers, comedians, creators of expansive postmodern installation pieces - everybody who tries to make a career out of doing it for the love will struggle. <a href="">The vast majority of them, unfortunately, can’t sustain it</a>.</p> <p>Most frequently, folks end up with some middle ground between the love and the bills. There’s <a href="">a classic Ask Metafilter answer</a> on the topic of how to spend your time doing what you love. While it’s worth reading in its entirety, its approach is a common one among artists of any kind:</p> <blockquote> <p>I am a director, but I’m not a working (as in paid) director.
To pay my rent, I have a “day job.” I COULD work as a director, but I’d have to direct plays that I don’t want to direct. For some people, that would be fine. For me, it’s not a good trade off. I’ll be more happy with the day job and the ability to direct whatever I want – forgoing pay.</p> </blockquote> <p>There’s a reason that funding for the arts is a big issue - there is such a high supply of people who want to create, and not enough market demand to match. There are so many people that want to publish a cover of their favourite Beatles song that nobody’s going to pay you to do yours, although I know your rendition of Blackbird is a unique and special snowflake and would love to hear it later.</p> <h2 id="the-love-era">The love era</h2> <p><img style='max-width: 100%' src="/images/2015/monument.jpg" width="300" /></p> <p>Of course, app development isn’t as pure a creative outlet as, say, theatre is, and correspondingly the app development business isn’t the horror that the theatre business is. There are various success stories, and an endless supply of jobs just adjacent to the “do it for the love” indie app dream, where you can build nice software for real businesses that have real revenues - or, at least, the funding to try something crazy.</p> <p>However, when expressing frustration with the current economics of the App Store, we need to consider the effect of this mass supply of enthusiastic, creative developers. As it gets ever easier to write apps, and we’re more able to express our creativity by building apps, the market suffers more from the economic problems of other creative fields.</p> <p>The good news and the bad news are the same: we’re extremely lucky to be paid to do this. In our careers as software designers and developers, we’re able to create and share things we love, and we’re able to make a decent living. 
With luck, we’ll still be able to do both at once.</p> Fix your Mac with one weird trick 2015-05-31T18:00:00-07:00 Allen Pike <p><img style='max-width: 100%; margin-bottom: 0' src="/images/2015/pentium-snail.jpg" width="300" /></p> <p>Recently, my 13” Retina MacBook has become very slow. It was never a snappy machine, mind you. Even brand new, its integrated graphics could barely handle the display. When I decided against returning it in 2012, my review summary was “<a href="">Awkward</a>.” Three years later, it still feels like a compromise, and <a href="">web browsing performance has gotten even worse</a>.</p> <p>Around the time Yosemite arrived, though, I began to experience a whole new kind of slowness. A kind of slowness that reminded me of my Windows 3.1 days. I could just switch apps and watch as the integrated graphics struggled heroically to render a window piece by piece.</p> <p>Waking the computer from sleep became an intricate ceremony, which I will now describe in detail. First, tap the keyboard. Once the computer lights up, wait until the password prompt appears, and then wait a few seconds until the input cursor starts flashing. Now, you might think you could start typing, but this is futile - the system will ignore all input at this stage. <strong>The computer is doing something very important</strong>, and the flashing cursor is not some carte blanche to just start typing. No, to determine when the dialog begins actually accepting input, you must tap some keys (or, if at this point for some reason you’re frustrated, you may prefer to mash all over the keyboard) for a few seconds until you start to see masked characters register. Then you can delete the nonsense you typed, type your password, press return, and then go grab some coffee while your computer resumes the various invisible Herculean tasks it was performing, like having Dropbox eat 100% CPU on multiple processes for hours, or leaking 20GB of memory in WindowServer.
Congratulations, you’ve woken a MacBook Pro from sleep.</p> <p>This type of thing grew tiring, but at first I tolerated it. The problems presented themselves gradually enough that I slowly boiled, like a lobster cooked in the tempting molasses that is a first-gen Apple laptop purchase. I spent some time taming Dropbox, deleting files, and killing background processes, but things got worse, not better. I evaluated newer MacBooks, but the 13” Pro benchmarks are barely any faster, and the new MacBook One is actually slower. Things looked grim.</p> <h2 id="desperate-times">Desperate times</h2> <p>Frustrated, last week I discovered something interesting. Searching around for info on the giant WindowServer memory leak I’d seen, I came across <a href="">an Apple Support forum post</a> describing the exact same problem! It had 189,000 views and 534 replies, so I knew I’d finally found something to soothe my MacBook’s suffering. Here are the steps it outlined:</p> <ol> <li><em>Disconnect external monitors.</em> Cool.</li> <li><em>Boot into Safe Mode.</em> I like it, serious stuff.</li> <li><em>Fix disk permissions.</em> Okay, that’s pretty retro but I’ll go along with it.</li> <li><em>Reset your SMC.</em> Really? This isn’t going to work, is it?</li> <li>A notice that this is the most crucial step: <em>Zap your PRAM.</em> Noooooooo</li> </ol> <p><img style='max-width: 100%' src="/images/2015/mac-dummies.jpg" width="250" /></p> <p>For those who are new to the Mac platform, zapping the PRAM is an age-old tradition that goes back to the classic Mac OS days. Even as a child, I was taught that when you had weird behaviour on your Mac it was time to zap the PRAM, which would promptly do nothing. Zapping the PRAM is number one on the list of desperate stuff to try on a misbehaving Mac that usually doesn’t fix the problem, outranking the trusty disk permissions repair and the perky newcomer, resetting the SMC. Zapping the PRAM is folk magic.</p> <p>Yet still, I did it.
I don’t know why I did, knowing it wouldn’t work. I guess it’s just a sign of how frustrated I’ve become with the modern deluge of software issues, and how desperate I was for a computer that just worked okay. I combined all three infamous Mac troubleshooting tricks into one leap of faith.</p> <p>The weird thing was that it worked like a charm.</p> <h2 id="troubleshooting-fatigue">Troubleshooting fatigue</h2> <p>There was once a time when I wouldn’t have endured months of worsening computer performance. I would have promptly blocked out a day and tried every solution I could find or think of. I would have reformatted my machine, tinkered with the running processes, and done whatever it took to keep my pride and joy running smoothly. In 2015, I couldn’t even take the time to bring it in to the Apple Store. To some degree that’s just because I’ve grown up, and I’m less focused today on computers and more just interested in what I can make with them. More than that, though, I think it’s that there are now too many computers that want my care and attention.</p> <p>I have issues I’d like to sort out on my laptop, my desktop, my phone, my tablet, my backup appliance, and even my damned watch. Don’t even get me started on the satanic being that has possessed our Apple TV. Thinking about all those various problems at once, it’s easy to feel like Apple’s software quality has declined, but I’m not sure that’s the case. The quality could even be twice as good as it once was, but when everybody has half a dozen devices, each with its own operating system, bugs, and updates, a small number of issues per device adds up to an intolerable mess.</p> <p>As we own more and more computers, they need to actually get more reliable, even as they handle the added complexity of talking to each other.
Today, people are far less likely to have the bandwidth to dig in and troubleshoot a problem device - even if all it would take to fix it is one weird trick.</p> <p><em>Update June 2015:</em> The fix didn’t last - the dreaded wake from sleep issues recurred after a week or so. I suppose El Capitan is our only hope.</p> User agents of change 2015-04-30T18:00:00-07:00 Allen Pike <p><img style='max-width: 100%' src="/images/2015/ie-edge.jpg" width="250" /></p> <p>Yesterday, Microsoft released a preview of Edge, their next-generation web browser. Edge’s new rendering engine brings it more in line with modern layout engines like WebKit, and finally introduces a modern replacement for Internet Explorer. IE’s dark past means that millions of existing websites serve it old and busted markup and JavaScript, which should thankfully no longer be necessary with Edge’s modern engine. As such, it was time for Microsoft to revisit the browser’s user-agent.</p> <p>For Edge, they worked to remove the <a href="">gross middleware junk that cluttered IE’s user-agent</a> and simply advertise Edge as a modern browser that can handle modern web apps. With this in mind, Edge identifies itself with the following <a href="">new, streamlined user-agent</a>:</p> <blockquote> <p>Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36 Edge/12.0</p> </blockquote> <p>Short and sweet. The user-agent is more thorough on mobile:</p> <blockquote> <p>Mozilla/5.0 (Windows Phone 10.0; Android 4.2.1; <em>DEVICE INFO</em>) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Mobile Safari/537.36 Edge/12.0</p> </blockquote> <p>That is to say, Microsoft Edge claims to be every computing platform ever conceived - except for Internet Explorer. 
On its surface, this bold claim is surprising.</p> <h2 id="i-am-everyone-and-no-one">I am everyone and no one</h2> <p>The user-agent HTTP field was conceived in 1992 with a clear and simple purpose: let browsers identify themselves to websites. It let web developers collect stats about how many luddites were using “NCSA_Mosaic/2.0” and how many hotshots were using “Mozilla/1.0”, the Mosaic killer officially known as Netscape.</p> <p>Netscape did in fact kill Mosaic, and it did so by adding more features. By the mid 90s, savvy web developers were checking the user-agent for “Mozilla” so they could send Netscape fancy new markup but still support older browsers with plainer content. With this user-agent detection technique, developers could safely use Netscape’s JavaScript to pop up insightful alert dialogs, or serve fancy frame-based layouts that leveraged expansive 800x600 “<a href="">Super VGA</a>” displays. It was a crazy time, full of naive optimism and developers drunk on blink tags.</p> <p>In the meantime, Microsoft was busy developing Internet Explorer. As expected, they specified their user-agent string as “Microsoft Internet Explorer/1.0 (Windows 3.1)”. Unfortunately for Microsoft, and anybody who would ever need to make sense of a user-agent again, this meant IE 1.0 was served pages without the fancy Netscape functionality. These Mozilla-detecting sites made IE seem crappy - a designation it had yet to earn. The next version of IE instead shipped as “Mozilla/1.22 (compatible; MSIE 2.0; Windows 3.1)” and fixed the problem <a href="">once and for all</a>. IE users got JavaScript and frames, and web developers got <a href="">an endless cycle of pain</a>.</p> <p>In the 20 years since, every new web browser has been stuck with the same unpleasant choice. 
They can either fight the long, hard fight of evangelizing <a href="">feature detection</a> and battling one by one to get sites to update fragile and out-of-date browser detection code across the entire web… or they can just tack a new token onto the existing trainwreck and be done with it. Chrome could have simply launched as Chrome/1.0, but instead it made its debut as</p> <blockquote> <p>Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/ Safari/525.19</p> </blockquote> <p>The HTTP 1.1 spec <a href="">specifically discourages this</a>, since it further entrenches browser detection. Unfortunately, appending yet more junk to the user-agent is the least bad way for a new browser to get modern behaviour from existing websites while still allowing new code and analytics packages to identify it.</p> <p><img style='max-width: 100%' src="/images/2015/katamari.jpg" alt="Katamari Damacy" /></p> <p>And so the user-agent string has become a never-ending katamari, rolling up the token of every browser that was ever popular. After 20 years of accumulating more and more tokens, every HTTP request Edge makes has to include more than 150 bytes of text simply to convey that it is in fact Edge - a fact that contains perhaps two bytes of entropy. As things stand, the string will continue to roll on and on indefinitely until it is large enough to pick up buildings and oil tankers.</p> <p>Thankfully, an end to this madness is in sight. Analysis of major browser releases over the last 20 years shows that user-agents have grown in length roughly linearly at a rate of about 5 characters a year. This pace will eventually become unsustainable, since the popular Apache web server <a href="">limits header size to only 8190 bytes</a>. With this limit in place, user-agents can only grow at their current rate for another 1608 years.
The clock is ticking for browser vendors and web developers alike to work together to forge a new solution to this problem - before it’s too late.</p>
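<p>The two claims above can be sanity-checked with a throwaway Python sketch (the token list and constants are simply the figures quoted in this post, not fresh measurements): naive substring sniffing matches Edge’s desktop user-agent for every legacy token at once, and the doomsday arithmetic does indeed come out to 1608 years.</p>

```python
# Naive substring-based browser sniffing, applied to the desktop Edge
# user-agent quoted above. Every legacy check "succeeds", because the
# string deliberately carries a token for each browser it impersonates.
EDGE_UA = ("Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 "
           "(KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36 Edge/12.0")

for token in ("Mozilla", "AppleWebKit", "Chrome", "Safari", "Edge"):
    print(f"{token:12} -> {'match' if token in EDGE_UA else 'no match'}")

# Back-of-the-envelope doomsday arithmetic: Apache's default header limit,
# minus the ~150 bytes Edge already uses, at ~5 characters per year.
APACHE_HEADER_LIMIT = 8190  # bytes (Apache's LimitRequestFieldSize default)
CURRENT_LENGTH = 150        # bytes, roughly Edge's user-agent today
GROWTH_PER_YEAR = 5         # characters per year, from the linear trend

years_left = (APACHE_HEADER_LIMIT - CURRENT_LENGTH) // GROWTH_PER_YEAR
print(f"Years of headroom left: {years_left}")  # 1608
```

<p>Which is to say: any site that checks for “Chrome” or “Safari” will happily treat Edge as one of them - exactly the behaviour Microsoft was counting on.</p>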