Feed fetched in 61 ms.
Content type is application/xml.
Feed is 168,206 characters long.
Feed has an ETag of "6e4193a917c1ce8b47d7c3a64ad7b9f0".
Warning: Feed is missing the Last-Modified HTTP header.
Feed is well-formed XML.
Warning: Feed has no styling.
This is an RSS feed.
Feed title: cdixon
Feed self link matches feed URL.
Warning: Feed is missing an image.
Feed has 20 items.
First item published on 2023-06-22T00:00:00.000Z.
Last item published on 2016-02-21T00:00:00.000Z.
All items have published dates.
Newest item was published on 2023-06-22T00:00:00.000Z.
Home page URL: https://cdixon.org/
Home page has feed discovery link in <head>.
Error: Home page does not have a link to the feed in the <body>.
<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<title>cdixon</title>
<link>https://cdixon.org</link>
<description>programming, philosophy, history, internet, startups</description>
<atom:link href="https://cdixon.org/rss.xml" rel="self" type="application/rss+xml"/>
<item>
<title>I wrote a book: Read Write Own</title>
<link>https://cdixon.org/2023/06/22/read-write-own/</link>
<guid>https://cdixon.org/2023/06/22/read-write-own/</guid>
<pubDate>Thu, 22 Jun 2023 00:00:00 GMT</pubDate>
<description>I wrote a book: Read Write Own I believe blockchains and the software movement around them – typically called crypto or web3 – provide the only plausible path to sustaining ...</description>
<content:encoded><![CDATA[<p>I wrote a book: <em>Read Write Own</em></p>
<p>I believe blockchains and the software movement around them – typically called crypto or web3 – provide the only plausible path to sustaining the original vision of the internet as an open platform that incentivizes creativity and entrepreneurship. I’ve been investing behind this thesis for years, and advocating for it through writing and speaking and by talking to business leaders, journalists, and policymakers both here and around the world.</p>
<p>Through all that, it became clear that we need a comprehensive book that clearly explains new technologies like blockchains and the services built on top of them; how they fit into the history of the internet; and why they should matter to founders, developers, creators, and anyone interested in the history and evolution of business, technology, and innovation.</p>
<p>So I wrote that book: <em>Read Write Own: Building the Next Era of the Internet.</em></p>
<p>My thesis is that seemingly small initial decisions around software and network design can have profound downstream consequences on the control and economics of digital services. The book walks through the history of the internet, showing how it has gone through three major design eras: the first focused on democratizing information (read), the second on democratizing publishing (write), and the third on democratizing ownership (own).</p>
<p>We are on the cusp of the third era – own – so I explain the key concepts underlying it, including blockchains and digital services built on top of blockchains. The book therefore answers a common question I hear: “<em>What problems do blockchains solve?</em>” Blockchains solve the same problems that other digital services solve, but with better outcomes. They can connect people in social networks, while empowering users over corporate interests. They can underpin marketplaces and payment systems that facilitate commerce, but with persistently lower take rates. They can enable new forms of monetizable media, interoperable and immersive digital worlds, and artificial intelligence services that compensate – rather than cannibalize – creators and communities.</p>
<p>The book takes controversial questions head on, including policy and regulatory topics, and the harmful “casino” culture that has developed around crypto that hurts public perception and undermines its potential. And I go deeper into intersecting topics like artificial intelligence, social networks, finance, media businesses, collaborative creation, video games, and virtual worlds.</p>
<p>Inspired by modern tech classics like <em>Zero to One</em> and <em>The Hard Thing About Hard Things</em>, I wrote the book to be succinct, thorough, and accessible. I also distill cutting-edge thinking from technologists and founders to make it useful to practitioners. My goal was to make it accessible without watering it down. The book is meant for a range of audiences, including entrepreneurs, technologists, company leaders, policymakers, journalists, business thinkers, artists, community builders, and people who are simply curious about new technologies, culture, and the future of the internet.</p>
<p>I love reading books but believe that tech and business topics usually work better in shorter formats, which is why in the past I’ve stuck to blogging and tweeting. But accomplishing all of the above warranted a longer treatment, bringing new and different ideas together in one place. So I spent much of the last year doing this. Many of the ideas are ones I’ve thought about for a long time but never took the time to write down.</p>
<p><em>Read Write Own: Building the Next Era of the Internet</em> will be published by Random House on March 12, 2024. You can pre-order it <a href="https://readwriteown.com">here</a>.</p>
<p>Sign up for more book updates <a href="https://cdixon.substack.com">here</a>.</p>
<hr>
<p><a href="https://readwriteown.com/terminologyhistory/">More about the term and title “Read Write Own” here.</a></p>
]]></content:encoded>
</item>
<item>
<title>NFTs and A Thousand True Fans</title>
<link>https://cdixon.org/2021/02/27/NFTs-and-a-thousand-true-fans/</link>
<guid>https://cdixon.org/2021/02/27/NFTs-and-a-thousand-true-fans/</guid>
<pubDate>Sat, 27 Feb 2021 00:00:00 GMT</pubDate>
<description>In his classic 2008 essay “1000 True Fans,” Kevin Kelly predicted that the internet would transform the economics of creative activities: To be a successful creator you don’t need millions. ...</description>
<content:encoded><![CDATA[<p align="center"><img src="images/nfts.png"/></p>
<p>In his classic 2008 essay “<a href="https://kk.org/thetechnium/1000-true-fans/">1000 True Fans</a>,” Kevin Kelly predicted that the internet would transform the economics of creative activities:</p>
<blockquote>
<p>To be a successful creator you don’t need millions. You don’t need millions of dollars or millions of customers, millions of clients or millions of fans. To make a living as a craftsperson, photographer, musician, designer, author, animator, app maker, entrepreneur, or inventor you need only thousands of true fans.</p>
</blockquote>
<blockquote>
<p>A true fan is defined as a fan that will buy anything you produce. These diehard fans will drive 200 miles to see you sing; they will buy the hardback and paperback and audible versions of your book; they will purchase your next figurine sight unseen; they will pay for the “best-of” DVD version of your free YouTube channel; they will come to your chef’s table once a month.</p>
</blockquote>
<p>Kelly’s vision was that the internet was the ultimate matchmaker, enabling 21st century patronage. Creators, no matter how seemingly niche, could now discover their true fans, who would in turn demonstrate their enthusiasm through direct financial support.</p>
<p>But the internet took a detour. Centralized social platforms became the dominant way for creators and fans to connect. The platforms used this power to become the new intermediaries — inserting ads and algorithmic recommendations between creators and users while keeping most of the revenue for themselves.</p>
<p>The good news is that the internet is trending back to Kelly’s vision. For example, many top writers on Substack earn far more than they did at salaried jobs. The economics of low take rates plus enthusiastic fandom does wonders. On Substack, 1,000 newsletter subscribers paying $10/month nets over $100K/year to the writer.</p>
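<p>The arithmetic, as a quick sketch (the roughly 10% platform take below is an assumption for illustration; actual fees and payment-processing costs vary):</p>
<pre><code># Rough subscription math for a newsletter writer.
subscribers = 1_000
price_per_month = 10   # dollars
platform_take = 0.10   # assumed ~10% platform fee

gross = subscribers * price_per_month * 12   # $120,000/year
net = gross * (1 - platform_take)            # $108,000/year
print(f"${net:,.0f}/year to the writer")
</code></pre>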
<p>Crypto, and specifically <a href="https://variant.mirror.xyz/T8kdtZRIgy_srXB5B06L8vBqFHYlEBcv6ae2zR6Y_eo">NFTs</a> (non-fungible tokens), can accelerate the trend of creators monetizing directly with their fans. Social platforms will continue to be useful for building audiences (although these too should probably be replaced with superior <a href="https://cdixon.org/2018/02/18/why-decentralization-matters">decentralized</a> alternatives), but creators can increasingly rely on other methods including NFTs and crypto-enabled economies to make money.</p>
<p>NFTs are blockchain-based records that uniquely represent pieces of media. The media can be anything digital, including art, videos, music, gifs, games, text, memes, and code. NFTs contain highly trustworthy documentation of their history and origin, and can have code attached to do almost anything programmers dream up (one popular feature is code that ensures that the original creator receives royalties from secondary sales). NFTs are secured by the same technology that enabled Bitcoin to be owned by hundreds of millions of people around the world and represent hundreds of billions of dollars of value.</p>
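<p>As a toy illustration of the royalty idea (a deliberately simplified model, not how any production NFT contract is implemented):</p>
<pre><code># Toy NFT registry: each token records its creator, and every
# resale routes a royalty back to that creator.
ROYALTY = 0.10  # assumed 10% royalty, for illustration

class ToyNFTRegistry:
    def __init__(self):
        self.owner = {}    # token_id -> current owner
        self.creator = {}  # token_id -> original creator

    def mint(self, token_id, creator):
        self.owner[token_id] = creator
        self.creator[token_id] = creator

    def sell(self, token_id, buyer, price):
        seller = self.owner[token_id]
        royalty = price * ROYALTY
        self.owner[token_id] = buyer
        print(f"{seller} receives ${price - royalty:,.0f}; "
              f"{self.creator[token_id]} receives ${royalty:,.0f} royalty")

registry = ToyNFTRegistry()
registry.mint("punk-1", "artist")
registry.sell("punk-1", "collector", 10_000)
</code></pre>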
<p>NFTs have received a lot of attention lately because of high sales volumes. In the past 30 days there has been over <a href="http://cryptoslam.io">$300M</a> in NFT sales:</p>
<p align="center"><img src="images/pic1.png"/></p>
<p>Crypto has a history of boom and bust cycles, and it’s very possible NFTs will have their own ups and downs.</p>
<p>That said, there are three important reasons why NFTs offer fundamentally better economics for creators. The first, already alluded to above, is that they remove rent-seeking intermediaries. The logic of blockchains is that once you purchase an NFT, it is yours to fully control, just like when you buy books or sneakers in the real world. There are and will continue to be NFT platforms and marketplaces, but they will be constrained in what they can charge because blockchain-based ownership shifts the power back to creators and users — you can shop around and force the marketplace to earn its fees. (Note that lowering the intermediary fees can have a multiplier effect on creator disposable income. For example, if a 50% take rate leaves you with $100K in revenue and you have $80K in costs, cutting out the take rate increases your revenue to $200K, multiplying your disposable income 6x, from $20K to $120K.)</p>
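<p>The same multiplier arithmetic as a quick sketch:</p>
<pre><code># Take-rate multiplier effect on disposable income, using the
# numbers from the example above.
gross_sales = 200_000
costs = 80_000

with_take = gross_sales * (1 - 0.50) - costs   # $20,000
without_take = gross_sales - costs             # $120,000
print(without_take / with_take)                # 6.0x
</code></pre>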
<p>The second way NFTs change creator economics is by enabling granular price tiering. In ad-based models, revenue is generated more or less uniformly regardless of the fan’s enthusiasm level. As with Substack, NFTs allow the creator to “cream skim” the most passionate users by offering them special items which cost more. But NFTs go farther than non-crypto products in that they are easily sliced and diced into a descending series of pricing tiers. NBA Top Shot cards range from over $100K to a few dollars. Fan of Bitcoin? You can buy as much or as little as you want, down to 8 decimal places, depending on your level of enthusiasm. Crypto’s fine-grained granularity lets creators capture a much larger area under the demand curve.</p>
<p align="center"><img src="images/pic2.png"/></p>
<p>The third and most important way NFTs change creator economics is by making users owners, thereby reducing customer acquisition costs to near zero. Open any tech S-1 filing and you’ll see massive user/customer acquisition costs, usually going to online ads or sales staff. Crypto, by contrast, has grown to over a trillion dollars in aggregate market capitalization with almost no marketing spend. Bitcoin and Ethereum don’t have organizations behind them let alone marketing budgets, yet are used, owned, and loved by tens of millions of people.</p>
<p>The highest revenue NFT project to date, <a href="https://www.nbatopshot.com/">NBA Top Shot</a>, has generated $200M in gross sales in just the past month while spending very little on marketing. It’s been able to grow so efficiently because users feel like owners — they have skin in the game. It’s true peer-to-peer marketing, fueled by community, <a href="https://twitter.com/ROSGO21/status/1364724500642689027?s=20">excitement</a>, and ownership.</p>
<p align="center"><img src="images/pic3.jpg"/></p>
<p>NFTs are still early, and will evolve. Their utility will increase as digital experiences are built around them, including marketplaces, social networks, showcases, games, and virtual worlds. It’s also likely that other consumer-facing crypto products will emerge that pair with NFTs. Modern video games like Fortnite contain sophisticated economies that mix fungible tokens like V-Bucks with NFTs/virtual goods like skins. Someday every internet community might have its own micro-economy, including NFTs and fungible tokens that users can use, own, and collect.</p>
<p>The thousand true fans thesis builds on the original ideals of the internet: users and creators globally connected, unconstrained by intermediaries, sharing ideas and economic upside. Incumbent social media platforms sidetracked this vision by locking creators into a bundle of distribution and monetization. There are, correspondingly, two ways to challenge them: take the users, or take the money. Crypto and NFTs give us a new way to take the money. Let’s make it happen.</p>
<p><em>(Image: CryptoPunks — Larva Labs)</em></p>
]]></content:encoded>
</item>
<item>
<title>Doing old things better vs doing brand new things</title>
<link>https://cdixon.org/2020/10/19/doing-old-things-better-vs-doing-brand-new-things/</link>
<guid>https://cdixon.org/2020/10/19/doing-old-things-better-vs-doing-brand-new-things/</guid>
<pubDate>Mon, 19 Oct 2020 00:00:00 GMT</pubDate>
<description>New technologies enable activities that fall into one of two categories: 1) doing things you could already do but can now do better because they are faster, cheaper, easier, higher ...</description>
<content:encoded><![CDATA[<p>New technologies enable activities that fall into one of two categories: 1) doing things you could already do but can now do better because they are faster, cheaper, easier, higher quality, etc. 2) doing brand new things that you simply couldn’t do before. Early in the development of new technologies, the first category tends to get more attention, but it’s the second that ends up having more impact on the world.</p>
<p>Doing old things better tends to get more attention early on because it’s easier to imagine what to build. Early films were shot like plays — they were effectively plays with a better distribution model — until filmmakers realized that movies had their own visual grammar. The early electrical grid delivered light better than gas and candles. It took decades before we got an electricity “app store” — a rich ecosystem of appliances that connected to the grid. The early web was mostly digital adaptations of pre-internet things like letter writing and mail-order commerce. It wasn’t until the 2000s that entrepreneurs started exploring “internet native” ideas like social networking, crowdfunding, cryptocurrency, crowdsourced knowledge bases, and so on.</p>
<p>The most common mistake people make when evaluating new technologies is to focus too much on the “doing old things better” category. For example, when evaluating the potential of blockchains, people sometimes focus on things like cheaper and faster global payments, which are important and necessary but only the beginning. What’s even more exciting are the new things you simply couldn’t create before, like internet services that are <a href="https://cdixon.org/2019/01/04/how-blockchain-can-wrest-the-internet-from-corporations">owned and operated by their users</a> instead of by companies. Another example is business productivity apps architected as web services. Early products like Salesforce were easier to access and cheaper to maintain than their on-premise counterparts. Modern productivity apps like Google Docs, Figma, and Slack focus on things you simply couldn’t do before, like real-time collaboration and deep integrations with other apps.</p>
<p>Entrepreneurs who create products in the “brand new things” category usually spend many years deeply immersed in the underlying technology before they have their key insights. The products they create often <a href="https://cdixon.org/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy">start out looking toy-like</a>, <a href="https://cdixon.org/2019/01/08/strong-and-weak-technologies">strange, unserious, expensive</a>, and sometimes even dangerous. Over time, the products steadily improve and the world gradually embraces them.</p>
<p>It can take decades for this process to play out. It’s clear that we are early in the development of emerging technologies like cryptocurrencies, machine learning, and virtual reality. It is also possible we are still early in the development of more established technologies like mobile devices, cloud hosting, social networks, and perhaps even the internet itself. If so, new categories of native products built on top of these technologies will continue to be invented in the coming years.</p>
]]></content:encoded>
</item>
<item>
<title>Computers that can make commitments</title>
<link>https://cdixon.org/2020/01/26/computers-that-can-make-commitments/</link>
<guid>https://cdixon.org/2020/01/26/computers-that-can-make-commitments/</guid>
<pubDate>Sun, 26 Jan 2020 00:00:00 GMT</pubDate>
<description>Blockchains are computers that can make commitments. Traditional computers are ultimately controlled by people, either directly in the case of personal computers or indirectly through organizations. Blockchains invert this power ...</description>
<content:encoded><![CDATA[<p>Blockchains are computers that can make commitments. Traditional computers are ultimately controlled by people, either directly in the case of personal computers or indirectly through organizations. Blockchains invert this power relationship, putting the code in charge. A game theoretic mechanism — a so-called consensus mechanism — makes blockchains resilient to modifications to their underlying physical components, effectively making them resilient to human intervention.</p>
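<p>A toy sketch of one such mechanism, proof-of-work (deliberately simplified; real blockchains add difficulty adjustment, block headers, and peer-to-peer validation). Finding a valid nonce is expensive, so rewriting any past block means redoing that work for it and every block after it:</p>
<pre><code>import hashlib

def mine(block_data, difficulty=4):
    # Search for a nonce whose hash starts with `difficulty` zeros.
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

print(mine("pay Alice 5 coins, prev=000abc"))
</code></pre>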
<p>As a result, a properly designed blockchain provides strong guarantees that the code it runs will continue to operate as designed. For the first time, a computer system can be truly autonomous: self-governed, by its own code, instead of by people. Autonomous computers can be relied on and trusted in ways that human-governed computers can’t.</p>
<p>Computers that make commitments can be useful in finance. The most famous example of this is Bitcoin, which makes various commitments, including that there will never be more than 21 million bitcoins, a commitment that makes bitcoins scarce and therefore capable of being valuable. Without a blockchain, this commitment could have been made by a person or a business, but it is unlikely that other people would have really trusted that commitment, since people and businesses change their minds all the time. Prior to Bitcoin, besides precious metals which are naturally scarce, the only credible commitments to monetary scarcity came from governments.</p>
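<p>Bitcoin’s cap is not a promise written in a policy document; it falls out of the issuance schedule encoded in the software. A back-of-the-envelope check, using the protocol’s published parameters (50 BTC initial block subsidy, halved every 210,000 blocks, tracked in integer satoshis):</p>
<pre><code># Summing Bitcoin's geometric issuance schedule.
subsidy = 50 * 100_000_000   # initial block subsidy, in satoshis
total = 0
while subsidy > 0:
    total += 210_000 * subsidy
    subsidy //= 2            # the "halving"
print(total / 100_000_000)   # about 20,999,999.98 BTC, just under 21M
</code></pre>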
<p>Ethereum was the first blockchain to support a general-purpose programming language, allowing for the creation of arbitrarily complex software that makes commitments. Two early applications built on Ethereum are <a href="https://compound.finance/">Compound</a> and <a href="https://makerdao.com/en/">Maker Dao</a>. Compound makes the commitment that it will act as a neutral, low-fee lending protocol. Maker Dao makes a commitment to maintain the price stability of a currency called Dai that can be used for stable payments and as a store of value. As of today, users have locked up hundreds of millions of dollars in these applications, a testament to the credibility of their commitments.</p>
<p>Applications like Compound and Maker can do things that pre-blockchain software simply couldn’t, such as hold funds that reside in the code itself, as opposed to traditional payment systems which only hold pointers to offline bank accounts. This removes the need to trust anything other than code, and makes the system end-to-end transparent and extensible. Blockchain applications do this autonomously — every human involved in creating these projects could disappear and the software would go on doing what it does, keeping its commitments, indefinitely.</p>
<p>What else can you do with computers that make commitments? One fertile area being explored is re-architecting popular internet services like social networks and marketplaces so that they make strong, positive commitments to their communities. For example, users can get commitments baked into the code that their data will be kept private and that they won’t get de-platformed without due process. Third-party developers can safely invest in their businesses knowing that the rules are baked into the network and can’t change, protecting them from <a href="https://cdixon.org/2018/02/18/why-decentralization-matters">platform risk</a>. Using the financial features of blockchains, users and developers can receive tokens in order to participate in the upside of the network as it grows.</p>
<p>Blockchains have arrived at an opportune time. Internet services have become central to our economic, political, and cultural lives, yet the trust between users and the people who run these services is breaking down. At the same time, industries like finance that have traditionally depended on trust have resisted modernization. The next few years will be exciting — we are only beginning to explore the <a href="https://cdixon.org/2013/08/04/the-idea-maze">idea maze</a> unlocked by this new kind of computer.</p>
]]></content:encoded>
</item>
<item>
<title>Inside-out vs. outside-in: the adoption of new technologies</title>
<link>https://cdixon.org/2020/01/17/inside-out-vs-outside-in/</link>
<guid>https://cdixon.org/2020/01/17/inside-out-vs-outside-in/</guid>
<pubDate>Fri, 17 Jan 2020 00:00:00 GMT</pubDate>
<description>There are broadly two adoption paths for new computing technologies: inside-out and outside-in. Inside-out technologies are pioneered by established institutions and later proliferate outward to the mainstream. Apple (followed by ...</description>
<content:encoded><![CDATA[<p>There are broadly two adoption paths for new computing technologies: inside-out and outside-in. Inside-out technologies are pioneered by established institutions and later proliferate outward to the mainstream. Apple (followed by Google and others) pioneered the modern touchscreen smartphone, university and corporate research labs pioneered machine learning, and big tech companies like Amazon pioneered cloud computing.</p>
<p>Outside-in technologies, by contrast, start out on the fringes and only later move inward to established institutions. Open-source software started out as a niche anti-copyright movement. The web was invented at a physics lab and then built out by hobbyists and entrepreneurs. Social media began as a movement of idealistic blogging enthusiasts.</p>
<p>Inside-out technologies tend to require significant capital and formally trained technical expertise. They also tend to be technologies that most people would recognize as valuable even before they exist. It wasn’t very hard to imagine that affordable, easy-to-use, internet-connected pocket supercomputers would be popular, or that machines that could learn to behave intelligently could do all sorts of useful tasks.</p>
<p>Outside-in technologies tend to require less capital and less formally trained technical skills, creating a much more level playing field between insiders and outsiders. In many cases the value of outside-in technologies is not only unclear before they’re invented, but remains unclear for many years after they’re invented.</p>
<p>Take the example of social media. Early on, blogs and services like Twitter were mostly used to discuss niche tech topics and share mundane personal events. This led many sophisticated observers to <a href="https://www.nytimes.com/2007/04/22/business/yourmoney/22stream.html">dismiss</a> them as toys or passing fads. At its core, however, social media was about the creation of curated information networks. Today, this is easy to see – billions of people rely on services like Twitter and Facebook for their news – but back then you had to cut through the noise generated by the eccentricities of early adopters. Social media is a technology for creating global media networks that arrived disguised as a way to share what you had for lunch.</p>
<p>Both inside-out and outside-in technologies are important, and in fact they’re often mutually reinforcing. Mobile, social, and cloud powered the growth of computing over the last decade: mobile (inside-out) brought computers to billions of people, social (outside-in) drove usage and monetization, and cloud (inside-out) allowed back-end services to scale. Most likely the next major wave in computing will also be driven by a mutually reinforcing combination of technologies, some developed at established institutions and some developed by enthusiastic and possibly misunderstood outsiders.</p>
]]></content:encoded>
</item>
<item>
<title>Strong and weak technologies</title>
<link>https://cdixon.org/2019/01/08/strong-and-weak-technologies/</link>
<guid>https://cdixon.org/2019/01/08/strong-and-weak-technologies/</guid>
<pubDate>Tue, 08 Jan 2019 00:00:00 GMT</pubDate>
<description>During a media tour in 2007 in which Steve Jobs showed the device to reporters, there was one instance in which a journalist criticized the iPhone’s touch-screen keyboard. “It doesn’t ...</description>
<content:encoded><![CDATA[<blockquote>
<p><em>During a <a href="https://www.businessinsider.com/steve-jobs-reaction-first-iphone-2015-9">media tour</a> in 2007 in which Steve Jobs showed the device to reporters, there was one instance in which a journalist criticized the iPhone’s touch-screen keyboard.</em></p>
<p><em>“It doesn’t work,” the reporter said.</em></p>
<p><em>Jobs stopped for a moment and tilted his head. The reporter said he or she kept making typos and the keys were too small for his or her thumbs.</em></p>
<p><em>Jobs smiled and then replied: “Your thumbs will learn.”</em></p>
</blockquote>
<p>When the iPhone was introduced in 2007, it <a href="https://www.wsj.com/articles/behind-the-rise-and-fall-of-blackberry-1432311912">mystified</a> its competitors, because it wasn’t built for the world as it existed. Wireless networks were too slow. Smartphone users only knew how to use physical keyboards. There were no software developers making apps for touchscreen phones. It frequently dropped phone calls.</p>
<p>But the iPhone was such a remarkable device — fans called it “The Jesus Phone” — that the world adapted to it. Carriers built more wireless capacity. Developers invented new apps and interfaces. Users learned how to rapidly type on touchscreens. Apple kept releasing better versions, fixing problems and adding new capabilities.</p>
<p>Smartphones are a good example of a broader historical pattern: technologies usually arrive in pairs, a strong form and a weak form. Here are some examples:</p>
<table class="comparison-table">
<thead>
<tr><th>Strong</th><th>Weak</th></tr>
</thead>
<tbody>
<tr><td>Public internet</td><td>Private intranets</td></tr>
<tr><td>Consumer web</td><td>Interactive TV</td></tr>
<tr><td>Crowdsourced encyclopedia (Wikipedia)</td><td>Expert-curated encyclopedia (e.g. Nupedia, Encarta)</td></tr>
<tr><td>Crowdsourced video (YouTube)</td><td>Video tech for media companies (e.g. RealPlayer)</td></tr>
<tr><td>Internet video chat (Skype)</td><td>Voice-over-IP (e.g. Vonage)</td></tr>
<tr><td>Streaming music (Spotify)</td><td>MP3 downloads (e.g. iTunes)</td></tr>
<tr><td>Touchscreen smartphones with full operating system and app store (iPhone)</td><td>Limited-app smartphones with physical keyboards (e.g. Blackberry)</td></tr>
<tr><td>Fully electric cars (Tesla)</td><td>Hybrid cars</td></tr>
<tr><td>Permissionless blockchains powered by cryptocurrencies</td><td>Permissioned/private blockchains</td></tr>
<tr><td>Public cloud</td><td>Private / hybrid cloud</td></tr>
<tr><td>App-based media companies (e.g. Netflix)</td><td>Video on demand delivered by cable companies</td></tr>
<tr><td>Virtual reality</td><td>Augmented reality</td></tr>
<tr><td>E-sports</td><td>Traditional sports delivered over the internet</td></tr>
</tbody>
</table>
<p>Strong technologies capture the imaginations of technology enthusiasts. That is why many important technologies start out as weekend hobbies. <a href="https://cdixon.org/2013/03/03/what-the-smartest-people-do-on-the-weekend-is-what-everyone-else-will-do-during-the-week-in-ten-years/">Enthusiasts vote with their time</a>, and, unlike most of the business world, have long-term horizons. They build from first principles, making full use of the available resources to design technologies as they ought to exist. Sometimes these enthusiasts run large companies, in which case they are often, like Steve Jobs, founders who have the gravitas and vision to make big, long-term bets.</p>
<p>The mainstream technology world notices the excitement and wants to join in, but isn’t willing to go all the way and embrace the strong technology. To them, the strong technology appears to be some combination of strange, <a href="https://cdixon.org/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy/">toy-like</a>, unserious, expensive, and sometimes even dangerous. So they embrace the weak form, a compromised version that seems more familiar, productive, serious, and safe.</p>
<p>Strong technologies often develop according to the Perez/Gartner hype cycle:</p>
<p><img src="images/researchmethodology-illustration-hype-cycle.jpg" alt=""></p>
<p>During the trough of disillusionment, entrepreneurs and others who invested in strong technologies sometimes lose faith and switch their focus to weak technologies, because the weak technologies appear nearer to mainstream adoption. This is usually a mistake.</p>
<p>That said, weak forms of technology can be successful. For example, it is very likely that augmented reality will be important, watching traditional sports on the internet will be popular, and so on.</p>
<p>But it’s strong technologies that end up defining new eras. What George Bernard Shaw said about people also applies to technologies:</p>
<blockquote>
<p>The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.</p>
</blockquote>
<p>Weak technologies adapt to the world as it currently exists. Strong technologies adapt the world to themselves. Progress depends on strong technologies. Your thumbs will learn.</p>
]]></content:encoded>
</item>
<item>
<title>Who will control the software that powers the Internet?</title>
<link>https://cdixon.org/2019/01/04/how-blockchain-can-wrest-the-internet-from-corporations/</link>
<guid>https://cdixon.org/2019/01/04/how-blockchain-can-wrest-the-internet-from-corporations/</guid>
<pubDate>Fri, 04 Jan 2019 00:00:00 GMT</pubDate>
<description>Originally published by Wired. As the internet has evolved over its 35-year lifespan, control over its most important services has gradually shifted from open source protocols maintained by non-profit communities ...</description>
<content:encoded><![CDATA[<p><em>Originally published by <a href="https://www.wired.com/story/how-blockchain-can-wrest-the-internet-from-corporations/">Wired</a>.</em></p>
<p>As the internet has evolved over its 35-year lifespan, control over its most important services has gradually shifted from open source protocols maintained by non-profit communities to proprietary services operated by large tech companies. As a result, billions of people got access to amazing, free technologies. But that shift also created serious problems.</p>
<p>Millions of users have had their private data misused or stolen. Creators and businesses that rely on internet platforms are subject to sudden rule changes that take away their audiences and profits. But there is a growing movement—emerging from the blockchain and cryptocurrency world—to build new internet services that combine the power of modern, centralized services with the community-led ethos of the original internet. We should embrace it.</p>
<p>From the 1980s through the early 2000s, the dominant internet services were built on open protocols that the internet community controlled. For example, the Domain Name System, the internet’s “phone book,” is controlled by a distributed network of people and organizations, using rules that are created and administered in the open. This means that anyone who adheres to community standards can own a domain name and establish an internet presence. It also means that the power of companies operating web and email hosting is kept in check—if they misbehave, customers can port their domain names to competing providers.</p>
<p>From the mid 2000s to the present, trust in open protocols was replaced by trust in corporate management teams. As companies like Google, Twitter, and Facebook built software and services that surpassed the capabilities of open protocols, users migrated to these more sophisticated platforms. But their code was proprietary, and their governing principles could change on a whim.</p>
<p>How do social networks decide which users to <a href="https://www.wired.com/story/how-right-wing-social-media-site-gab-got-back-online/">verify</a> or <a href="https://www.wired.com/story/tumblrs-porn-ban-reveals-controls-we-see-online/">ban</a>? How do search engines decide how to rank websites? One minute social networks court media organizations and small businesses, the next minute they de-prioritize their content or change the revenue split. The power of these platforms has created widespread societal tensions, as seen in debates over fake news, state-sponsored bots, privacy laws, and algorithmic biases.</p>
<p>That’s why the pendulum is swinging back to an internet governed by open, community-controlled services. This has only recently become possible, thanks to technologies arising from the blockchain and cryptocurrencies.</p>
<p>There has been a lot of talk in the past few years about blockchains, which are heavily hyped but poorly understood. Blockchains are networks of physical computers that work together in concert to form a single virtual computer. The benefit is that, unlike a traditional computer, a blockchain computer can offer strong trust guarantees, rooted in the mathematical and game-theoretic properties of the system. A user or developer can trust that a piece of code running on a blockchain computer will continue to behave as designed, even if individual participants in the network change their motivations or try to subvert the system. This means that the control of a blockchain computer can be placed in the hands of a community.</p>
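<p>A minimal sketch of the data structure underneath (illustrative only): each block commits to the hash of the one before it, so altering any historical record invalidates every hash that follows.</p>
<pre><code>import hashlib

def block_hash(data, prev_hash):
    return hashlib.sha256(f"{prev_hash}:{data}".encode()).hexdigest()

chain, prev = [], "genesis"
for tx in ["A pays B 10", "B pays C 4", "C pays D 1"]:
    prev = block_hash(tx, prev)
    chain.append(prev)

# Tampering with the first record no longer matches the stored chain.
print(block_hash("A pays B 99", "genesis") == chain[0])  # False
</code></pre>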
<p>Users who depend on proprietary platforms, on the other hand, have to worry about data getting stolen or misused, privacy policies changing, intrusive advertising, and more. Proprietary platforms may suddenly change the rules for developers and businesses, the way Facebook <a href="https://venturebeat.com/2016/06/30/facebook-kicked-zynga-to-the-curb-publishers-are-next/">famously did to Zynga</a> and Google <a href="https://www.nytimes.com/2017/07/01/technology/yelp-google-european-union-antitrust.html">did to Yelp</a>.</p>
<p>The idea that corporate-owned services could be replaced by community-owned services may sound far-fetched, but there is a strong historical precedent in the transformation of software over the past twenty years. In the 1990s, computing was dominated by proprietary, closed-source software, most notably Windows. Today, billions of Android phones run on the open source operating system Linux. Much of the software running on an Apple device is open source, as is almost all of the software running modern cloud data centers, including Amazon’s. The recent acquisitions of <a href="https://www.wired.com/story/microsofts-github-deal-is-its-latest-shift-from-windows/">Github by Microsoft</a> and <a href="https://www.wired.com/story/ibm-buying-open-source-specialist-red-hat-34-billion/">Red Hat by IBM</a> underscore how dominant open source has become.</p>
<p>As open source has grown in importance, technology companies have shifted their business models from selling software to delivering cloud-based services. Google, Facebook, Amazon, and Netflix are all services companies. Even Microsoft is now primarily a services company. This has allowed these companies to outpace the growth of open source software and maintain control of critical internet infrastructure.</p>
<p>A core insight in the design of blockchains is that the open source model can be extended beyond software to cloud-based services by adding financial incentives to the mix. Cryptocurrencies—coins and tokens built into specific blockchains—provide a way to incentivize individuals and groups to participate in, maintain, and build services.</p>
<p>The idea that an internet service could have an associated coin or token may be a novel concept, but the blockchain and cryptocurrencies can do for cloud-based services what open source did for software. It took twenty years for open source software to supplant proprietary software, and it could take just as long for open services to supplant proprietary services. But the benefits of such a shift will be immense. Instead of placing our trust in corporations, we can place our trust in community-owned and -operated software, transforming the internet’s governing principle from “don’t be evil” back to “can’t be evil.”</p>
]]></content:encoded>
</item>
<item>
<title>Why decentralization matters</title>
<link>https://cdixon.org/2018/02/18/why-decentralization-matters/</link>
<guid>https://cdixon.org/2018/02/18/why-decentralization-matters/</guid>
<pubDate>Sun, 18 Feb 2018 00:00:00 GMT</pubDate>
<description>The first two eras of the internet During the first era of the internet — from the 1980s through the early 2000s — internet services were built on open protocols ...</description>
<content:encoded><![CDATA[<h2>The first two eras of the internet</h2>
<p>During the first era of the internet — from the 1980s through the early 2000s — internet services were built on open protocols that were controlled by the internet community. This meant that people or organizations could grow their internet presence knowing the rules of the game wouldn’t change later on. Huge web properties were started during this era including Yahoo, Google, Amazon, Facebook, LinkedIn, and YouTube. In the process, the importance of centralized platforms like AOL greatly diminished.</p>
<p>During the second era of the internet, from the mid 2000s to the present, for-profit tech companies — most notably Google, Apple, Facebook, and Amazon (GAFA) — built software and services that rapidly outpaced the capabilities of open protocols. The explosive growth of smartphones accelerated this trend as mobile apps became the majority of internet use. Eventually users migrated from open services to these more sophisticated, centralized services. Even when users still accessed open protocols like the web, they would typically do so mediated by GAFA software and services.</p>
<p>The good news is that billions of people got access to amazing technologies, many of which were free to use. The bad news is that it became much harder for startups, creators, and other groups to grow their internet presence without worrying about centralized platforms changing the rules on them, taking away their audiences and profits. This in turn stifled innovation, making the internet less interesting and dynamic. Centralization has also created broader societal tensions, which we see in the debates over subjects like fake news, state sponsored bots, “no platforming” of users, EU privacy laws, and algorithmic biases. These debates will only intensify in the coming years.</p>
<h2>“Web 3”: the third era of the internet</h2>
<p>One response to this centralization is to impose government regulation on large internet companies. This response assumes that the internet is similar to past communication networks like the phone, radio, and TV networks. But the hardware-based networks of the past are fundamentally different than the internet, a software-based network. Once hardware-based networks are built, they are nearly impossible to rearchitect. Software-based networks can be rearchitected through entrepreneurial innovation and market forces.</p>
<p>The internet is the ultimate software-based network, consisting of a relatively simple <a href="https://en.wikipedia.org/wiki/Internet_Protocol">core layer</a> connecting billions of fully programmable computers at the edge. Software is simply the encoding of human thought, and as such has an almost unbounded design space. Computers connected to the internet are, by and large, free to run whatever software their owners choose. Whatever can be dreamt up, with the right set of incentives, can quickly propagate across the internet. Internet architecture is where technical creativity and incentive design intersect.</p>
<p>The internet is still early in its evolution: the core internet services will likely be almost entirely rearchitected in the coming decades. This will be enabled by crypto-economic networks, a generalization of the ideas first introduced in <a href="https://bitcoin.org/bitcoin.pdf">Bitcoin</a> and further developed in <a href="https://github.com/ethereum/wiki/wiki/White-Paper">Ethereum</a>. Cryptonetworks combine the best features of the first two internet eras: community-governed, decentralized networks with capabilities that will eventually exceed those of the most advanced centralized services.</p>
<h2>Why decentralization?</h2>
<p>Decentralization is a commonly misunderstood concept. For example, it is sometimes said that the reason cryptonetwork advocates favor decentralization is to resist government censorship, or because of libertarian political views. These are not the main reasons decentralization is important.</p>
<p>Let’s look at the problems with centralized platforms. Centralized platforms follow a predictable life cycle. When they start out, they do everything they can to recruit users and 3rd-party complements like developers, businesses, and media organizations. They do this to make their services more valuable, as platforms (by definition) are systems with multi-sided network effects. As platforms move up the adoption S-curve, their power over users and 3rd parties steadily grows.</p>
<p><img src="images/07lrwGIDbAYk6q7zG.png" alt=""></p>
<p>When they hit the top of the S-curve, their relationships with network participants change from positive-sum to zero-sum. The easiest way to continue growing lies in extracting data from users and competing with complements over audiences and profits. Historical examples of this are Microsoft vs. Netscape, Google vs. Yelp, Facebook vs. Zynga, and Twitter vs. its 3rd-party clients. Operating systems like iOS and Android have behaved better, although they still take a healthy 30% tax, reject apps for seemingly arbitrary reasons, and subsume the functionality of 3rd-party apps at will.</p>
<p>For 3rd parties, this transition from cooperation to competition feels like a bait-and-switch. Over time, the best entrepreneurs, developers, and investors have become wary of building on top of centralized platforms. We now have decades of evidence that doing so will end in disappointment. In addition, users give up privacy, control of their data, and become vulnerable to security breaches. These problems with centralized platforms will likely become even more pronounced in the future.</p>
<h2>Enter cryptonetworks</h2>
<p>Cryptonetworks are networks built on top of the internet that 1) use consensus mechanisms such as blockchains to maintain and update state, and 2) use cryptocurrencies (coins/tokens) to incentivize consensus participants (miners/validators) and other network participants. Some cryptonetworks, such as Ethereum, are general programming platforms that can be used for almost any purpose. Other cryptonetworks are special purpose: for example, Bitcoin is intended primarily for storing value, <a href="https://golem.network/">Golem</a> for performing computations, and <a href="https://filecoin.io/">Filecoin</a> for decentralized file storage.</p>
<p>Early internet protocols were technical specifications created by working groups or non-profit organizations that relied on the alignment of interests in the internet community to gain adoption. This method worked well during the very early stages of the internet, but since the early 1990s very few new protocols have gained widespread adoption. <a href="2017/05/27/crypto-tokens-a-breakthrough-in-open-network-design">Cryptonetworks fix</a> these problems by providing economic incentives to developers, maintainers, and other network participants in the form of tokens. They are also much more technically robust. For example, they are able to keep state and do arbitrary transformations on that state, something past protocols could never do.</p>
<p>Cryptonetworks use multiple mechanisms to ensure that they stay neutral as they grow, preventing the bait-and-switch of centralized platforms. First, the contract between cryptonetworks and their participants is enforced in open source code. Second, they are kept in check through mechanisms for <a href="https://en.wikipedia.org/wiki/Exit,_Voice,_and_Loyalty">“voice” and “exit.”</a> Participants are given voice through community governance, both “on chain” (via the protocol) and “off chain” (via the social structures around the protocol). Participants can exit either by leaving the network and selling their coins, or in the extreme case by forking the protocol.</p>
<p>In short, cryptonetworks align network participants to work together toward a common goal — the growth of the network and the appreciation of the token. This alignment is one of the main reasons Bitcoin continues to defy skeptics and flourish, even while new cryptonetworks like Ethereum have grown alongside it.</p>
<p>Today’s cryptonetworks suffer from limitations that keep them from seriously challenging centralized incumbents. The most severe limitations are around performance and scalability. The next few years will be about fixing these limitations and building networks that form the infrastructure layer of the crypto stack. After that, most of the energy will turn to building applications on top of that infrastructure.</p>
<h2>How decentralization wins</h2>
<p>It’s one thing to say decentralized networks should win, and another thing to say they will win. Let’s look at specific reasons to be optimistic about this.</p>
<p>Software and web services are built by developers. There are millions of highly skilled developers in the world. Only a small fraction work at large technology companies, and only a small fraction of those work on new product development. Many of the most important software projects in history were created by startups or by communities of independent developers.</p>
<blockquote>
<p>“No matter who you are, most of the smartest people work for someone else.” — <a href="https://en.wikipedia.org/wiki/Joy%27s_law_(management)">Bill Joy</a></p>
</blockquote>
<p>Decentralized networks can win the third era of the internet for the same reason they won the first era: by winning the hearts and minds of entrepreneurs and developers.</p>
<p>An illustrative analogy is the rivalry in the 2000s between Wikipedia and its centralized competitors like Encarta. If you compared the two products in the early 2000s, Encarta was a far better product, with better topic coverage and higher accuracy. But Wikipedia improved at a much faster rate, because it had an active community of volunteer contributors who were attracted to its decentralized, community-governed ethos. By 2005, Wikipedia was the most <a href="https://medium.com/@cdixon/it-s-hard-to-believe-today-but-10-years-ago-wikipedia-was-widely-considered-a-doomed-experiment-a7a0dfd27b8b">popular</a> reference site on the internet. Encarta was shut down in 2009.</p>
<p>The lesson is that when you compare centralized and decentralized systems you need to consider them dynamically, as processes, instead of statically, as rigid products. Centralized systems often start out fully baked, but only get better at the rate at which employees at the sponsoring company improve them. Decentralized systems start out half-baked but, under the right conditions, grow exponentially as they attract new contributors.</p>
<p>In the case of cryptonetworks, there are multiple, compounding feedback loops involving developers of the core protocol, developers of complementary cryptonetworks, developers of 3rd party applications, and service providers who operate the network. These feedback loops are further amplified by the incentives of the associated token, which — as we’ve seen with Bitcoin and Ethereum — can supercharge the rate at which crypto communities develop (and sometimes lead to negative outcomes, as with the excessive electricity consumed by Bitcoin mining).</p>
<p>The question of whether decentralized or centralized systems will win the next era of the internet reduces to who will build the most compelling products, which in turn reduces to who will get more high quality developers and entrepreneurs on their side. GAFA has many advantages, including cash reserves, large user bases, and operational infrastructure. Cryptonetworks have a significantly more attractive value proposition to developers and entrepreneurs. If they can win their hearts and minds, they can mobilize far more resources than GAFA, and rapidly outpace their product development.</p>
<blockquote>
<p>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.” — <a href="http://farmerandfarmer.org/mastery/builder.html">Farmer & Farmer</a></p>
</blockquote>
<p>Centralized platforms often come bundled at launch with compelling apps: Facebook had its core socializing features and the iPhone had a number of key apps. Decentralized platforms, by contrast, often launch half-baked and without clear use cases. As a result, they need to go through two phases of product-market fit: 1) product-market fit between the platform and the developers/entrepreneurs who will finish the platform and build out the ecosystem, and 2) product-market fit between the platform/ecosystem and end users. This two-stage process is what causes many people — including sophisticated technologists — to consistently underestimate the potential of decentralized platforms.</p>
<h2>The next era of the internet</h2>
<p>Decentralized networks aren’t a silver bullet that will fix all the problems on the internet. But they offer a much better approach than centralized systems.</p>
<p>Compare the problem of Twitter spam to the problem of email spam. Since Twitter <a href="https://www.theverge.com/2012/8/23/3263481/twitter-api-third-party-developers">closed</a> their network to 3rd-party developers, the only company working on Twitter spam has been Twitter itself. By contrast, there were hundreds of companies that tried to fight email spam, financed by billions of dollars in venture capital and corporate funding. Email spam isn’t solved, but it’s a lot better now, because 3rd parties knew that the <a href="https://en.wikipedia.org/wiki/Simple_Mail_Transfer_Protocol">email protocol</a> was decentralized, so they could build businesses on top of it without worrying about the rules of the game changing later on.</p>
<p>Or consider the problem of network governance. Today, unaccountable groups of employees at large platforms decide how information gets ranked and filtered, which users get promoted and which get banned, and other important governance decisions. In cryptonetworks, these decisions are made by the community, using open and transparent mechanisms. As we know from the offline world, democratic systems aren’t perfect, but they are a lot better than the alternatives.</p>
<p>Centralized platforms have been dominant for so long that many people have forgotten there is a better way to build internet services. Cryptonetworks are a powerful way to develop community-owned networks and provide a level playing field for 3rd-party developers, creators, and businesses. We saw the value of decentralized systems in the first era of the internet. Hopefully we’ll get to see it again in the next.</p>
<p><em>Originally published on <a href="https://medium.com/s/story/why-decentralization-matters-5e3f79f7638e">Medium</a>.</em></p>
]]></content:encoded>
</item>
<item>
<title>Tokens: A Breakthrough in Open Network Design</title>
<link>https://cdixon.org/2017/05/27/crypto-tokens-a-breakthrough-in-open-network-design/</link>
<guid>https://cdixon.org/2017/05/27/crypto-tokens-a-breakthrough-in-open-network-design/</guid>
<pubDate>Sat, 27 May 2017 00:00:00 GMT</pubDate>
<description>It is a wonderful accident of history that the internet and web were created as open platforms that anyone — users, developers, organizations — could access equally. Among other things, ...</description>
<content:encoded><![CDATA[<p>It is a wonderful accident of history that the internet and web were created as open platforms that anyone — users, developers, organizations — could access equally. Among other things, this allowed independent developers to build products that quickly gained widespread adoption. Google started in a Menlo Park garage and Facebook started in a Harvard dorm room. They competed on a level playing field because they were built on decentralized networks governed by open protocols.</p>
<p>Today, tech companies like Facebook, Google, Amazon, and Apple are <a href="https://medium.com/@cdixon/the-internet-economy-fc43f3eff58a">stronger</a> than ever, whether measured by <a href="http://www.visualcapitalist.com/chart-largest-companies-market-cap-15-years/">market cap</a>, share of top mobile apps, or pretty much any other common measure.</p>
<p><img src="images/11LduvqPVCAVsy-rQ2qlhvg.png" alt="Big 4 tech companies dominate smartphone apps (source); while their market caps continue to rise (source)"></p>
<p>These companies also control massive proprietary developer platforms. The dominant operating systems — iOS and Android — charge 30% payment fees and exert heavy influence over app distribution. The dominant social networks tightly restrict access, hindering the ability of third-party developers to scale. Startups and independent developers are increasingly competing from a disadvantaged position.</p>
<p>A potential way to reverse this trend is <a href="http://continuations.com/post/148098927445/crypto-tokens-and-the-coming-age-of-protocol">crypto tokens</a> — a new way to design open networks that arose from the cryptocurrency movement that began with the introduction of Bitcoin in 2008 and accelerated with the introduction of Ethereum in 2014. Tokens are a breakthrough in open network design that enable: 1) the creation of open, decentralized networks that combine the best architectural properties of open and proprietary networks, and 2) new ways to incentivize open network participants, including users, developers, investors, and service providers. By enabling the development of new open networks, tokens could help reverse the centralization of the internet, thereby keeping it accessible, vibrant and fair, and resulting in greater innovation.</p>
<h2>Crypto tokens: unbundling Bitcoin</h2>
<p>Bitcoin was introduced in 2008 with the publication of <a href="https://en.wikipedia.org/wiki/Satoshi_Nakamoto">Satoshi Nakamoto’s</a> landmark <a href="https://bitcoin.org/bitcoin.pdf">paper</a> that proposed a novel, decentralized payment system built on an underlying technology now known as a <a href="https://en.wikipedia.org/wiki/Blockchain">blockchain</a>. Most fans of Bitcoin (including <a href="/2013/12/31/why-im-interested-in-bitcoin/">me</a>) mistakenly thought Bitcoin was solely a breakthrough in financial technology. (It was easy to make this mistake: Nakamoto himself called it a “p2p payment system.”)</p>
<p><img src="images/1MQ68XZTGHQG7E6ut5UimEw.jpeg" alt="2009: Satoshi Nakamoto’s (post) announcing Bitcoin"></p>
<p>In retrospect, Bitcoin was really two innovations: 1) a <a href="https://en.wikipedia.org/wiki/Store_of_value">store of value</a> for people who wanted an alternative to the existing financial system, and 2) a new way to develop open networks. Tokens unbundle the latter innovation from the former, providing a general method for designing and growing open networks.</p>
<p>Networks — computing networks, developer platforms, marketplaces, social networks, etc. — have always been a powerful part of the promise of the internet. Tens of thousands of networks have been incubated by developers and entrepreneurs, yet only a very small percentage have survived, and most of the survivors are owned and controlled by private companies. The current state of the art of network development is very crude. It often involves raising money (venture capital is a common source of funding) and then spending it on paid marketing and other channels to overcome the “bootstrap problem” — the problem that networks tend to become useful only when they reach a critical mass of users. In the rare cases where networks succeed, the financial returns tend to accrue to the relatively small number of people who own equity in the network. Tokens offer a better way.</p>
<p>Ethereum, introduced in 2014 and launched in 2015, was the first major non-Bitcoin token network. The lead developer, <a href="https://a16z.com/2016/08/28/ethereum/">Vitalik Buterin</a>, had previously tried to create smart contract languages on top of the Bitcoin blockchain. Eventually he realized that Bitcoin was, largely by design, too limited for this, and that a new approach was needed.</p>
<p><img src="images/1Crmcqo6mdF1okzHt4Bdp4g.png" alt="2014: Vitalik Buterin’s (forum post) announcing Ethereum"></p>
<p>Ethereum is a network that allows developers to run “smart contracts” — snippets of <a href="https://en.wikipedia.org/wiki/Ethereum#Smart_contracts">code</a> submitted by developers that are executed by a distributed network of computers. Ethereum has a corresponding token called Ether that can be purchased, either to hold for financial purposes or to use by purchasing computing power (known as “<a href="https://ethereum.stackexchange.com/questions/3/what-is-gas-and-transaction-fee-in-ethereum">gas</a>”) on the network. Tokens are also given out to “miners,” which are the computers on the decentralized network that execute smart contract code (you can think of miners as playing the role of cloud hosting services like <a href="https://en.wikipedia.org/wiki/Amazon_Web_Services">AWS</a>). Third-party developers can write their own <a href="https://dapps.ethercasts.com/">applications</a> that live on the network, and can charge Ether to generate revenue.</p>
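<p>To make the gas mechanics concrete, here is a toy model in Python. It is a sketch of the general idea only: the function names, the one-gas-per-step pricing, and the numbers are all made up for illustration, and real Ethereum execution is far more involved.</p>
<pre><code>class OutOfGas(Exception):
    pass

def run_contract(steps, gas_limit, gas_price):
    """Run a list of operations, charging one unit of gas per step.
    Returns the fee (in tokens) owed to the miner that did the work."""
    gas_used = 0
    for step in steps:
        if gas_used >= gas_limit:
            raise OutOfGas("computation exceeded the gas the caller paid for")
        step()
        gas_used += 1
    return gas_used * gas_price    # fee paid in the network's token

results = []
fee = run_contract(
    steps=[lambda: results.append(2 + 2),
           lambda: results.append(sum(results))],
    gas_limit=10,
    gas_price=0.0001,              # hypothetical token price per unit of gas
)
print(results, fee)                # [4, 4] 0.0002
</code></pre>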
<p>Ethereum is inspiring a new wave of token networks. (It also provided a simple way for new token networks to launch on top of the Ethereum network, using a standard known as <a href="https://github.com/ethereum/EIPs/issues/20">ERC20</a>). Developers are building token networks for a wide range of use cases, including distributed <a href="http://filecoin.io/">computing</a> <a href="https://golem.network/">platforms</a>, <a href="https://augur.net/">prediction</a> and financial markets, incentivized <a href="https://steem.io/">content creation networks</a>, and <a href="https://basicattentiontoken.org/">attention and advertising networks</a>. Many more networks will be invented and launched in the coming months and years.</p>
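<p>For a feel of what the ERC20 standard specifies, here is a drastically simplified Python sketch of its bookkeeping. The real standard is an interface implemented in Solidity smart contracts; the class and method names below are just illustrative stand-ins for its <code>balanceOf</code>, <code>transfer</code>, <code>approve</code>, and <code>transferFrom</code> functions.</p>
<pre><code>class Token:
    """Toy ERC20-style ledger: balances plus delegated spending rights."""
    def __init__(self, supply, owner):
        self.balances = {owner: supply}    # address -> balance
        self.allowances = {}               # (owner, spender) -> amount

    def balance_of(self, addr):
        return self.balances.get(addr, 0)

    def transfer(self, sender, to, amount):
        assert self.balance_of(sender) >= amount, "insufficient balance"
        self.balances[sender] -= amount
        self.balances[to] = self.balance_of(to) + amount

    def approve(self, owner, spender, amount):
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, spender, owner, to, amount):
        assert self.allowances.get((owner, spender), 0) >= amount
        self.allowances[(owner, spender)] -= amount
        self.transfer(owner, to, amount)

token = Token(supply=1_000_000, owner="alice")
token.transfer("alice", "bob", 100)
print(token.balance_of("bob"))     # 100
</code></pre>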
<p>Below I walk through the two main benefits of the token model, the first architectural and the second involving incentives.</p>
<h2>Tokens enable the management and financing of open services</h2>
<p>Proponents of open systems never had an effective way to manage and fund operating services, leading to a significant architectural disadvantage compared to their proprietary counterparts. This was particularly evident during the last internet mega-battle between open and closed networks: the social wars of the late 2000s. As Alexis Madrigal recently <a href="https://www.theatlantic.com/technology/archive/2017/05/a-very-brief-history-of-the-last-10-years-in-technology/526767/?utm_source=atltw">wrote</a>, back in 2007 it looked like open networks would dominate going forward:</p>
<blockquote>
<p>In 2007, the web people were triumphant. Sure, the dot-com boom had busted, but empires were being built out of the remnant swivel chairs and fiber optic cables and unemployed developers. Web 2.0 was not just a temporal description, but an ethos. The web would be open. A myriad of services would be built, communicating through APIs, to provide the overall internet experience.</p>
</blockquote>
<p>But with the launch of the iPhone and the rise of smartphones, proprietary networks quickly won out:</p>
<blockquote>
<p>As that world-historical explosion began, a platform war came with it. The Open Web lost out quickly and decisively. By 2013, Americans spent about as much of their time on their phones <a href="http://www.marketingcharts.com/online/smart-device-users-spend-as-much-time-on-facebook-as-the-mobile-web-28422/">looking at Facebook</a> as they did the whole rest of the open web.</p>
</blockquote>
<p>Why did open social protocols get so decisively defeated by proprietary social networks? The rise of smartphones was only part of the story. Some open protocols — like email and the web — survived the transition to the mobile era. Open protocols relating to social networks were high quality and abundant (e.g. <a href="https://en.wikipedia.org/wiki/RSS">RSS</a>, <a href="http://xmlns.com/foaf/spec/">FOAF</a>, <a href="https://en.wikipedia.org/wiki/XHTML_Friends_Network">XFN</a>, <a href="http://openid.net/">OpenID</a>). What the open side lacked was a mechanism for encapsulating software, databases, and protocols together into easy-to-use services.</p>
<p>For example, in 2007, Wired magazine ran an <a href="https://www.wired.com/2007/08/open-social-net/">article</a> in which they tried to create their own social network using open tools:</p>
<blockquote>
<p>For the last couple of weeks, Wired News tried to roll its own Facebook using free web tools and widgets. We came close, but we ultimately failed. We were able to recreate maybe 90 percent of Facebook’s functionality, but not the most important part — a way to link people and declare the nature of the relationship.</p>
</blockquote>
<p>Some developers <a href="http://bradfitz.com/social-graph-problem/">proposed</a> solving this problem by creating a database of social graphs run by a non-profit organization:</p>
<blockquote>
<p><strong>Establish a non-profit and open source software</strong> (with copyrights held by the non-profit) which collects, merges, and redistributes the graphs from all other social network sites into one global aggregated graph. This is then made available to other sites (or users) via both public APIs (for small/casual users) and downloadable data dumps, with an update stream / APIs, to get iterative updates to the graph (for larger users).</p>
</blockquote>
<p>These open schemes required widespread coordination among standards bodies, server operators, app developers, and sponsoring organizations to mimic the functionality that proprietary services could provide all by themselves. As a result, proprietary services were able to create better user experiences and iterate much faster. This led to faster growth, which in turn led to greater investment and revenue, which then fed back into product development and further growth. Thus began a flywheel that drove the meteoric rise of proprietary social networks like Facebook and Twitter.</p>
<p>Had the token model for network development existed back in 2007, the playing field would have been much more level. First, tokens provide a way not only to define a protocol, but to fund the operating expenses required to host it as a service. Bitcoin and Ethereum have tens of thousands of servers around the world (“miners”) that run their networks. They cover the hosting costs with built-in mechanisms that automatically distribute token rewards to computers on the network (“mining rewards”).</p>
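<p>The reward mechanism itself is simple to sketch. In the toy Python model below, each new block mints a fixed reward for whichever node produced it; <code>random.choice</code> stands in for winning a proof-of-work race, and the reward size is invented for illustration.</p>
<pre><code>import random

BLOCK_REWARD = 5.0                         # new tokens minted per block
nodes = {"node-a": 0.0, "node-b": 0.0, "node-c": 0.0}

for block in range(10):
    winner = random.choice(list(nodes))    # stand-in for winning proof-of-work
    nodes[winner] += BLOCK_REWARD          # hosting costs are repaid in tokens

print(nodes)    # e.g. {'node-a': 20.0, 'node-b': 15.0, 'node-c': 15.0}
</code></pre>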
<p><img src="images/1-lu1cuJeeDIFPsDpPPo8lw.png" alt="There are over 20,000 Ethereum nodes around the world (source)"></p>
<p>Second, tokens provide a model for creating shared computing resources (<a href="https://medium.com/@FEhrsam/the-dapp-developer-stack-the-blockchain-industry-barometer-8d55ec1c7d4">including</a> databases, compute, and file storage) while keeping the control of those resources decentralized (and without requiring an organization to maintain them). This is the blockchain technology that has been talked about <a href="https://trends.google.com/trends/explore?q=blockchain">so much</a>. Blockchains would have allowed shared social graphs to be stored on a decentralized network. With the tools available today, it would be easy for the Wired author to create an open social network.</p>
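<p>The underlying data structure is easy to sketch. Below is a minimal hash-linked log in Python that stores social-graph edges; anyone holding a copy can verify that no past entry has been altered. This is only the skeleton of the idea, with none of the consensus machinery a real blockchain adds on top.</p>
<pre><code>import hashlib, json

def make_block(prev_hash, edge):
    """Append-only entry: its hash commits to the edge and to all history."""
    body = json.dumps({"prev": prev_hash, "edge": edge}, sort_keys=True)
    return {"prev": prev_hash, "edge": edge,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

chain = [make_block("genesis", ["alice", "follows", "bob"])]
chain.append(make_block(chain[-1]["hash"], ["bob", "follows", "carol"]))

# Verification: every block must chain to its predecessor and match its hash.
prev = "genesis"
for block in chain:
    body = json.dumps({"prev": block["prev"], "edge": block["edge"]},
                      sort_keys=True)
    assert block["prev"] == prev
    assert block["hash"] == hashlib.sha256(body.encode()).hexdigest()
    prev = block["hash"]
</code></pre>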
<h2>Tokens align incentives among network participants</h2>
<p>Some of the <a href="/2009/09/14/the-inevitable-showdown-between-twitter-and-twitter-apps/">fiercest battles</a> in tech are between <a href="https://en.wikipedia.org/wiki/Complementary_good">complements</a>. There were, for example, hundreds of startups that tried to build businesses on the APIs of social networks only to have the terms change later on, forcing them to pivot or shut down. Microsoft’s battles with complements like Netscape and Intuit are legendary. Battles within ecosystems are so common and drain so much energy that business books are full of frameworks for how one company can squeeze profits from adjacent businesses (e.g. Porter’s <a href="https://en.wikipedia.org/wiki/Porter%27s_five_forces_analysis">five forces</a> model).</p>
<p>Token networks remove this friction by aligning network participants to work together toward a common goal — the growth of the network and the appreciation of the token. This alignment is one of the main reasons Bitcoin continues to defy <a href="https://99bitcoins.com/bitcoinobituaries/">skeptics</a> and flourish, even while new token networks like Ethereum have grown alongside it.</p>
<p>Moreover, well-designed token networks include an efficient mechanism to incentivize network participants to overcome the bootstrap problem that bedevils traditional network development. For example, <a href="https://steemit.com/">Steemit</a> is a decentralized Reddit-like token network that makes payments to users who post and upvote articles. When Steemit launched last year, the community was <a href="https://coinreport.net/social-network-steemit-distributes-1-3-million-first-cryptocurrency-payout-users/">pleasantly surprised</a> when the network made its first significant payout to users.</p>
<p><img src="images/1mi0v6PNlGnjL9QH-AWZxAA.png" alt="Tokens help overcome the bootstrap problem by adding financial utility when application utility is low"></p>
<p>This in turn led to the appreciation of Steemit tokens, which increased future payouts, leading to a <a href="https://www.usv.com/blog/fat-protocols">virtuous cycle</a> where more users led to more investment, and vice versa. Steemit is still a beta project and has since had mixed results, but it was an interesting experiment in how to generalize the mutually reinforcing interaction between users and investors that Bitcoin and Ethereum first demonstrated.</p>
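<p>The chart above can be turned into a back-of-the-envelope model. In the Python sketch below, a user's total utility is a network-effect term that starts near zero plus a token reward that tapers as the network grows; every number and functional form is made up purely to show the shape of the argument.</p>
<pre><code>def app_utility(users):
    return users ** 2 / 1e6                 # toy Metcalfe-style network effect

def token_reward(users):
    return max(0.0, 10.0 - users / 100)     # rewards taper as the network grows

for users in [10, 100, 1_000, 5_000, 10_000]:
    total = app_utility(users) + token_reward(users)
    print(users, round(total, 2))
# Early on, token rewards carry the network; later, the network carries itself.
</code></pre>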
<p>A lot of attention has been paid to token pre-sales (so-called “ICOs”), but they are just one of multiple ways in which the token model innovates on network incentives. A well-designed token network carefully manages the distribution of tokens across all five groups of network participants (users, core developers, third-party developers, investors, service providers) to maximize the growth of the network.</p>
<p>One way to think about the token model is to imagine if the internet and web hadn’t been funded by governments and universities, but instead by a company that raised money by selling off domain names. People could buy domain names either to use them or as an investment (collectively, domain names are worth tens of billions of dollars today). Similarly, domain names could have been given out as rewards to service providers who agreed to run hosting services, and to third-party developers who supported the network. This would have provided an alternative way to finance and accelerate the development of the internet while also aligning the incentives of the various network participants.</p>
<h2>The open network movement</h2>
<p>The cryptocurrency movement is the spiritual heir to previous open computing movements, including the open source software movement led most visibly by Linux, and the open information movement led most visibly by Wikipedia.</p>
<p><img src="images/1U0B5FlpNVXSXeIcqodktLQ.png" alt="1991: Linus Torvalds’ forum (post) announcing Linux; 2001: the first Wikipedia (page)"></p>
<p>Both of these movements were once niche and <a href="https://medium.com/@cdixon/it-s-hard-to-believe-today-but-10-years-ago-wikipedia-was-widely-considered-a-doomed-experiment-a7a0dfd27b8b">controversial</a>. Today Linux is the dominant worldwide operating system, and Wikipedia is the most popular informational website in the world.</p>
<p>Crypto tokens are currently niche and controversial. If present trends continue, they will soon be seen as a breakthrough in the design and development of open networks, combining the societal benefits of open protocols with the financial and architectural benefits of proprietary networks. They are also an extremely promising development for those hoping to keep the internet accessible to entrepreneurs, developers, and other independent creators.</p>
]]></content:encoded>
</item>
<item>
<title>How Aristotle Created the Computer</title>
<link>https://cdixon.org/2017/02/20/aristotle-computer/</link>
<guid>https://cdixon.org/2017/02/20/aristotle-computer/</guid>
<pubDate>Mon, 20 Feb 2017 00:00:00 GMT</pubDate>
<description>The philosophers he influenced set the stage for the technological revolution that remade our world. Originally published by The Atlantic. The history of computers is often told as a history ...</description>
<content:encoded><![CDATA[<h2>The philosophers he influenced set the stage for the technological revolution that remade our world.</h2>
<p><em>Originally published by <a href="https://www.theatlantic.com/technology/archive/2017/03/aristotle-computer/518697/">The Atlantic</a>.</em></p>
<p>The history of computers is often told as a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II. In fact, it is better understood as a history of ideas, mainly ideas that emerged from mathematical logic, an obscure and cult-like discipline that first developed in the 19th century. Mathematical logic was pioneered by philosopher-mathematicians, most notably George Boole and Gottlob Frege, who were themselves inspired by Leibniz’s dream of a universal “concept language,” and the ancient logical system of Aristotle.</p>
<p>Mathematical logic was initially considered a hopelessly abstract subject with no conceivable applications. As one computer scientist <a href="http://bactra.org/notebooks/mathematical-logic.html">commented</a>: “If, in 1901, a talented and sympathetic outsider had been called upon to survey the sciences and name the branch which would be least fruitful in [the] century ahead, his choice might well have settled upon mathematical logic.” And yet, it would provide the foundation for a field that would have more impact on the modern world than any other.</p>
<p>The evolution of computer science from mathematical logic culminated in the 1930s, with two landmark papers: Claude Shannon’s “<a href="http://www.ccapitalia.net/descarga/docs/1938-shannon-analysis-relay-switching-circuits.pdf">A Symbolic Analysis of Relay and Switching Circuits</a>,” and Alan Turing’s “<a href="http://www.dna.caltech.edu/courses/cs129/caltech_restricted/Turing_1936_IBID.pdf">On Computable Numbers, With an Application to the <em>Entscheidungsproblem</em></a>.” In the history of computer science, Shannon and Turing are towering figures, but the importance of the philosophers and logicians who preceded them is frequently overlooked.</p>
<p>A well-known history of computer science describes Shannon’s paper as “possibly the most important, and also the most noted, master’s thesis of the century.” Shannon wrote it as an electrical engineering student at MIT. His adviser, Vannevar Bush, built a prototype computer known as the <a href="http://www.mit.edu/~klund/analyzer/">Differential Analyzer</a> that could rapidly calculate differential equations. The device was mostly mechanical, with subsystems controlled by electrical relays, which were organized in an ad hoc manner as there was not yet a systematic theory underlying circuit design. Shannon’s thesis topic came about when Bush recommended he try to discover such a theory.</p>
<p>Shannon’s paper is in many ways a typical electrical-engineering paper, filled with equations and diagrams of electrical circuits. What is unusual is that the primary reference was a 90-year-old work of mathematical philosophy, George Boole’s <em>The Laws of Thought</em>.</p>
<p>Today, Boole’s name is well known to computer scientists (many programming languages have a basic data type called a Boolean), but in 1938 he was rarely read outside of philosophy departments. Shannon himself encountered Boole’s work in an undergraduate philosophy class. “It just happened that no one else was familiar with both fields at the same time,” he <a href="http://georgeboole.com/boole/legacy/engineering/">commented</a> later.</p>
<p>Boole is often described as a mathematician, but he saw himself as a philosopher, following in the footsteps of Aristotle. <em>The Laws of Thought</em> begins with a description of his goal of investigating the fundamental laws of the operation of the human mind:</p>
<blockquote>
<p>The design of the following treatise is to investigate the fundamental laws of those operations of the mind by which reasoning is performed; to give expression to them in the symbolical language of a Calculus, and upon this foundation to establish the science of Logic … and, finally, to collect … some probable intimations concerning the nature and constitution of the human mind.</p>
</blockquote>
<p>He then pays tribute to Aristotle, the inventor of logic, and the primary influence on <a href="http://www.gutenberg.org/files/15114/15114-pdf.pdf">his own work</a>:</p>
<blockquote>
<p>In its ancient and scholastic form, indeed, the subject of Logic stands almost exclusively associated with the great name of Aristotle. As it was presented to ancient Greece in the partly technical, partly metaphysical disquisitions of The Organon, such, with scarcely any essential change, it has continued to the present day.</p>
</blockquote>
<p>Trying to improve on the logical work of Aristotle was an intellectually daring move. Aristotle’s logic, presented in his six-part book <em>The Organon</em>, occupied a central place in the scholarly canon for more than 2,000 years. It was widely believed that Aristotle had written almost all there was to say on the topic. The great philosopher Immanuel Kant <a href="https://books.google.com/books?id=WJVYp0C0taYC&pg=PA36&lpg=PA36&dq=unable+to+take+a+single+step+forward,+and+therefore+seems+to+all+appearance+to+be+finished+and+complete&source=bl&ots=W4Lrt9I80J&sig=KpZlOd-Yc9brgTksIJJZcxUD-Mg&hl=en&sa=X&ved=0ahUKEwjeg8i1iLvQAhVH6IMKHTMXDMgQ6AEIHTAA#v=onepage&q=unable%20to%20take%20a%20single%20step%20forward%2C%20and%20therefore%20seems%20to%20all%20appearance%20to%20be%20finished%20and%20complete&f=false">commented</a> that, since Aristotle, logic had been “unable to take a single step forward, and therefore seems to all appearance to be finished and complete.”</p>
<p>Aristotle’s central observation was that arguments were valid or not based on their logical structure, independent of the non-logical words involved. The most famous argument schema he discussed is known as the syllogism:</p>
<ul>
<li>All men are mortal.</li>
<li>Socrates is a man.</li>
<li>Therefore, Socrates is mortal.</li>
</ul>
<p>You can replace “Socrates” with any other object, and “mortal” with any other predicate, and the argument remains valid. The validity of the argument is determined solely by the logical structure. The logical words — “all,” “is,” “are,” and “therefore” — are doing all the work.</p>
<p>Aristotle also defined a set of basic axioms from which he derived the rest of his logical system:</p>
<ul>
<li>An object is what it is (Law of Identity)</li>
<li>No statement can be both true and false (Law of Non-contradiction)</li>
<li>Every statement is either true or false (Law of the Excluded Middle)</li>
</ul>
<p>These axioms weren’t meant to describe how people actually think (that would be the realm of psychology), but how an idealized, perfectly rational person ought to think.</p>
<p>Aristotle’s axiomatic method influenced an even more famous book, Euclid’s <em>Elements</em>, which is <a href="https://en.wikipedia.org/wiki/Euclid%27s_Elements">estimated</a> to be second only to the Bible in the number of editions printed.</p>
<p><img src="images/2c8ad9d68.png" alt="A fragment of the Elements (Wikimedia Commons)"></p>
<p>Although ostensibly about geometry, the <em>Elements</em> became a standard textbook for teaching rigorous deductive reasoning. (Abraham Lincoln once said that he learned sound legal argumentation from studying Euclid.) In Euclid’s system, geometric ideas were represented as spatial diagrams. Geometry continued to be practiced this way until René Descartes, in the 1630s, showed that geometry could instead be represented as formulas. His <em>Discourse on Method</em> was the <a href="http://www.storyofmathematics.com/17th_descartes.html">first</a> mathematics text in the West to popularize what is now standard algebraic notation — x, y, z for variables, a, b, c for known quantities, and so on.</p>
<p>Descartes’s algebra allowed mathematicians to move beyond spatial intuitions to manipulate symbols using precisely defined formal rules. This shifted the dominant mode of mathematics from diagrams to formulas, leading to, among other things, the development of calculus, invented independently by Isaac Newton and Gottfried Leibniz roughly 30 years after Descartes.</p>
<p>Boole’s goal was to do for Aristotelean logic what Descartes had done for Euclidean geometry: free it from the limits of human intuition by giving it a precise algebraic notation. To give a simple example, when Aristotle wrote:</p>
<p>All men are mortal.</p>
<p>Boole replaced the words “men” and “mortal” with variables, and the logical words “all” and “are” with arithmetical operators:</p>
<p><em>x = x * y</em></p>
<p>Which could be interpreted as “Everything in the set <em>x</em> is also in the set <em>y</em>.”</p>
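<p>Read as sets, the equation is easy to check mechanically. Here is a small Python illustration, using set intersection for Boole's product: the equation x = x*y holds exactly when everything in x is also in y.</p>
<pre><code>men    = {"socrates", "plato"}
mortal = {"socrates", "plato", "fido"}

# x = x*y: intersecting men with mortal gives back men, so all men are mortal.
assert men == men.intersection(mortal)
assert "socrates" in men         # Socrates is a man
assert "socrates" in mortal      # therefore, Socrates is mortal
</code></pre>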
<p>The <em>Laws of Thought</em> created a new scholarly field — mathematical logic — which in the following years became one of the most active areas of research for mathematicians and philosophers. Bertrand Russell called the <em>Laws of Thought</em> “the work in which pure mathematics was discovered.”</p>
<p>Shannon’s insight was that Boole’s system could be mapped directly onto electrical circuits. At the time, electrical circuits had no systematic theory governing their design. Shannon realized that the right theory would be “exactly analogous to the calculus of propositions used in the symbolic study of logic.”</p>
<p>He showed the correspondence between electrical circuits and Boolean operations in a simple chart:</p>
<p><img src="images/99df968e4.png" alt="Shannon’s mapping from electrical circuits to symbolic logic (University of Virginia)"></p>
<p>This correspondence allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians. In the second half of his paper, Shannon showed how Boolean logic could be used to create a circuit for adding two binary digits.</p>
<p>By stringing these adder circuits together, arbitrarily complex arithmetical operations could be constructed. These circuits would become the basic building blocks of what are now known as <a href="https://en.wikipedia.org/wiki/Arithmetic_logic_unit">arithmetic logic units</a>, a key component in modern computers.</p>
<p><img src="images/2b88e5a1a.png" alt="Shannon’s adder circuit (University of Virginia)"></p>
<p>Another way to characterize Shannon’s achievement is that he was the first to distinguish between the logical and the physical layer of computers. (This distinction has become so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time — a reminder of the adage that “the philosophy of one century is the common sense of the next.”)</p>
<p>Since Shannon’s paper, a vast amount of progress has been made on the physical layer of computers, including the invention of the transistor in 1947 by John Bardeen and Walter Brattain, working in William Shockley’s group at Bell Labs. Transistors are dramatically improved versions of Shannon’s electrical relays — the best known way to physically encode Boolean operations. Over the next 70 years, the semiconductor industry packed more and more transistors into smaller spaces. A 2016 iPhone <a href="http://www.macrumors.com/2016/09/12/cpu-improvements-iphone-7-apple-watch/">has</a> about 3.3 billion transistors, each one a “relay switch” like those pictured in Shannon’s diagrams.</p>
<p>While Shannon showed how to map logic onto the physical world, Turing showed how to design computers in the language of mathematical logic. When Turing wrote his paper, in 1936, he was trying to solve “the decision problem,” first identified by the mathematician David Hilbert, who asked whether there was an algorithm that could determine whether an arbitrary mathematical statement is true or false. In contrast to Shannon’s paper, Turing’s paper is highly technical. Its primary historical significance lies not in its answer to the decision problem, but in the template for computer design it provided along the way.</p>
<p>Turing was working in a tradition stretching back to Gottfried Leibniz, the philosophical giant who developed calculus independently of Newton. Among Leibniz’s many contributions to modern thought, one of the most intriguing was the idea of a new language he called the “<a href="https://en.wikipedia.org/wiki/Characteristica_universalis">universal characteristic</a>” that, he imagined, could represent all possible mathematical and scientific knowledge. Inspired in part by the 13th-century religious philosopher <a href="https://en.wikipedia.org/wiki/Ramon_Llull">Ramon Llull</a>, Leibniz postulated that the language would be ideographic like Egyptian hieroglyphics, except characters would correspond to “atomic” concepts of math and science. He argued this language would give humankind an “instrument” that could enhance human reason “to a far greater extent than optical instruments” like the microscope and telescope.</p>
<p>He also <a href="http://publicdomainreview.org/2016/11/10/let-us-calculate-leibniz-llull-and-computational-imagination/">imagined</a> a machine that could process the language, which he called the calculus ratiocinator.</p>
<blockquote>
<p>If controversies were to arise, there would be no more need of disputation between two philosophers than between two accountants. For it would suffice to take their pencils in their hands, and say to each other: Calculemus—Let us calculate.</p>
</blockquote>
<p>Leibniz didn’t get the opportunity to develop his universal language or the corresponding machine (although he did invent a relatively simple calculating machine, the <a href="https://en.wikipedia.org/wiki/Stepped_reckoner">stepped reckoner</a>). The first credible attempt to realize Leibniz’s dream came in 1879, when the German philosopher Gottlob Frege published his landmark logic treatise <em><a href="https://en.wikipedia.org/wiki/Begriffsschrift">Begriffsschrift</a></em>. Inspired by Boole’s attempt to improve Aristotle’s logic, Frege developed a much more advanced logical system. The logic taught in philosophy and computer-science classes today—first-order or predicate logic—is only a slight modification of Frege’s system.</p>
<p>Frege is generally considered one of the most important philosophers of the 19th century. Among other things, he is credited with catalyzing what noted philosopher Richard Rorty called the “<a href="https://en.wikipedia.org/wiki/Linguistic_turn">linguistic turn</a>” in philosophy. As Enlightenment philosophy was obsessed with questions of knowledge, philosophy after Frege became obsessed with questions of language. His disciples included two of the most important philosophers of the 20th century—Bertrand Russell and Ludwig Wittgenstein.</p>
<p>The major innovation of Frege’s logic is that it much more accurately represented the logical structure of ordinary language. Among other things, Frege was the first to use quantifiers (“for every,” “there exists”) and to separate objects from predicates. He was also the first to develop what today are fundamental concepts in computer science like recursive functions and variables with scope and binding.</p>
<p>Frege’s formal language — what he called his “concept-script” — is made up of meaningless symbols that are manipulated by well-defined rules. The language is only given meaning by an interpretation, which is specified separately (this distinction would later come to be called syntax versus semantics). This turned logic into what the eminent computer scientists Allen Newell and Herbert Simon called “the symbol game,” “played with meaningless tokens according to certain purely syntactic rules.”</p>
<blockquote>
<p>All meaning had been purged. One had a mechanical system about which various things could be proved. Thus progress was first made by walking away from all that seemed relevant to meaning and human symbols.</p>
</blockquote>
<p>As Bertrand Russell famously quipped: “Mathematics may be defined as the subject in which we never know what we are talking about, nor whether what we are saying is true.”</p>
<p>An unexpected consequence of Frege’s work was the discovery of weaknesses in the foundations of mathematics. For example, Euclid’s <em>Elements</em> — considered the gold standard of logical rigor for thousands of years — turned out to be full of logical mistakes. Because Euclid used ordinary words like “line” and “point,” he — and centuries of readers — deceived themselves into making assumptions about sentences that contained those words. To give one relatively simple example, in ordinary usage, the word “line” implies that if you are given three distinct points on a line, one point must be between the other two. But when you define “line” using formal logic, it turns out “between-ness” also needs to be defined — something Euclid overlooked. Formal logic makes gaps like this easy to spot.</p>
<p>This realization created a <a href="https://en.wikipedia.org/wiki/Foundations_of_mathematics#Foundational_crisis">crisis</a> in the foundation of mathematics. If the <em>Elements</em> — the bible of mathematics — contained logical mistakes, what other fields of mathematics did too? What about sciences like physics that were built on top of mathematics?</p>
<p>The good news is that the same logical methods used to uncover these errors could also be used to correct them. Mathematicians started rebuilding the foundations of mathematics from the bottom up. In 1889, Giuseppe Peano <a href="https://en.wikipedia.org/wiki/Peano_axioms">developed</a> axioms for arithmetic, and in 1899, David Hilbert <a href="https://en.wikipedia.org/wiki/Hilbert%27s_axioms">did</a> the same for geometry. Hilbert also outlined a program to formalize the remainder of mathematics, with specific requirements that any such attempt should satisfy, including:</p>
<ul>
<li><em>Completeness</em>: There should be a proof that all true mathematical statements can be proved in the formal system.</li>
<li><em>Decidability</em>: There should be an algorithm for deciding the truth or falsity of any mathematical statement. (This is the “<em>Entscheidungsproblem</em>” or “decision problem” referenced in Turing’s paper.)</li>
</ul>
<p>Rebuilding mathematics in a way that satisfied these requirements became known as Hilbert’s program. Up through the 1930s, this was the focus of a core group of logicians including Hilbert, Russell, Kurt Gödel, John von Neumann, Alonzo Church, and, of course, Alan Turing.</p>
<p>Hilbert’s program proceeded on at least two fronts. On the first front, logicians created logical systems that tried to prove Hilbert’s requirements either satisfiable or not.</p>
<p>On the second front, mathematicians used logical concepts to rebuild classical mathematics. For example, Peano’s system for arithmetic starts with a simple function called the successor function, which increases any number by one. He uses the successor function to recursively define <a href="https://en.wikipedia.org/wiki/Peano_axioms#Addition">addition</a>, uses addition to recursively define <a href="https://en.wikipedia.org/wiki/Peano_axioms#Multiplication">multiplication</a>, and so on, until all the operations of number theory are defined. He then uses those definitions, along with formal logic, to prove theorems about arithmetic.</p>
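<p>Peano's construction translates almost line for line into code. In this sketch, Python integers stand in for Peano's formal numerals, but addition and multiplication use only zero, successor, and recursion, exactly as in his definitions.</p>
<pre><code>ZERO = 0

def succ(n):
    return n + 1    # stand-in for the primitive successor operation

def add(m, n):
    # add(m, 0) = m ; add(m, succ(k)) = succ(add(m, k))
    return m if n == ZERO else succ(add(m, n - 1))

def mul(m, n):
    # mul(m, 0) = 0 ; mul(m, succ(k)) = add(m, mul(m, k))
    return ZERO if n == ZERO else add(m, mul(m, n - 1))

print(add(2, 3), mul(2, 3))    # 5 6
</code></pre>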
<p>The historian Thomas Kuhn once observed that “in science, novelty emerges only with difficulty.” Logic in the era of Hilbert’s program was a tumultuous process of creation and destruction. One logician would build up an elaborate system and another would tear it down.</p>
<p>The favored tool of destruction was the construction of self-referential, paradoxical statements that showed the axioms from which they were derived to be inconsistent. A simple form of this “liar’s paradox” is the sentence:</p>
<p>This sentence is false.</p>
<p>If it is true then it is false, and if it is false then it is true, leading to an endless loop of self-contradiction.</p>
<p>Russell made the first notable use of the liar’s paradox in mathematical logic. He showed that Frege’s system allowed self-contradicting sets to be derived:</p>
<blockquote>
<p>Let <em>R</em> be the set of all sets that are not members of themselves. If <em>R</em> is not a member of itself, then its definition dictates that it must contain itself, and if it contains itself, then it contradicts its own definition as the set of all sets that are not members of themselves.</p>
</blockquote>
<p>This became known as Russell’s paradox and was seen as a serious flaw in Frege’s achievement. (Frege himself was shocked by this discovery. He replied to Russell: “Your discovery of the contradiction caused me the greatest surprise and, I would almost say, consternation, since it has shaken the basis on which I intended to build my arithmetic.”)</p>
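<p>The self-defeating loop can even be felt in code. Rendering Russell's set informally as a Python predicate that holds of f exactly when f does not hold of itself, asking whether the predicate holds of itself never settles on an answer; the interpreter simply gives up.</p>
<pre><code>def R(f):
    return not f(f)    # R holds of f exactly when f does not hold of itself

try:
    R(R)               # does R hold of R? true implies false, and vice versa
except RecursionError:
    print("self-reference never settles on True or False")
</code></pre>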
<p>Russell and his colleague Alfred North Whitehead put forth the most ambitious attempt to complete Hilbert’s program with the <em>Principia Mathematica</em>, published in three volumes between 1910 and 1913. The <em>Principia’s</em> method was so detailed that it took over 300 pages to get to the proof that 1+1=2.</p>
<p>Russell and Whitehead tried to resolve this paradox by introducing what they called type theory. The idea was to partition formal languages into multiple levels or types. Each level could make reference to levels below, but not to their own or higher levels. This resolved self-referential paradoxes by, in effect, banning self-reference. (This solution was not popular with logicians, but it did influence computer science — most modern computer languages have features inspired by type theory.)</p>
<p>Self-referential paradoxes ultimately showed that Hilbert’s program could never be successful. The first blow came in 1931, when Gödel published his now famous incompleteness theorem, which proved that any consistent logical system powerful enough to encompass arithmetic must also contain statements that are true but cannot be proven to be true. (Gödel’s incompleteness theorem is one of the few logical results that has been broadly popularized, thanks to books like <a href="https://en.wikipedia.org/wiki/G%C3%B6del,_Escher,_Bach">Gödel, Escher, Bach</a> and <a href="https://www.amazon.com/dp/B00ARGXG7Q/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1">The Emperor’s New Mind</a>).</p>
<p>The final blow came when Turing and Alonzo Church independently proved that no algorithm could exist that determined whether an arbitrary mathematical statement was true or false. (Church did this by inventing an entirely different system called the <a href="https://en.wikipedia.org/wiki/Lambda_calculus">lambda calculus</a>, which would later inspire computer languages like <a href="https://en.wikipedia.org/wiki/Lisp_%28programming_language%29">Lisp</a>.) The answer to the decision problem was negative.</p>
<p>Turing’s key insight came in the first section of his famous 1936 paper, “On Computable Numbers, With an Application to the <em>Entscheidungsproblem</em>.” In order to rigorously formulate the decision problem (the “<em>Entscheidungsproblem</em>”), Turing first created a mathematical model of what it means to be a computer (today, machines that fit this model are known as “universal Turing machines”). As the logician Martin Davis describes it:</p>
<blockquote>
<p>Turing knew that an algorithm is typically specified by a list of rules that a person can follow in a precise mechanical manner, like a recipe in a cookbook. He was able to show that such a person could be limited to a few extremely simple basic actions without changing the final outcome of the computation.</p>
<p>Then, by proving that no machine performing only those basic actions could determine whether or not a given proposed conclusion follows from given premises using Frege’s rules, he was able to conclude that no algorithm for the Entscheidungsproblem exists.</p>
<p>As a byproduct, he found a mathematical model of an all-purpose computing machine.</p>
</blockquote>
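<p>Davis's description can be made concrete with a few lines of Python: a Turing machine is nothing but a table of simple rules, read and followed mechanically. The toy machine below flips every bit on its tape and halts; note that its "program" is an ordinary data structure, a point that matters in the next paragraph.</p>
<pre><code>def run(rules, tape, state="start"):
    """Follow the rule table until the machine reaches the halt state."""
    tape, pos = dict(enumerate(tape)), 0
    while state != "halt":
        symbol = tape.get(pos, "_")               # "_" marks a blank cell
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return [tape[i] for i in sorted(tape)]

# Rule table: (state, symbol read) -> (symbol to write, move, next state)
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run(flip, list("1011")))    # ['0', '1', '0', '0', '_']
</code></pre>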
<p>Next, Turing showed how a program could be stored inside a computer alongside the data upon which it operates. In today’s vocabulary, we’d say that he invented the “stored-program” architecture that underlies most modern computers:</p>
<blockquote>
<p>Before Turing, the general supposition was that in dealing with such machines the three categories — machine, program, and data — were entirely separate entities. The machine was a physical object; today we would call it hardware. The program was the plan for doing a computation, perhaps embodied in punched cards or connections of cables in a plugboard. Finally, the data was the numerical input. Turing’s universal machine showed that the distinctness of these three categories is an illusion.</p>
</blockquote>
<p>This was the first rigorous demonstration that any computing logic that could be encoded in hardware could also be encoded in software. The architecture Turing described was later dubbed the “von Neumann architecture” — but modern historians generally agree it came from Turing, as, apparently, did von Neumann <a href="https://en.wikipedia.org/wiki/Alan_Turing#cite_note-36">himself</a>.</p>
<p>Although, on a technical level, Hilbert’s program was a failure, the efforts along the way demonstrated that large swaths of mathematics could be constructed from logic. And after Shannon and Turing’s insights — showing the connections between electronics, logic, and computing — it was now possible to export this new conceptual machinery over to computer design.</p>
<p>During World War II, this theoretical work was put into practice, when government labs conscripted a number of elite logicians. Von Neumann joined the atomic bomb project at Los Alamos, where he worked on computer design to support physics research. In 1945, he wrote the <a href="http://www.virtualtravelog.net/wp/wp-content/media/2003-08-TheFirstDraft.pdf">specification</a> of the EDVAC — the first stored-program, logic-based computer — which is generally considered the definitive source guide for modern computer design.</p>
<p>Turing joined a secret unit at Bletchley Park, northwest of London, where he helped design computers that were instrumental in breaking German codes. His most enduring contribution to practical computer design was his specification of the ACE, or Automatic Computing Engine.</p>
<p>As the first computers to be based on Boolean logic and stored-program architectures, the ACE and the EDVAC were similar in many ways. But they also had interesting differences, some of which foreshadowed modern debates in computer design. Von Neumann’s favored designs were similar to modern CISC (“complex”) processors, baking rich functionality into hardware. Turing’s design was more like modern RISC (“reduced”) processors, minimizing hardware complexity and pushing more work to software.</p>
<p>Von Neumann thought computer programming would be a tedious, clerical job. Turing, by contrast, said computer programming “should be very fascinating. There need be no real danger of it ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.”</p>
<p>Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.</p>
<p>In the past decade or so, programming has started to change with the growing popularity of machine learning, which involves creating frameworks for machines to learn via statistical inference. This has brought programming closer to the other main branch of logic, inductive logic, which deals with inferring rules from specific instances.</p>
<p>Today’s most promising machine learning techniques use neural networks, which were first <a href="http://www.cse.chalmers.se/~coquand/AUTOMATA/mcp.pdf">invented</a> in the 1940s by Warren McCulloch and Walter Pitts, whose idea was to develop a calculus for neurons that could, like Boolean logic, be used to construct computer circuits. Neural networks remained esoteric until decades later, when they were combined with statistical techniques that allowed them to improve as they were fed more data. Recently, as computers have become increasingly adept at handling large data sets, these techniques have produced remarkable results. Programming in the future will likely mean exposing neural networks to the world and letting them learn.</p>
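<p>McCulloch and Pitts's idea still fits in a few lines: a neuron sums weighted inputs and fires when the total clears a threshold, and with hand-picked weights it reproduces the Boolean gates discussed earlier. The weights in this Python sketch are chosen by hand; modern machine learning differs precisely in learning them from data.</p>
<pre><code>def neuron(weights, threshold):
    """Threshold unit: fire (1) when the weighted input sum reaches threshold."""
    def fire(*inputs):
        return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)
    return fire

AND = neuron([1, 1], threshold=2)
OR  = neuron([1, 1], threshold=1)
NOT = neuron([-1],   threshold=0)

print(AND(1, 1), OR(0, 1), NOT(1))    # 1 1 0
</code></pre>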
<p>This would be a fitting second act to the story of computers. Logic began as a way to understand the laws of thought. It then helped create machines that could reason according to the rules of deductive logic. Today, deductive and inductive logic are being combined to create machines that both reason and learn. What began, in Boole’s words, with an investigation “concerning the nature and constitution of the human mind,” could result in the creation of new minds—artificial minds—that might someday match or even exceed our own.</p>
]]></content:encoded>
</item>
<item>
<title>Gadgets and Computers</title>
<link>https://cdixon.org/2017/01/16/gadgets-and-computers/</link>
<guid>https://cdixon.org/2017/01/16/gadgets-and-computers/</guid>
<pubDate>Mon, 16 Jan 2017 00:00:00 GMT</pubDate>
<description>From Benedict Evans’ Cars as Feature Phones: This is a common theme in many classes of device: you start with a product that has a few electronic functions added, and ...</description>
<content:encoded><![CDATA[<p>From Benedict Evans’ <a href="http://ben-evans.com/benedictevans/2017/01/10/cars-as-featurephones">Cars as Feature Phones</a>:</p>
<blockquote>
<p>This is a common theme in many classes of device: you start with a product that has a few electronic functions added, and then those functions are delivered with chips, and perhaps they gain an interface and then a screen, and more and more functions (and probably multi-function buttons) — and then, somehow, you’ve built a little weird custom computer without actually meaning to, and all the little silos of features and functions become unmanageable, both at an interface level and also at a fundamental engineering level, and the whole thing gets replaced by a real computer with a real software platform. And this new computer is almost certainly made by a different company.
You could see this problem very clearly at Motorola, which developed as many as two dozen ‘operating systems’ — for phones, pagers, satellite phones, car-control, industrial devices, chip evaluation boards and so on and so on, and picked them for each device out of a metaphorical parts bin just as you’d choose a sensor or battery or any other component. And boy, they really knew how to write operating systems — they had dozens! With, probably, ‘<a href="https://www.technologyreview.com/s/508231/many-cars-have-a-hundred-million-lines-of-code/">millions of lines of code</a>’. This was exactly the right approach in 1995, but in 2005, again, the whole thing collapsed under its own weight, because they needed software as a platform rather than as a one-off component, and instead <a href="http://www.theregister.co.uk/Print/2012/11/29/rockman_on_motorola/">they had a mess</a>.</p>
</blockquote>
<p>The iPhone was the first mainstream cell phone that was also a proper computer. It had a full-fledged operating system and a (mostly) open developer platform. We are likely seeing the same pattern play out across the <a href="https://medium.com/software-is-eating-the-world/what-s-next-in-computing-e54b870b80cc#.bmdmkoc13">next generation of computers</a>: not only cars, but drones, IoT devices, wearables, etc. In the beginning, hardware-focused companies make gadgets with ever increasing laundry lists of features. Then a company with strong software expertise (often a new market entrant) comes along that replaces these feature-packed gadgets with full-fledged computers. These computers have proper (usually Unix-like) operating systems, open developer platforms, and streamlined user interfaces (increasingly, powered by AI).</p>
<p>This process takes time to play out. Apple waited more than a decade from the initial popularity of cell phones to the release of the first iPhone. And sometimes you don’t know the significance of a new computing device until many years later. It wasn’t obvious until around 2012 that iOS and Android smartphones would become the dominant form of computing (recall Facebook’s “<a href="https://techcrunch.com/2012/10/19/facebook-mobile-first/">pivot to mobile</a>” in 2012). Some people (including me) believe we’ve already entered the “computer phase” of consumer IoT with voice assistants like Alexa, but it will probably take years before we understand the enduring mainstream appeal of these devices.</p>
]]></content:encoded>
</item>
<item>
<title>As [Edwin] Land ultimately recognized, the adoption of his [polarized headlight] system was fatally…</title>
<link>https://cdixon.org/2016/09/25/as-edwin-land-ultimately-recognized-the-adoption-of-his-polarized-headlight-system-was-fatally/</link>
<guid>https://cdixon.org/2016/09/25/as-edwin-land-ultimately-recognized-the-adoption-of-his-polarized-headlight-system-was-fatally/</guid>
<pubDate>Sun, 25 Sep 2016 00:00:00 GMT</pubDate>
<description>As [Edwin] Land ultimately recognized, the adoption of his [polarized headlight] system was fatally hampered by the fact that there was no competitive advantage for any car company in using ...</description>
<content:encoded><![CDATA[<p>As [Edwin] Land ultimately recognized, the adoption of his [polarized headlight] system was fatally hampered by the fact that there was no competitive advantage for any car company in using it first. Since all cars needed to incorporate the technology as simultaneously as possible, it was either going to be all, either voluntarily or as directed by the government, or none. No state or federal governmental agency ever stepped in to direct the adoption of the technology in the way that seat belts would be required decades later. Herbert Nichols, a journalist with the Christian Science Monitor who had followed the story, believed that the industry killed the idea even though the demonstrations clearly showed that the system worked. According to Nichols, the industry concluded that it “just didn’t need anything to sell automobiles. They realized they could sell all the automobiles they could make.” Thus, with no economic or competitive incentive, why bother with a system that clearly added costs and admittedly presented implementation issues? After more than two decades, Land reluctantly gave up the fight.</p>
<p><strong>But he learned one very important lesson. “I knew then that I would never go into a commercial field that put a barrier between us and the customer.” Rather than deal with other companies as intermediaries, he would market his innovative products directly to the public. He believed “that the role of industry is to sense a deep human need, then bring science and technology to bear on filling that need. Any market already existing is inherently boring and dull.” Land, like Steve Jobs many decades later, believed that his company should “give people products they do not even know they want.” Fortunately, he already had such a product in mind.</strong></p>
<p>— <em><a href="https://www.amazon.com/dp/B00OHRYYFO/">A Triumph of Genius: Edwin Land, Polaroid, and the Kodak Patent War</a></em></p>
]]></content:encoded>
</item>
<item>
<title>Eleven Reasons To Be Excited About The Future of Technology</title>
<link>https://cdixon.org/2016/08/18/eleven-reasons-to-be-excited-about-the-future-of-technology/</link>
<guid>https://cdixon.org/2016/08/18/eleven-reasons-to-be-excited-about-the-future-of-technology/</guid>
<pubDate>Thu, 18 Aug 2016 00:00:00 GMT</pubDate>
<description>“The strongest force propelling human progress has been the swift advance and wide diffusion of technology.” — The Economist In the year 1820, a person could expect to live less ...</description>
<content:encoded><![CDATA[<blockquote>
<p>“The strongest force propelling human progress has been the swift advance and wide diffusion of technology.” — <a href="http://www.economist.com/node/841842">The Economist</a></p>
</blockquote>
<p>In the year 1820, a person could <a href="https://ourworldindata.org/life-expectancy/">expect to live</a> less than 35 years, 94% of the global population <a href="https://ourworldindata.org/world-poverty/">lived in extreme poverty</a>, and less than 20% of the population was literate. Today, human life expectancy is over 70 years, less than 10% of the global population lives in extreme poverty, and <a href="http://www.oecd.org/statistics/How-was-life.pdf">over 80% of people</a> are literate. These improvements are due mainly to advances in technology, beginning in the industrial age and continuing today in the information age.</p>
<p>There are many exciting new technologies that will continue to transform the world and improve human welfare. Here are eleven of them.</p>
<h2>1. Self-Driving Cars</h2>
<p>Self-driving cars exist today that are safer than human-driven cars in most driving conditions. Over the next 3–5 years they’ll get even safer, and will begin to go mainstream.</p>
<p><img src="images/1_HfoJs9tCyyr6VeLvD45wyQ.gif" alt=""></p>
<p>The <a href="http://www.who.int/mediacentre/factsheets/fs358/en/">World Health Organization estimates</a> that 1.25 million people die from car-related injuries per year. Half of the deaths are pedestrians, bicyclists, and motorcyclists hit by cars. Cars are the leading cause of death for people ages 15–29 years old.</p>
<p><img src="images/1_SNGdeK4GNUhjL6wlh7sfJw.png" alt=""></p>
<p>Just as cars reshaped the world in the 20th century, so will self-driving cars in the 21st century. In most cities, <a href="http://oldurbanist.blogspot.com.es/2011/12/we-are-25-looking-at-street-area.html">between 20–30%</a> of usable space is taken up by parking spaces, and most cars are parked <a href="http://www.reinventingparking.org/2013/02/cars-are-parked-95-of-time-lets-check.html">about 95%</a> of the time. Self-driving cars will be in almost continuous use (most likely hailed from a smartphone app), thereby dramatically reducing the need for parking. Cars will communicate with one another to avoid accidents and traffic jams, and riders will be able to spend commuting time on other activities like work, education, and socializing.</p>
<p><img src="images/1_k6w2wkkREpVeu9_cS2xxtg.png" alt="Source: Tech Insider"></p>
<h2>2. Clean Energy</h2>
<p>Attempts to fight climate change by reducing the demand for energy <a href="https://en.wikipedia.org/wiki/World_energy_consumption">haven’t worked</a>. Fortunately, scientists, engineers, and entrepreneurs have been working hard on the supply side to make clean energy convenient and cost-effective.</p>
<p>Due to steady technological and manufacturing advances, the price of solar cells has <a href="http://www.saskwind.ca/wind-cost-decline/">dropped 99.5% since 1977</a>. Solar will soon be more cost-efficient than fossil fuels. The cost of wind energy has also dropped to an all-time low, and over the last decade wind has accounted for about a <a href="http://energy.gov/articles/top-10-things-you-didnt-know-about-wind-power">third of newly installed</a> US energy capacity.</p>
<p>Forward-thinking organizations are taking advantage of this. For example, in India there is an initiative to convert airports to self-sustaining clean energy.</p>
<p><img src="images/1_idAW1ONI_iIeevzPaUv-pg.png" alt="Airport in Kochi, India (source: Clean Technica)"></p>
<p>Tesla is making high-performance, affordable electric cars, and <a href="http://www.treehugger.com/cars/tesla-built-858-new-charging-stations-us-over-past-12-months.html">installing</a> electric charging stations <a href="http://mashable.com/2016/04/01/tesla-supercharger-expansion/#v93tzyDFl5qR">worldwide</a>.</p>
<p><img src="images/1_YwcTRiWETVn4aXiZhEJtcg.png" alt="Tesla Model 3 and US supercharger locations"></p>
<p>There are hopeful signs that clean energy could soon be reaching a tipping point. For example, in Japan, there are now more electric charging stations than gas stations.</p>
<p><img src="images/1_RNmY6abYWA2n2W6EgP3lcA.png" alt="Source: The Guardian"></p>
<p>And Germany generates so much renewable energy that it sometimes produces more than it can use.</p>
<p><img src="images/1_wETYiSDThJ5fQYIVWuw8aA.png" alt="Source: Time Magazine"></p>
<h2>3. Virtual and Augmented Reality</h2>
<p>Computer processors only recently became fast enough to power comfortable and convincing virtual and augmented reality experiences. Companies like Facebook, Google, Apple, and Microsoft are investing billions of dollars to make VR and AR more immersive, comfortable, and affordable.</p>
<p><img src="images/1_6cmd8P-bPYRU1olrJHsvfw.gif" alt="Toybox demo from Oculus"></p>
<p>People sometimes think VR and AR will be used only for gaming, but over time they will be used for all sorts of activities. For example, we’ll use them to manipulate 3-D objects:</p>
<p><img src="images/1_q_pqQCTcTETf4G-ARUm00A.jpeg" alt="Augmented reality computer interface (from Iron Man)"></p>
<p>To meet with friends and colleagues from around the world:</p>
<p><img src="images/1_MJcHcqCWEzGxDIVDGpcHcA.jpeg" alt="Augmented reality teleconference (from The Kingsman)"></p>
<p>And even for medical applications, like treating phobias or helping rehabilitate paralysis victims:</p>
<p><img src="images/1_q_J7Ql2iVfdDYc5t6hM98Q.png" alt="Source: New Scientist"></p>
<p>VR and AR have been dreamed about by science fiction fans for decades. In the next few years, they’ll finally become a mainstream reality.</p>
<h2>4. Drones and Flying Cars</h2>
<blockquote>
<p>“Roads? Where we’re going we don’t need… roads.” — Dr. Emmett Brown</p>
</blockquote>
<p>GPS started out as a military technology but is now used to hail taxis, get mapping directions, and hunt Pokémon. Likewise, drones started out as a military technology, but are increasingly being used for a wide range of consumer and commercial applications.</p>
<p>For example, drones are being used to inspect critical infrastructure like bridges and power lines, to survey areas struck by natural disasters, and for many other creative purposes, like fighting animal poaching.</p>
<p><img src="images/1_hLhAdWXECMyNLwrHfad6pA.png" alt="Source: NBC News"></p>
<p>Amazon and Google are building drones to deliver household items.</p>
<p><img src="images/1_s1eQciCtoaD_AaovzJouAA.gif" alt="Amazon delivery drone"></p>
<p>The startup <a href="http://flyzipline.com/product/">Zipline</a> uses drones to deliver medical supplies to remote villages that can’t be accessed by roads.</p>
<p><img src="images/1_BDepNtZOTWXNOi5F4Dk3Dg.png" alt="Source: The Verge"></p>
<p>There is also a new wave of startups working on flying cars (including <a href="http://www.bloomberg.com/news/articles/2016-06-09/welcome-to-larry-page-s-secret-flying-car-factories">two</a> funded by Google cofounder Larry Page).</p>
<p><img src="images/1_FJyVIp3MI_k7mVM5obpSsA.png" alt="The Terrafugia TF-X flying car (source)"></p>
<p>Flying cars rely on the same advanced technology as drones but are large enough to carry people. Due to advances in materials, batteries, and software, flying cars will be significantly more affordable and convenient than today’s planes and helicopters.</p>
<h2>5. Artificial Intelligence</h2>
<p><img src="images/1_I2dRn7D8ZZM7nI2IvvMFDw.jpeg" alt=""></p>
<blockquote>
<p>“It may be a hundred years before a computer beats humans at Go — maybe even longer.” — <a href="http://www.nytimes.com/1997/07/29/science/to-test-a-powerful-computer-play-an-ancient-game.html?pagewanted=all">New York Times, 1997</a></p>
<p>“Master of Go Board Game Is Walloped by Google Computer Program” — <a href="http://www.nytimes.com/2016/03/10/world/asia/google-alphago-lee-se-dol.html">New York Times, 2016</a></p>
</blockquote>
<p>Artificial intelligence has made rapid advances in the last decade, due to new algorithms and massive increases in data collection and computing power.</p>
<p>AI can be applied to almost any field. For example, in photography, an AI technique called artistic style transfer transforms photographs into the style of a given painter:</p>
<p><img src="images/1_aHFJuj-jhnP4zHY1dD7tRA.png" alt="Source"></p>
<p>Google built an AI system that controls its datacenter power systems, saving hundreds of millions of dollars in energy costs.</p>
<p><img src="images/1_HpTNGOsV1a0PpqjQZNXKEQ.png" alt="Source: Bloomberg"></p>
<p>The broad promise of AI is to liberate people from repetitive mental tasks the same way the industrial revolution liberated people from repetitive physical tasks.</p>
<blockquote>
<p>“If AI can help humans become better chess players, it stands to reason that it can help us become better pilots, better doctors, better judges, better teachers.” — <a href="http://www.wired.com/2014/10/future-of-artificial-intelligence/">Kevin Kelly</a></p>
</blockquote>
<p>Some people worry that AI will destroy jobs. History has shown that while new technology does indeed eliminate jobs, it also creates new and better jobs to replace them. For example, with the advent of the personal computer, the number of typographer jobs dropped, but the increase in graphic designer jobs more than made up for it.</p>
<p><img src="images/1_c_lt2s5TuSoOfmPb_Rv46w.png" alt="Source: Harvard Business Review"></p>
<p>It is much easier to imagine jobs that will go away than new jobs that will be created. Today millions of people work as app developers, ride-sharing drivers, drone operators, and social media marketers — jobs that didn’t exist and would have been difficult to even imagine ten years ago.</p>
<h2>6. Pocket Supercomputers for Everyone</h2>
<p><img src="images/1_5tt6F_Cxnf5n7J5v6Lx0Ug.png" alt=""></p>
<p>By 2020, 80% of adults on earth will have an internet-connected smartphone. An iPhone 6 has about 2 billion transistors, roughly 625 times more transistors than a 1995 Intel Pentium computer. Today’s smartphones are what used to be considered supercomputers.</p>
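<p>The arithmetic behind that comparison is simple (the original Pentium shipped with roughly 3.2 million transistors; the exact figure varies by model and is used here for illustration):</p>
<pre><code>iphone6_transistors = 2_000_000_000  # about 2 billion
pentium_transistors = 3_200_000      # Intel Pentium, mid-1990s
print(iphone6_transistors / pentium_transistors)  # 625.0
</code></pre>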
<p><img src="images/1_vovBLv3ePKce3dPrU3q9Lg.png" alt="Visitors to the pope (source: Business Insider)"></p>
<p>Internet-connected smartphones give ordinary people abilities that, just a short time ago, were only available to an elite few:</p>
<blockquote>
<p>“Right now, a Masai warrior on a mobile phone in the middle of Kenya has better mobile communications than the president did 25 years ago. If he’s on a smart phone using Google, he has access to more information than the U.S. president did just 15 years ago.” — <a href="http://edition.cnn.com/2012/05/06/opinion/diamandis-abundance-innovation/">Peter Diamandis</a></p>
</blockquote>
<h2>7. Cryptocurrencies and Blockchains</h2>
<blockquote>
<p>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.” — <a href="http://farmerandfarmer.org/mastery/builder.html">Farmer & Farmer</a></p>
</blockquote>
<p>Protocols are the plumbing of the internet. Most of the protocols we use today were developed decades ago by academia and government. Since then, protocol development mostly stopped as energy shifted to developing proprietary systems like social networks and messaging apps.</p>
<p>Cryptocurrency and blockchain technologies are <a href="http://avc.com/2016/07/the-golden-age-of-open-protocols/">changing this</a> by providing a new business model for internet protocols. This year alone, <a href="https://medium.com/the-coinbase-blog/app-coins-and-the-dawn-of-the-decentralized-business-model-8b8c951e734f#.2atvp1cxd">hundreds of millions of dollars</a> were raised for a broad range of innovative blockchain-based protocols.</p>
<p>Protocols based on blockchains also have capabilities that previous protocols didn’t. For example, <a href="https://en.wikipedia.org/wiki/Ethereum">Ethereum</a> is a new blockchain-based protocol that can be used to create smart contracts and trusted databases that are immune to corruption and censorship.</p>
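<p>The core trick that makes blockchain databases tamper-evident fits in a few lines of Python. The toy hash chain below is a drastic simplification of a real blockchain (no consensus, no network), but it shows why edits to history are detectable: each block commits to the hash of the block before it.</p>
<pre><code>import hashlib, json

def block_hash(block):
    # Hash a canonical serialization of the block's contents.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def verify(chain):
    # Recompute every link; editing any earlier block breaks a later link.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, "alice pays bob 5")
add_block(chain, "bob pays carol 2")
print(verify(chain))                     # True
chain[0]["data"] = "alice pays bob 500"  # tamper with history
print(verify(chain))                     # False
</code></pre>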
<h2>8. High-Quality Online Education</h2>
<p>While college tuition <a href="http://www.cnbc.com/2015/06/16/why-college-costs-are-so-high-and-rising.html">skyrockets</a>, anyone with a smartphone can study almost any topic online, accessing educational content that is mostly free and increasingly high-quality.</p>
<p>Encyclopedia Britannica <a href="http://www.csmonitor.com/Business/Latest-News-Wires/2012/0314/Encyclopaedia-Britannica-After-244-years-in-print-only-digital-copies-sold">used to cost $1,400</a>. Now anyone with a smartphone can instantly access Wikipedia. You used to have to go to school or buy programming books to learn computer programming. Now you can learn from a community of over 40 million programmers at <a href="http://stackoverflow.com">Stack Overflow</a>. YouTube has millions of hours of free tutorials and lectures, many of which are produced by top professors and universities.</p>
<p><img src="images/1_NZTqnqYbOPv6sf7gCVLz8g.png" alt="UC Berkeley Physics on Youtube"></p>
<p>The quality of online education is getting better all the time. For the last 15 years, <a href="http://ocw.mit.edu/index.htm">MIT has been recording lectures</a> and compiling materials that cover over 2000 courses.</p>
<blockquote>
<p>“The idea is simple: to publish all of our course materials online and make them widely available to everyone.” — Dick K.P. Yue, Professor, MIT School of Engineering</p>
</blockquote>
<p>As perhaps the greatest research university in the world, MIT has always been ahead of the trends. Over the next decade, expect many other schools to follow MIT’s lead.</p>
<p><img src="images/1_W-i0QTotXS-K4MU9qbpylQ.png" alt="Source: Futurism"></p>
<h2>9. Better Food through Science</h2>
<p><img src="images/1_O5VQyJRhI2-sHYzZPrHSBQ.png" alt="Source: National Geographic"></p>
<p>Earth is running out of farmable land and fresh water. This is partly because our food production systems are incredibly inefficient. It takes an astounding 1,799 gallons of water to produce 1 pound of beef.</p>
<p>Fortunately, a variety of new technologies are being developed to improve our food system.</p>
<p>For example, entrepreneurs are developing new food products that are tasty and nutritious substitutes for traditional foods but far more environmentally friendly. The startup <a href="http://www.impossiblefoods.com/">Impossible Foods</a> invented meat products that look and taste like the real thing but are actually made of plants.</p>
<p><img src="images/1_bUV4b3Xp0mvvdA8dp1hMtA.png" alt="Impossible Food’s plant-based burger (source: Tech Insider)"></p>
<p>Their burger <a href="http://www.impossiblefoods.com/our-burger">uses</a> 95% less land and 74% less water, and produces 87% fewer greenhouse gas emissions than traditional burgers. Other startups are creating plant-based replacements for <a href="http://ripplefoods.com/">milk</a>, <a href="https://www.hamptoncreek.com/">eggs</a>, and other common foods. <a href="http://soylent.com/">Soylent</a> is a healthy, inexpensive meal replacement that uses advanced engineered <a href="http://terravia.com/Terravia_Sustainability.pdf">ingredients</a> that are much friendlier to the environment than traditional ingredients.</p>
<p>Some of these products are developed using genetic modification, a powerful scientific technique that has been widely mischaracterized as dangerous. According to a <a href="https://www.geneticliteracyproject.org/2015/01/29/pewaaas-study-scientific-consensus-on-gmo-safety-stronger-than-for-global-warming/">study</a> by the Pew Research Center, 88% of scientists think genetically modified foods are safe.</p>
<p>Another exciting development in food production is automated indoor farming. Due to advances in solar energy, sensors, lighting, robotics, and artificial intelligence, indoor farms have become viable alternatives to traditional outdoor farms.</p>
<p><img src="images/1_0Jyjlgj1KU2yfBqo7quCLQ.png" alt="Aerofarms indoor farm (Source: New York Times)"></p>
<p>Compared to traditional farms, automated indoor farms use roughly one-tenth the water and land. Crops can be harvested many more times per year, there is no dependence on the weather, and no pesticides are needed.</p>
<h2>10. Computerized Medicine</h2>
<p>Until recently, computers were only at the periphery of medicine, used primarily for research and record keeping. Today, the combination of computer science and medicine is leading to a variety of breakthroughs.</p>
<p><img src="images/1_IjKrWZdlbB2ksis_Dmia5A.png" alt=""></p>
<p>For example, just fifteen years ago, it cost $3B to sequence a human genome. Today, the cost is about a thousand dollars and continues to drop. Genetic sequencing will soon be a routine part of medicine.</p>
<p>Genetic sequencing generates massive amounts of data that can be analyzed using powerful data analysis software. One application is analyzing <a href="http://a16z.com/2016/06/09/freenome/">blood samples</a> for early detection of cancer. Further genetic analysis can help determine the <a href="http://www.businessinsider.com/super-cheap-genome-sequencing-by-2020-2014-10">best course</a> of treatment.</p>
<p>Another application of computers to medicine is in prosthetic limbs. Here a young girl is using prosthetic hands controlled by her upper-arm muscles:</p>
<p><img src="images/1_jVH1wxchOJ5qJzT46s907A.gif" alt="Source: Open Bionics"></p>
<p>Soon we’ll have the technology to control prosthetic limbs with just our thoughts using <a href="http://news.uci.edu/feature/to-walk-again/">brain-to-machine interfaces</a>.</p>
<p>Computers are also becoming increasingly effective at diagnosing diseases. An artificial intelligence system recently diagnosed a rare disease that human doctors failed to diagnose by finding hidden patterns in 20 million cancer records.</p>
<p><img src="images/1_OEgWlj9sp2mCV0PrT9yp8A.png" alt="Source: International Business Times"></p>
<h2>11. A New Space Age</h2>
<p>Since the beginning of the space age in the 1950s, the vast majority of space funding has come from governments. But that funding has been in decline: for example, NASA’s budget <a href="https://en.wikipedia.org/wiki/Budget_of_NASA">dropped</a> from about 4.5% of the federal budget in the 1960s to about 0.5% of the federal budget today.</p>
<p><img src="images/1_paniidrx59zPQjq_q6rUHA.png" alt="Source: Fortune"></p>
<p>The good news is that private space companies have started filling the void. These companies provide a wide range of products and services, including rocket launches, scientific research, communications and imaging satellites, and emerging speculative business models like asteroid mining.</p>
<p>The most famous private space company is Elon Musk’s SpaceX, which has successfully launched rockets into space and landed them back on Earth to be reused.</p>
<p><img src="images/1_5iiaQsTBu1tQ_hTy8fupXg.gif" alt="SpaceX Falcon 9 landing"></p>
<p>Perhaps the most intriguing private space company is <a href="http://www.planetaryresources.com/">Planetary Resources</a>, which is trying to pioneer a new industry: mining minerals from asteroids.</p>
<p><img src="images/1_6zvea6z14lJ6inZQsVBsBA.png" alt="Asteroid mining"></p>
<p>If successful, asteroid mining could lead to a new gold rush in outer space. Like previous gold rushes, this could lead to speculative excess, but also dramatically increased funding for new technologies and infrastructure.</p>
<hr>
<p>These are just a few of the amazing technologies we’ll see developed in the coming decades. 2016 is just the beginning of a new age of wonders. As futurist Kevin Kelly <a href="https://www.linkedin.com/pulse/internet-still-beginning-its-kevin-kelly">says</a>:</p>
<blockquote>
<p>If we could climb into a time machine, journey 30 years into the future, and from that vantage look back to today, we’d realize that most of the greatest products running the lives of citizens in 2050 were not invented until after 2016. People in the future will look at their holodecks and wearable virtual reality contact lenses and downloadable avatars and AI interfaces and say, “Oh, you didn’t really have the internet” — or whatever they’ll call it — “back then.”</p>
<p>So, the truth: Right now, today, in 2016 is the best time to start up. There has never been a better day in the whole history of the world to invent something. There has never been a better time with more opportunities, more openings, lower barriers, higher benefit/risk ratios, better returns, greater upside than now. Right now, this minute. This is the moment that folks in the future will look back at and say, “Oh, to have been alive and well back then!”</p>
</blockquote>
]]></content:encoded>
</item>
<item>
<title>“Steve Jobs supposedly said, returning to Apple, that his plan was to stay alive and grab onto the…</title>
<link>https://cdixon.org/2016/08/07/steve-jobs-supposedly-said-returning-to-apple-that-his-plan-was-to-stay-alive-and-grab-onto-the/</link>
<guid>https://cdixon.org/2016/08/07/steve-jobs-supposedly-said-returning-to-apple-that-his-plan-was-to-stay-alive-and-grab-onto-the/</guid>
<pubDate>Sun, 07 Aug 2016 00:00:00 GMT</pubDate>
<description>“Steve Jobs supposedly said, returning to Apple, that his plan was to stay alive and grab onto the next big thing — to listen for the footsteps. He tried video, ...</description>
<content:encoded><![CDATA[<p>“Steve Jobs supposedly said, returning to Apple, that his plan was to stay alive and grab onto the next big thing — to listen for the footsteps. He tried video, and a few other things, but he got there in the end. But he might not have.”</p>
<p>From: <a href="http://ben-evans.com/benedictevans/2016/5/2/inevitability-in-technology">Inevitability in technology</a></p>
]]></content:encoded>
</item>
<item>
<title>“Ether is a necessary element — a fuel — for operating the distributed application platform…</title>
<link>https://cdixon.org/2016/08/07/source-ethereum-org/</link>
<guid>https://cdixon.org/2016/08/07/source-ethereum-org/</guid>
<pubDate>Sun, 07 Aug 2016 00:00:00 GMT</pubDate>
<description>“Ether is a necessary element — a fuel — for operating the distributed application platform Ethereum. It is a form of payment made by the clients of the platform to ...</description>
<content:encoded><![CDATA[<p>“Ether is a necessary element — a fuel — for operating the distributed application platform Ethereum. It is a form of payment made by the clients of the platform to the machines executing the requested operations. To put it another way, ether is the incentive ensuring that developers write quality applications (wasteful code costs more), and that the network remains healthy (people are compensated for their contributed resources).</p>
<p>Ether is to be treated as “crypto-fuel”, a token whose purpose is to pay for computation, and is not intended to be used as or considered a currency, asset, share or anything else.”</p>
<p><em>Source: <a href="https://ethereum.org/ether">ethereum.org</a></em></p>
]]></content:encoded>
</item>
<item>
<title>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they…</title>
<link>https://cdixon.org/2016/07/30/if-you-asked-people-in-1989-what-they-needed-to-make-their-life-better-it-was-unlikely-that-they/</link>
<guid>https://cdixon.org/2016/07/30/if-you-asked-people-in-1989-what-they-needed-to-make-their-life-better-it-was-unlikely-that-they/</guid>
<pubDate>Sat, 30 Jul 2016 00:00:00 GMT</pubDate>
<description>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are ...</description>
<content:encoded><![CDATA[<p>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.”</p>
<p>— <a href="http://farmerandfarmer.org/mastery/builder.html">Farmer & Farmer
</a></p>
]]></content:encoded>
</item>
<item>
<title>“The typical path of how people respond to life-changing inventions</title>
<link>https://cdixon.org/2016/05/11/the-typical-path-of-how-people-respond-to-life-changing-inventions/</link>
<guid>https://cdixon.org/2016/05/11/the-typical-path-of-how-people-respond-to-life-changing-inventions/</guid>
<pubDate>Wed, 11 May 2016 00:00:00 GMT</pubDate>
<description>I’ve never heard of it. I’ve heard of it but don’t understand it. I understand it, but I don’t see how it’s useful. I see how it could be fun ...</description>
<content:encoded><![CDATA[<ol>
<li>
<p>I’ve never heard of it.</p>
</li>
<li>
<p>I’ve heard of it but don’t understand it.</p>
</li>
<li>
<p>I understand it, but I don’t see how it’s useful.</p>
</li>
<li>
<p>I see how it could be fun for rich people, but not me.</p>
</li>
<li>
<p>I use it, but it’s just a toy.</p>
</li>
<li>
<p>It’s becoming more useful for me.</p>
</li>
<li>
<p>I use it all the time.</p>
</li>
<li>
<p>I could not imagine life without it.</p>
</li>
<li>
<p>Seriously, people lived without it?</p>
</li>
<li>
<p>It’s too powerful and needs to be regulated.</p>
</li>
</ol>
<p><em>Credits:</em></p>
<p><em>#1–#9 by <a href="http://time.com/author/morgan-housel-the-motley-fool/">Morgan Housel</a>, <a href="http://time.com/money/3940273/innovation-isnt-dead/">Time</a></em></p>
<p><em>#10 by <a href="https://twitter.com/peterpeirce/status/616664561068994560?lang=en">@peterpeirce</a></em></p>
]]></content:encoded>
</item>
<item>
<title>Comma.ai</title>
<link>https://cdixon.org/2016/04/02/comma-ai/</link>
<guid>https://cdixon.org/2016/04/02/comma-ai/</guid>
<pubDate>Sat, 02 Apr 2016 00:00:00 GMT</pubDate>
<description>I wrote a blog post last month highlighting some of the exciting trends in the computing industry. One trend I discussed is the rapid progress in a branch of artificial ...</description>
<content:encoded><![CDATA[<p>I wrote a blog post last month highlighting some of the exciting trends in the computing industry. One trend I discussed is the rapid progress in a branch of artificial intelligence called deep learning. Big tech companies are making significant investments in deep learning, but there are also opportunities for startups:</p>
<blockquote>
<p>Many of the papers, <a href="https://code.google.com/archive/p/word2vec/">data</a> <a href="http://image-net.org/download-images">sets</a>, and <a href="https://www.tensorflow.org/">software</a> <a href="http://deeplearning.net/software/theano/">tools</a> related to deep learning have been open sourced. This has had a democratizing effect, allowing individuals and small organizations to build powerful applications. WhatsApp was able to build a global messaging system that <a href="http://www.wired.com/2015/09/whatsapp-serves-900-million-users-50-engineers/">served 900M users with just 50 engineers</a>, compared to the thousands of engineers that were needed for prior generations of messaging systems. This “<a href="https://twitter.com/cdixon/status/473221599189954562">WhatsApp effect</a>” is now happening in AI. Software tools like <a href="http://deeplearning.net/software/theano/">Theano</a> and <a href="https://www.tensorflow.org/">TensorFlow</a>, combined with cloud data centers for training, and inexpensive GPUs for deployment, allow small teams of engineers to build state-of-the-art AI systems.</p>
</blockquote>
<p>You might have seen <a href="http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/">recent press</a> coverage of a software developer named George Hotz who built his own self-driving car.</p>
<p><img src="images/1U00Hr0kDEBcGUf87W4iPcQ.png" alt=""></p>
<p>I first met George a few months ago, and, like a lot of people who had seen the press coverage, I was skeptical. How could someone build such an advanced system all by himself? After spending time with George, my skepticism turned into enthusiasm. I tested his car, and, along with some of my colleagues and friends with AI expertise, dug into the details of the deep learning system he’d developed.</p>
<p><img src="images/1xJP7l8qL4IbNyJnwHYNwdA.gif" alt="Comma’s self-driving car"></p>
<p>I came away convinced that George’s system is a textbook example of the “WhatsApp effect” happening to AI.</p>
<p><img src="images/1d9qMneOOvDP2WHCxgakQkw.png" alt="George with test car #1"></p>
<p>George is certainly brilliant (he’s a <a href="https://en.wikipedia.org/wiki/George_Hotz">famous hacker</a> for a reason), and he’s no longer alone: he’s now working with a small team of machine learning experts. But he’s also riding a wave of exponentially improving hardware, software, and, most importantly, data. The more his system gets used, the more data it collects, and the smarter it becomes.</p>
<p>Today we are announcing that <a href="http://a16z.com/">a16z</a> is leading a $3.1M investment in George’s company, <a href="http://comma.ai/">Comma.ai</a>. This investment will help them continue to build their team (they’re <a href="http://comma.ai/hiring.html">hiring</a>), and bring their technology to market. Expect more announcements from Comma in the next few months. We are very excited to support George and his team on this ambitious project.</p>
]]></content:encoded>
</item>
<item>
<title>The Internet Economy</title>
<link>https://cdixon.org/2016/03/13/the-internet-economy/</link>
<guid>https://cdixon.org/2016/03/13/the-internet-economy/</guid>
<pubDate>Sun, 13 Mar 2016 00:00:00 GMT</pubDate>
<description>We are living in an era of bundling. The big five consumer tech companies — Google, Apple, Facebook, Amazon, and Microsoft — have moved far beyond their original product lines ...</description>
<content:encoded><![CDATA[<p>We are living in an era of bundling. The big five consumer tech companies — Google, Apple, Facebook, Amazon, and Microsoft — have moved far beyond their original product lines into all sorts of hardware, software, and services that overlap and compete with one another. But their revenues and profits still depend heavily on external technologies that are outside of their control. One way to visualize these external dependencies is to consider the path of a typical internet session, from the user to some revenue-generating action, and then (in some cases) back again to the user:</p>
<p><img src="images/1bUnzLePRb7E25uoUEMYQgA.png" alt=""></p>
<p>When evaluating an internet company’s strategic position (the defensibility of its profit <a href="http://www.investopedia.com/terms/e/economicmoat.asp">moat</a>), you need to consider 1) how the company generates revenue and profits, and 2) the loop in its entirety, not just the layers in which the company has products.</p>
<p>For example, it might seem counterintuitive that Amazon is a <a href="/2010/05/22/while-google-fights-on-the-edges-amazon-is-attacking-their-core/">major threat</a> to Google’s core search business. But you can see this by following the money through the loop: a <a href="http://www.wordstream.com/articles/google-earnings">significant portion</a> of Google’s revenue comes from search queries for things that can be bought on Amazon, and the buying experience on Amazon (from initial purchasing intent to consumption/unboxing) is significantly better than the buying experience on most non-Amazon e-commerce sites you find via Google searches. After a while, shoppers learn to skip Google and go straight to Amazon.</p>
<p>Think of the internet economic loop as a model train track. Positions in front of you can redirect traffic around you. Positions after you can build new tracks that bypass you. New technologies come along (which often look <a href="/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy/">toy-like</a> and unthreatening at first) that create entirely new tracks that render the previous tracks obsolete.</p>
<p>There are interesting developments happening at each layer of the loop (and there are many smaller, offshoot loops not depicted in the chart above), but at any given time certain layers are industry flash points. The most prominent recent battle was between mobile devices and operating systems. That battle seems to be over, with Android software and iOS devices having won. Possible future flash points include:</p>
<p><strong>The automation of logistics.</strong> Today’s logistics network is a patchwork of ships, planes, trucks, warehouses, and people. Tomorrow’s network will include significantly more automation, from robotic warehouses to autonomous cars, trucks, drones, and <a href="http://fortune.com/2016/04/06/dispatch-carry-delivery-robot/">delivery bots</a>. This transition will happen in stages, depending on the economics of specific goods and customers, along with geographic and regulatory factors. Amazon of course has a huge advantage in logistics. Google has tried repeatedly to get into logistics with <a href="http://recode.net/2015/08/19/google-express-plans-to-shut-down-its-two-delivery-hubs/">little success</a>. On-demand ride-sharing and delivery startups could play an interesting role here. The logistics layer is critical for e-commerce, which in turn is critical for monetizing search. Amazon’s dominance in logistics gives it a very strong strategic moat as e-commerce continues to take market share from traditional retail.</p>
<p><strong>Web vs apps</strong>. The mobile web <a href="/2014/04/07/the-decline-of-the-mobile-web/">is</a> <a href="http://daringfireball.net/2014/04/rethinking_what_we_mean_by_mobile_web">arguably</a> in decline: users are spending more time on mobile devices, and more time in apps instead of web browsers. Apple has joined the app side of this battle (e.g. allowing ad blockers in Safari, encouraging app install <a href="https://developer.apple.com/library/ios/documentation/AppleApplications/Reference/SafariWebContent/PromotingAppswithAppBanners/PromotingAppswithAppBanners.html">smart banners</a> above websites). Facebook has also taken the app side (e.g. encouraging publishers to use <a href="https://instantarticles.fb.com/">Instant Articles</a> instead of web views). Google of course needs a vibrant web for its search engine to remain useful, so has joined the web side of the battle (e.g. <a href="http://techcrunch.com/2015/09/01/death-to-app-install-interstitials/">punishing websites</a> that have interstitial app ads, developing <a href="https://www.ampproject.org/">technologies</a> that reduce website loading times). The realistic danger isn’t that the web disappears, but that it gets marginalized, and that the bulk of monetizable internet activities happen in apps or other interfaces like voice or messaging bots. This shift could have a significant effect on web publishers who rely on older business models like non-native ads, and could make it harder for small startups to grow beyond niche use cases.</p>
<p><strong>Video: from TV to mobile devices.</strong> Internet companies are betting that video consumption will continue to shift from TV to mobile devices. The hope is that this will not only create compelling user experiences, but also unlock access to the tens of billions of ad dollars that are currently spent on TV.</p>
<blockquote>
<p>“I think video is a mega trend, almost as big as mobile.” — <a href="https://twitter.com/cdixon/status/706198805922902018">Mark Zuckerberg</a></p>
</blockquote>
<p>Last decade, the internet won the market for ads that harvest purchasing intent (ads that used to appear in newspapers and yellow pages), with most of the winnings going to Google. The question for the next decade is who will win the market for ads that generate purchasing intent (so far the winner is Facebook, followed by Google). Most likely this will depend on who controls the user flow to video advertising. Today, the biggest video platforms are Facebook and YouTube, but expect video to get embedded into almost every internet service, similar to how the internet transitioned from text-heavy to image-heavy services last decade.</p>
<p><strong>Voice: baking search into the OS.</strong> Voice bots like Siri, Google Now, and Alexa embed search-like capabilities directly into the operating system. Today, the quality of voice interfaces isn’t good enough to replace visual computing interfaces for most activities. However, artificial intelligence is <a href="https://medium.com/software-is-eating-the-world/what-s-next-in-computing-e54b870b80cc#.kyn1qnbvj">improving</a> rapidly. Voice bots should be able to handle much more nuanced and interactive conversations in the near future.</p>
<p>Amazon’s <a href="https://developer.amazon.com/public/solutions/alexa/alexa-voice-service">vision</a> here is the most ambitious: to embed voice services in every possible device, thereby reducing the importance of the device, OS, and application layers (it’s no coincidence that those are also the layers in which Amazon is the weakest). But all the big tech companies are investing heavily in voice and AI. As Google CEO Sundar Pichai recently <a href="https://googleblog.blogspot.com/2016/04/this-years-founders-letter.html">said</a>:</p>
<blockquote>
<p>The next big step will be for the very concept of the “device” to fade away. Over time, the computer itself — whatever its form factor — will be an intelligent assistant helping you through your day. We will move from mobile first to an AI first world.</p>
</blockquote>
<p>This would mean that AI interfaces — which in most cases will mean voice interfaces — could become the master routers of the internet economic loop, rendering many of the other layers interchangeable or irrelevant. Voice is mostly a novelty today, but in technology the <a href="/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy/">next big thing</a> often starts out looking that way.</p>
]]></content:encoded>
</item>
<item>
<title>What’s Next in Computing?</title>
<link>https://cdixon.org/2016/02/21/what-s-next-in-computing/</link>
<guid>https://cdixon.org/2016/02/21/what-s-next-in-computing/</guid>
<pubDate>Sun, 21 Feb 2016 00:00:00 GMT</pubDate>
<description>The computing industry progresses in two mostly independent cycles: financial and product cycles. There has been a lot of handwringing lately about where we are in the financial cycle. Financial ...</description>
<content:encoded><![CDATA[<p>The computing industry progresses in two mostly independent cycles: financial and product cycles. There has been a lot of handwringing lately about where we are in the financial cycle. Financial markets get a lot of attention. They tend to fluctuate unpredictably and sometimes wildly. The product cycle by comparison gets relatively little attention, even though it is what actually drives the computing industry forward. We can try to understand and predict the product cycle by studying the past and extrapolating into the future.</p>
<p><img src="images/1_Gzmn-yCmeOGEVPrrq9esMA.png" alt="New computing eras have occurred every 10–15 years"></p>
<p>Tech product cycles are mutually reinforcing interactions between platforms and applications. New platforms enable new applications, which in turn make the new platforms more valuable, creating a positive feedback loop. Smaller, offshoot tech cycles happen all the time, but every once in a while — historically, about every 10 to 15 years — major new cycles begin that completely reshape the computing landscape.</p>
<p><img src="images/1_oOZjdUvjYRlrFtYUKLIMGg.png" alt="Financial and product cycles evolve mostly independently"></p>
<p>The PC enabled entrepreneurs to create word processors, spreadsheets, and many other desktop applications. The internet enabled search engines, e-commerce, e-mail and messaging, social networking, SaaS business applications, and many other services. Smartphones enabled mobile messaging, mobile social networking, and on-demand services like ride sharing. Today, we are in the middle of the mobile era. It is likely that many more mobile innovations are still to come.</p>
<p>Each product era can be divided into two phases: 1) <em>the gestation phase</em>, when the new platform is first introduced but is expensive, incomplete, and/or difficult to use, and 2) <em>the growth phase</em>, when a new product comes along that solves those problems, kicking off a period of exponential growth.</p>
<p>The Apple II was released in 1977 (and the Altair in 1975), but it was the release of the IBM PC in 1981 that kicked off the PC growth phase.</p>
<p><img src="images/1_vfatwon6YWQGRvYad2ggqw.png" alt="PC sales per year (thousands), source: http://jeremyreimer.com/m-item.lsp?i=137"></p>
<p>The internet’s gestation phase took place in the <a href="https://en.wikipedia.org/wiki/National_Science_Foundation_Network">80s and early 90s</a> when it was mostly a text-based tool used by academia and government. The release of the Mosaic web browser in 1993 started the growth phase, which has continued ever since.</p>
<p><img src="images/1_6jgrfjHpBKlObla1x0NYtg.png" alt="Worldwide internet users, source: http://churchm.ag/numbers-internet-use/"></p>
<p>There were feature phones in the 90s and early smartphones like the Sidekick and Blackberry in the early 2000s, but the smartphone growth phase really started in 2007–8 with the release of the iPhone and then Android. Smartphone adoption has since exploded: about 2B people have smartphones today. By 2020, <a href="http://ben-evans.com/benedictevans/2014/10/28/presentation-mobile-is-eating-the-world">80% of the global population</a> will have one.</p>
<p><img src="images/1_8o0-IQSyDQ0KRxSVV2njdA.png" alt="Worldwide smartphone sales per year (millions)"></p>
<p>If the 10–15 year pattern repeats itself, the next computing era should enter its growth phase in the next few years. In that scenario, we should already be in the gestation phase. There are a number of important trends in both hardware and software that give us a glimpse into what the next era of computing might be. Here I talk about those trends and then make some suggestions about what the future might look like.</p>
<h2>Hardware: small, cheap, and ubiquitous</h2>
<p>In the mainframe era, only large organizations could afford a computer. Minicomputers were affordable for smaller organizations, PCs for homes and offices, and smartphones for individuals.</p>
<p><img src="images/1_gZQE6-shm1dqgJAbmNn6ww.png" alt="Computers are getting steadily smaller, source: http://www.nature.com/news/the-chips-are-down-for-moore-s-law-1.19338"></p>
<p>We are now entering an era in which processors and sensors are getting so small and cheap that there will be many more computers than there are people.</p>
<p>There are two reasons for this. One is the steady progress of the semiconductor industry over the past 50 years (<a href="https://en.wikipedia.org/wiki/Moore%27s_law">Moore’s law</a>). The second is what Chris Anderson <a href="http://foreignpolicy.com/2013/04/29/epiphanies-from-chris-anderson/">calls</a> “the peace dividend of the smartphone war”: the runaway success of smartphones led to massive investments in processors and sensors. If you disassemble a modern drone, VR headset, or IoT device, you’ll find mostly smartphone components.</p>
<p>In the modern semiconductor era, the focus has shifted from standalone CPUs to <a href="https://medium.com/@magicsilicon/how-the-soc-is-displacing-the-cpu-49bc7503edab#.h6wfmbk8n">bundles</a> of specialized chips known as systems-on-a-chip.</p>
<p><img src="images/1_SwUUpb2cjLIPFa3-8U9LzQ.png" alt="Computer prices have been steadily dropping, souce: https://medium.com/@magicsilicon/computing-transitions-22c07b9c457a#.j4cm9m6qu%5C"></p>
<p>Typical systems-on-a-chip bundle energy-efficient ARM CPUs plus specialized chips for graphics processing, communications, power management, video processing, and more.</p>
<p><img src="images/1_Wz-CMXmQFd64yFKWFfHefQ.jpeg" alt="Raspberry Pi Zero: 1 GHz Linux computer for $5"></p>
<p>This new architecture has dropped the price of basic computing systems from about $100 to about $10. The <a href="https://www.raspberrypi.org/blog/raspberry-pi-zero/">Raspberry Pi Zero</a> is a 1 GHz Linux computer that you can buy for $5. For a similar price you can buy a <a href="http://makezine.com/2015/04/01/esp8266-5-microcontroller-wi-fi-now-arduino-compatible/">wifi-enabled microcontroller</a> that runs a version of Python. Soon these chips will cost less than a dollar. It will be cost-effective to embed a computer in almost anything.</p>
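<p>To give a feel for how accessible these devices are, here is a minimal MicroPython sketch that blinks an LED (assuming an ESP8266-class board like the wifi microcontroller linked above; the LED pin number varies by board):</p>
<pre><code>from machine import Pin
import time

led = Pin(2, Pin.OUT)  # pin 2 drives the built-in LED on many boards

while True:
    led.value(not led.value())  # toggle the LED
    time.sleep(0.5)
</code></pre>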
<p>Meanwhile, there are still impressive performance improvements happening in high-end processors. Of particular importance are GPUs (graphics processors), the best of which are made by Nvidia. GPUs are useful not only for traditional graphics processing, but also for machine learning algorithms and virtual/augmented reality devices. Nvidia’s <a href="http://www.extremetech.com/gaming/201417-nvidias-2016-roadmap-shows-huge-performance-gains-from-upcoming-pascal-architecture">roadmap</a> promises significant performance improvements in the coming years.</p>
<p><img src="images/1_jSQ-qKGSVgW4rSwA0dk9ZQ.png" alt="Google’s quantum computer, source: https://www.technologyreview.com/s/544421/googles-quantum-dream-machine/"></p>
<p>A wildcard technology is quantum computing, which today exists mostly in laboratories but, if made commercially viable, could lead to orders-of-magnitude performance improvements for certain classes of algorithms in fields like biology and artificial intelligence.</p>
<h2>Software: the golden age of AI</h2>
<p>There are many exciting things happening in software today. Distributed systems are one good example. As the number of devices has grown exponentially, it has become increasingly important to 1) parallelize tasks across multiple machines, and 2) communicate and coordinate among devices. Interesting distributed systems technologies include <a href="http://hadoop.apache.org/">Hadoop</a> and <a href="https://amplab.cs.berkeley.edu/projects/spark-lightning-fast-cluster-computing/">Spark</a> for parallelizing big data problems, and Bitcoin/blockchain for securing data and assets.</p>
<p>But perhaps the most exciting software breakthroughs are happening in artificial intelligence (AI). AI has a long history of hype and disappointment. Alan Turing himself <a href="http://loebner.net/Prizef/TuringArticle.html">predicted</a> that machines would be able to successfully imitate humans by the year 2000. However, there are good reasons to think that AI might now finally be entering a golden age.</p>
<blockquote>
<p>“Machine learning is a core, transformative way by which we’re rethinking everything we’re doing.” — Google CEO, Sundar Pichai</p>
</blockquote>
<p>A lot of the excitement in AI has focused on deep learning, a machine learning technique that was <a href="http://www.nytimes.com/2012/06/26/technology/in-a-big-network-of-computers-evidence-of-machine-learning.html?pagewanted=all">popularized</a> by a now-famous 2012 Google project that used a giant cluster of computers to learn to identify cats in YouTube videos. Deep learning is a descendant of neural networks, a technology that <a href="https://en.wikipedia.org/wiki/Artificial_neural_network#History">dates back</a> to the 1940s. It was brought back to life by a <a href="http://www.wired.com/2014/10/future-of-artificial-intelligence/">combination</a> of factors, including new algorithms, cheap parallel computation, and the widespread availability of large data sets.</p>
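<p>The mechanics behind the buzzword are surprisingly compact. Below is a toy two-layer neural network, in plain numpy, that learns the classic XOR function by gradient descent. It is an illustrative sketch, nowhere near the scale of the systems described here, but the core loop (forward pass, compute error, nudge the weights downhill) is the same one that powers deep learning:</p>
<pre><code>import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for step in range(5000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the squared error
    dp = (p - y) * p * (1 - p)
    dW2 = h.T @ dp; db2 = dp.sum(axis=0)
    dh = dp @ W2.T * (1 - h ** 2)
    dW1 = X.T @ dh; db1 = dh.sum(axis=0)
    # Gradient descent step
    lr = 0.5
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(p.round(2).ravel())  # approaches [0, 1, 1, 0]
</code></pre>
<p>Frameworks like Theano and TensorFlow automate the backward pass and run it across GPUs, which is what lets small teams scale this same loop to millions of images.</p>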
<p><img src="images/1_P4BXse9pJYAUbasCEkQanA.png" alt="ImageNet challenge error rates, souce: http://www.slideshare.net/nervanasys/sd-meetup-12215 (red line = human performance)"></p>
<p>It’s tempting to dismiss deep learning as another Silicon Valley buzzword. The excitement, however, is supported by impressive theoretical and real-world results. For example, the error rates for the winners of the <a href="http://image-net.org/challenges/LSVRC/2015/">ImageNet challenge</a> — a popular machine vision contest — were in the 20–30% range prior to the use of deep learning. Using deep learning, the accuracy of the winning algorithms has steadily improved, and in 2015 surpassed human performance.</p>
<p>Many of the papers, <a href="https://code.google.com/archive/p/word2vec/">data</a> <a href="http://image-net.org/download-images">sets</a>, and <a href="https://www.tensorflow.org/">software</a> <a href="http://deeplearning.net/software/theano/">tools</a> related to deep learning have been open sourced. This has had a democratizing effect, allowing individuals and small organizations to build powerful applications. WhatsApp was able to build a global messaging system that <a href="http://www.wired.com/2015/09/whatsapp-serves-900-million-users-50-engineers/">served 900M users with just 50 engineers</a>, compared to the thousands of engineers that were needed for prior generations of messaging systems. This “<a href="https://twitter.com/cdixon/status/473221599189954562">WhatsApp effect</a>” is now happening in AI. Software tools like <a href="http://deeplearning.net/software/theano/">Theano</a> and <a href="https://www.tensorflow.org/">TensorFlow</a>, combined with cloud data centers for training, and inexpensive GPUs for deployment, allow small teams of engineers to build state-of-the-art AI systems.</p>
<p>For example, here a <a href="http://tinyclouds.org/colorize/">solo programmer</a> working on a side project used TensorFlow to colorize black-and-white photos:</p>
<p><img src="images/1_L6cT-HQMC-mc34kJ450pdA.png" alt="Left: black and white. Middle: automatically colorized. Right: true color. source: http://tinyclouds.org/colorize/"></p>
<p>And here a small startup created a real-time object classifier:</p>
<p><img src="images/1_cAtej8oZh2u80cii--YgTw.gif" alt="Teradeep real-time object classifier, source: https://www.youtube.com/watch?v=_wXHR-lad-Q "></p>
<p>Which of course is reminiscent of a famous scene from a sci-fi movie:</p>
<p><img src="images/1_wiG-xc456HpdBkRTQi84Eg.gif" alt="The Terminator (1984), source: https://www.youtube.com/watch?v=YvRb9jZ9wFk"></p>
<p>One of the first applications of deep learning released by a big tech company is the search function in Google Photos, which is <a href="http://gizmodo.com/google-photos-hands-on-so-good-im-creeped-out-1707566376">shockingly</a> smart.</p>
<p><img src="images/1_N1K_Wv2M-QDMF7FeOmJfcw.gif" alt="User searches photos (w/o metadata) for “big ben”"></p>
<p>We’ll soon see significant upgrades to the intelligence of all sorts of products, including: voice assistants, search engines, <a href="http://www.wired.com/2015/08/how-facebook-m-works/">chat bots</a>, 3D <a href="https://www.google.com/atap/project-tango/">scanners</a>, language translators, automobiles, drones, medical imaging systems, and much more.</p>
<blockquote>
<p>“The business plans of the next 10,000 startups are easy to forecast: Take X and add AI. This is a big deal, and now it’s here.” — <a href="http://www.wired.com/2014/10/future-of-artificial-intelligence/">Kevin Kelly</a></p>
</blockquote>
<p>Startups building AI products will need to stay laser focused on specific applications to compete against the big tech companies who have made AI a top priority. AI systems get better as more data is collected, which means it’s possible to create a virtuous flywheel of <a href="http://mattturck.com/2016/01/04/the-power-of-data-network-effects/">data network effects</a> (more users → more data → better products → more users). The mapping startup Waze <a href="https://digit.hbs.org/submission/waze-generating-better-maps-through-its-network-of-users/">used</a> data network effects to produce better maps than its vastly better capitalized competitors. Successful AI startups will follow a <a href="/2015/02/01/the-ai-startup-idea-maze/">similar</a> strategy.</p>
<h2>Software + hardware: the new computers</h2>
<p>There are a variety of new computing platforms currently in the gestation phase that will soon get much better — and possibly enter the growth phase — as they incorporate recent advances in hardware and software. Although they are designed and packaged very differently, they share a common theme: they give us new and augmented abilities by embedding a smart virtualization layer on top of the world. Here is a brief overview of some of the new platforms:</p>
<p><strong>Cars</strong>. Big tech companies like Google, Apple, Uber, and Tesla are investing significant resources in autonomous cars. Semi-autonomous cars like the Tesla Model S are already publicly available and will improve quickly. Full autonomy will take longer but is probably not more than 5 years away. There already exist fully autonomous cars that are almost as good as human drivers. However, for cultural and regulatory reasons, fully autonomous cars will likely need to be significantly better than human drivers before they are widely permitted.</p>
<p><img src="images/1_nJjPHXo_qBtzvoH8OLx9hQ.gif" alt="Autonomous car mapping its environment"></p>
<p>Expect to see a lot more investment in autonomous cars. In addition to the big tech companies, the big auto makers <a href="http://www.cnet.com/roadshow/news/gm-new-team-electric-autonomous-cars/">are</a> <a href="http://spectrum.ieee.org/automaton/robotics/industrial-robots/toyota-to-invest-1-billion-in-ai-and-robotics-rd">starting</a> <a href="https://media.ford.com/content/fordmedia/fna/us/en/news/2016/01/05/ford-tripling-autonomous-vehicle-development-fleet--accelerating.html">to</a> take autonomy very seriously. You’ll even see some interesting products made by startups. Deep learning software tools have gotten so good that a <a href="http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/">solo programmer</a> was able to make a semi-autonomous car:</p>
<p><img src="images/1_z442b_u8RfSqBEyI-1AkxQ.gif" alt="Homebrew self-driving car, source: https://www.youtube.com/watch?v=KTrgRYa2wbI"></p>
<p><strong>Drones</strong>. Today’s consumer drones contain modern hardware (mostly smartphone components plus mechanical parts), but relatively simple software. In the near future, we’ll see drones that incorporate advanced computer vision and other AI to make them safer, easier to pilot, and more useful. Recreational videography will continue to be popular, but there will also be important <a href="http://www.airware.com">commercial</a> use cases. There are tens of millions of <a href="http://www.psmag.com/politics-and-law/cell-tower-climbers-die-78374">dangerous</a> jobs that involve climbing buildings, towers, and other structures that can be performed much more safely and effectively using drones.</p>
<p><img src="images/1_N7SlK3WKwkfZ6v50JFLkCg.gif" alt="Fully autonomous drone flight. source: https://www.youtube.com/watch?v=rYhPDn48-Sg"></p>
<p><strong>Internet of Things</strong>. The obvious use cases for IoT devices are energy savings, security, and convenience. <a href="https://nest.com/thermostat/meet-nest-thermostat/">Nest</a> and <a href="https://nest.com/camera/meet-nest-cam/">Dropcam</a> are popular examples of the first two categories. One of the most interesting products in the convenience category is Amazon’s <a href="http://www.amazon.com/Amazon-SK705DI-Echo/dp/B00X4WHP5E">Echo</a>.</p>
<p><img src="images/1_bsxhmUfI-7biIF-dW8a80w.png" alt="Three main uses cases for IoT"></p>
<p>Most people think Echo is a gimmick until they try it, and then they are surprised at how useful it is. It’s a great <a href="https://500ish.com/alexa-5f7924bffcf3#.iou9jsaj4">demo</a> of how effective always-on voice can be as a user interface. It will be a while before we have bots with generalized intelligence that can carry on full conversations. But, as Echo shows, voice can succeed today in constrained contexts. Language understanding should improve quickly as recent breakthroughs in deep learning make their way into production devices.</p>
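<p>One way to see why constrained contexts are tractable: when the domain is narrow, even simple pattern matching over a transcribed command gets surprisingly far. The sketch below is purely illustrative (real assistants use far more sophisticated models), with made-up intents and patterns:</p>
<pre><code>import re

# Hypothetical rules for a narrow home-assistant domain.
INTENTS = [
    ("set_timer",   re.compile(r"timer for (\d+) (minute|hour)s?")),
    ("play_music",  re.compile(r"play (.+)")),
    ("get_weather", re.compile(r"weather")),
]

def parse(command):
    # Map a transcribed command to an intent plus captured arguments.
    text = command.lower()
    for intent, pattern in INTENTS:
        match = pattern.search(text)
        if match:
            return intent, match.groups()
    return "unknown", ()

print(parse("Alexa, set a timer for 10 minutes"))  # ('set_timer', ('10', 'minute'))
print(parse("Play some jazz"))                     # ('play_music', ('some jazz',))
</code></pre>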
<p>IoT will also be adopted in business contexts. For example, devices with sensors and network connections are extremely <a href="https://www.samsara.com/">useful</a> for monitoring industrial equipment.</p>
<p><strong>Wearables.</strong> Today’s wearable computers are constrained along multiple dimensions, including battery, communications, and processing. The ones that have succeeded have focused on narrow applications like fitness monitoring. As hardware components continue to improve, wearables will support rich applications the way smartphones do, unlocking a wide range of new applications. As with IoT, voice will probably be the main user interface.</p>
<p><img src="images/1__4r-bIpz7jWMYiLnxKFCJQ.gif" alt="Wearable, super intelligent AI earpiece in the movie “Her”"></p>
<p><strong>Virtual Reality.</strong> 2016 is an exciting year for VR: the launch of the <a href="https://www.oculus.com/en-us/rift/">Oculus Rift</a> and HTC/Valve <a href="https://www.htcvive.com/us/">Vive</a> (and, possibly, the Sony PlayStation VR) means that comfortable and immersive VR systems will finally be publicly available. VR systems need to be really good to avoid the “<a href="https://en.wikipedia.org/wiki/Uncanny_valley">uncanny valley</a>” trap. Proper VR requires special screens (high resolution, high refresh rate, low persistence), powerful graphics cards, and the ability to track the precise position of the user (previously released VR systems could only track the rotation of the user’s head). This year, the public will for the first time get to experience what is known as “<a href="http://a16z.com/2015/01/22/virtual-reality/">presence</a>” — when your senses are sufficiently tricked that you feel fully transported into the virtual world.</p>
<p><img src="images/1_bcHvjQwlLxyORwjHFH87Qg.gif" alt="Oculus Rift Toybox demo"></p>
<p>VR headsets will continue to improve and get more affordable. Major areas of research will include: 1) new tools for creating rendered and/or <a href="https://www.lytro.com/">filmed</a> VR content, 2) machine vision for <a href="http://venturebeat.com/2016/02/08/oculus-vr-guru-john-carmack-leads-crucial-position-tracking-development-for-mobile-vr/">tracking</a> and scanning directly from phones and headsets, and 3) distributed back-end <a href="/2015/03/24/improbable-enabling-the-development-of-large-scale-simulated-worlds/">systems</a> for hosting large <a href="https://twitter.com/cdixon/status/662836035508940800">virtual environments</a>.</p>
<p><img src="images/1_Fv9_4fCAOHoEA3dxjMf2jw.gif" alt="3D world creation in room-scale VR"></p>
<p><strong>Augmented Reality</strong>. AR will likely arrive after VR because AR requires most of what VR requires plus additional new technologies. For example, AR requires advanced, low-latency machine vision in order to convincingly combine real and virtual objects in the same interactive scene.</p>
<p><img src="images/1_HpWBUZD_kKAoTa2yuxqnTQ.jpeg" alt="Real and virtual combined (from The Kingsmen)"></p>
<p>That said, AR is probably coming sooner than you think. This demo video was shot directly through <a href="http://www.magicleap.com/#/home">Magic Leap’s</a> AR device:</p>
<p><img src="images/1_7jbz4N1GZTFm0wDzDEmQ1Q.gif" alt="Magic Leap demo: real environment, virtual character"></p>
<h2>What’s next?</h2>
<p>It is possible that the pattern of 10–15 year computing cycles has ended and mobile is the final era. It is also possible the next era won’t arrive for a while, or that only a subset of the new computing categories discussed above will end up being important.</p>
<p>I tend to think we are on the cusp of not one but multiple new eras. The “peace dividend of the smartphone war” created a Cambrian explosion of new devices, and developments in software, especially AI, will make those devices smart and useful. Many of the futuristic technologies discussed above exist today, and will be broadly accessible in the near future.</p>
<p>Observers have noted that many of these new devices are in their “<a href="http://www.nytimes.com/2016/01/07/technology/on-display-at-ces-tech-ideas-in-their-awkward-adolescence.html?_r=0">awkward adolescence</a>.” That is because they are in their gestation phase. Like PCs in the 70s, the internet in the 80s, and smartphones in the early 2000s, we are seeing pieces of a future that isn’t quite here. But the future is coming: markets go up and down, and excitement ebbs and flows, but computing technology marches steadily forward.</p>
]]></content:encoded>
</item>
</channel>
</rss>
<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<title>cdixon</title>
<link>https://cdixon.org</link>
<description>programming, philosophy, history, internet, startups</description>
<atom:link href="https://cdixon.org/rss.xml" rel="self" type="application/rss+xml"/>
<item>
<title>I wrote a book: Read Write Own</title>
<link>https://cdixon.org/2023/06/22/read-write-own/</link>
<guid>https://cdixon.org/2023/06/22/read-write-own/</guid>
<pubDate>Thu, 22 Jun 2023 00:00:00 GMT</pubDate>
<description>I wrote a book: Read Write Own I believe blockchains and the software movement around them – typically called crypto or web3 – provide the only plausible path to sustaining ...</description>
<content:encoded><![CDATA[<p>I wrote a book: <em>Read Write Own</em></p>
<p>I believe blockchains and the software movement around them – typically called crypto or web3 – provide the only plausible path to sustaining the original vision of the internet as an open platform that incentivizes creativity and entrepreneurship. I’ve been investing behind this thesis for years, and advocating for it through writing and speaking and by talking to business leaders, journalists, and policymakers both here and around the world.</p>
<p>Through all that, it became clear that we need a comprehensive book that clearly explains new technologies like blockchains and the services built on top of them; how they fit into the history of the internet; and why they should matter to founders, developers, creators, and anyone interested in the history and evolution of business, technology, and innovation.</p>
<p>So I wrote that book: <em>Read Write Own: Building the Next Era of the Internet.</em></p>
<p>My thesis is that seemingly small initial decisions around software and network design can have profound downstream consequences on the control and economics of digital services. The book walks through the history of the internet, showing how it has gone through three major design eras: the first focused on democratizing information (read), the second on democratizing publishing (write), and the third on democratizing ownership (own).</p>
<p>We are on the cusp of the third era – own – so I explain the key concepts underlying it, including blockchains and digital services built on top of blockchains. The book therefore answers a common question I hear: “<em>What problems do blockchains solve?</em>” Blockchains solve the same problems that other digital services solve, but with better outcomes. They can connect people in social networks, while empowering users over corporate interests. They can underpin marketplaces and payment systems that facilitate commerce, but with persistently lower take rates. They can enable new forms of monetizable media, interoperable and immersive digital worlds, and artificial intelligence services that compensate – rather than cannibalize – creators and communities.</p>
<p>The book takes controversial questions head-on, including policy and regulatory topics and the harmful “casino” culture that has developed around crypto, which hurts public perception and undermines the technology’s potential. I also go deeper into intersecting topics like artificial intelligence, social networks, finance, media businesses, collaborative creation, video games, and virtual worlds.</p>
<p>Inspired by modern tech classics like <em>Zero to One</em> and <em>The Hard Thing About Hard Things</em>, I wrote the book to be succinct, thorough, and readable, and I distill cutting-edge thinking from technologists and founders to make it useful to practitioners. My goal was to make it accessible without watering it down. The book is meant for a range of audiences, including entrepreneurs, technologists, company leaders, policymakers, journalists, business thinkers, artists, community builders, and people who are simply curious about new technologies, culture, and the future of the internet.</p>
<p>I love reading books but believe that tech and business topics usually work better in shorter formats, which is why in the past I’ve stuck to blogging and tweeting. But accomplishing all of the above warranted a longer treatment that brings new and different ideas together in one place, so I spent much of the last year writing it. Many of these ideas I’ve thought about for a long time but never taken the time to write down.</p>
<p><em>Read Write Own: Building the Next Era of the Internet</em> will be published by Random House on March 12, 2024. You can pre-order it <a href="https://readwriteown.com">here</a>.</p>
<p>Sign up for more book updates <a href="https://cdixon.substack.com">here</a>.</p>
<hr/>
<p><a href="https://readwriteown.com/terminologyhistory/">More about the term and title “Read Write Own” here.</a></p>
]]></content:encoded>
</item>
<item>
<title>NFTs and A Thousand True Fans</title>
<link>https://cdixon.org/2021/02/27/NFTs-and-a-thousand-true-fans/</link>
<guid>https://cdixon.org/2021/02/27/NFTs-and-a-thousand-true-fans/</guid>
<pubDate>Sat, 27 Feb 2021 00:00:00 GMT</pubDate>
<description>In his classic 2008 essay “1000 True Fans,” Kevin Kelly predicted that the internet would transform the economics of creative activities: To be a successful creator you don’t need millions. ...</description>
<content:encoded><![CDATA[<p align="center"><img src="images/nfts.png"/></p>
<p>In his classic 2008 essay “<a href="https://kk.org/thetechnium/1000-true-fans/">1000 True Fans</a>,” Kevin Kelly predicted that the internet would transform the economics of creative activities:</p>
<blockquote>
<p>To be a successful creator you don’t need millions. You don’t need millions of dollars or millions of customers, millions of clients or millions of fans. To make a living as a craftsperson, photographer, musician, designer, author, animator, app maker, entrepreneur, or inventor you need only thousands of true fans.</p>
</blockquote>
<blockquote>
<p>A true fan is defined as a fan that will buy anything you produce. These diehard fans will drive 200 miles to see you sing; they will buy the hardback and paperback and audible versions of your book; they will purchase your next figurine sight unseen; they will pay for the “best-of” DVD version of your free YouTube channel; they will come to your chef’s table once a month.</p>
</blockquote>
<p>Kelly’s vision was that the internet was the ultimate matchmaker, enabling 21st century patronage. Creators, no matter how seemingly niche, could now discover their true fans, who would in turn demonstrate their enthusiasm through direct financial support.</p>
<p>But the internet took a detour. Centralized social platforms became the dominant way for creators and fans to connect. The platforms used this power to become the new intermediaries — inserting ads and algorithmic recommendations between creators and users while keeping most of the revenue for themselves.</p>
<p>The good news is that the internet is trending back to Kelly’s vision. For example, many top writers on Substack earn far more than they did at salaried jobs. The economics of low take rates plus enthusiastic fandom does wonders. On Substack, 1,000 newsletter subscribers paying $10/month nets over $100K/year to the writer.</p>
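<p>The arithmetic behind that claim is worth making explicit. A quick back-of-the-envelope check in Python, assuming Substack’s roughly 10% platform fee and ignoring payment-processing costs (both assumptions, for illustration only):</p>
<pre><code># Back-of-the-envelope check of the Substack example.
subscribers = 1_000
price_per_month = 10   # dollars
take_rate = 0.10       # assumed ~10% platform fee

gross = subscribers * price_per_month * 12
net = gross * (1 - take_rate)
print(f"gross ${gross:,}/yr, net to writer ${net:,.0f}/yr")
# gross $120,000/yr, net to writer $108,000/yr
</code></pre>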
<p>Crypto, and specifically <a href="https://variant.mirror.xyz/T8kdtZRIgy_srXB5B06L8vBqFHYlEBcv6ae2zR6Y_eo">NFTs</a> (non-fungible tokens), can accelerate the trend of creators monetizing directly with their fans. Social platforms will continue to be useful for building audiences (although these too should probably be replaced with superior <a href="https://cdixon.org/2018/02/18/why-decentralization-matters">decentralized</a> alternatives), but creators can increasingly rely on other methods including NFTs and crypto-enabled economies to make money.</p>
<p>NFTs are blockchain-based records that uniquely represent pieces of media. The media can be anything digital, including art, videos, music, gifs, games, text, memes, and code. NFTs contain highly trustworthy documentation of their history and origin, and can have code attached to do almost anything programmers dream up (one popular feature is code that ensures that the original creator receives royalties from secondary sales). NFTs are secured by the same technology that enabled Bitcoin to be owned by tens of millions of people around the world and represent hundreds of billions of dollars of value.</p>
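<p>To make the royalty feature concrete, here is a minimal sketch in Python of an NFT-style record that logs provenance and pays the original creator on every resale. The names and the 10% royalty rate are illustrative, not any particular token standard:</p>
<pre><code># Minimal sketch of an NFT-style record: a unique token with a
# provenance log and a royalty rule applied on every resale.
# Names and the 10% rate are illustrative, not a real standard.
from dataclasses import dataclass, field

@dataclass
class NFT:
    media_uri: str
    creator: str
    owner: str
    royalty_rate: float = 0.10   # share of each resale paid to the creator
    provenance: list = field(default_factory=list)

    def sell(self, buyer: str, price: float) -> None:
        royalty = price * self.royalty_rate
        self.provenance.append((self.owner, buyer, price, royalty))
        print(f"{self.owner} sells to {buyer}: creator {self.creator} "
              f"earns {royalty:.2f}, seller keeps {price - royalty:.2f}")
        self.owner = buyer

punk = NFT(media_uri="ipfs://example", creator="alice", owner="alice")
punk.sell("bob", 100.0)    # primary sale
punk.sell("carol", 500.0)  # resale: alice still earns a royalty
</code></pre>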
<p>NFTs have received a lot of attention lately because of high sales volumes. In the past 30 days there has been over <a href="http://cryptoslam.io">$300M</a> in NFT sales:</p>
<p align="center"><img src="images/pic1.png"/></p>
<p>Crypto has a history of boom and bust cycles, and it’s very possible NFTs will have their own ups and downs.</p>
<p>That said, there are three important reasons why NFTs offer fundamentally better economics for creators. The first, already alluded to above, is the removal of rent-seeking intermediaries. The logic of blockchains is that once you purchase an NFT, it is yours to fully control, just like when you buy books or sneakers in the real world. There are and will continue to be NFT platforms and marketplaces, but they will be constrained in what they can charge because blockchain-based ownership shifts the power back to creators and users — you can shop around and force the marketplace to earn its fees. (Note that lowering the intermediary fees can have a multiplier effect on creator disposable income. For example, if you make $100K in revenue and have $80K in costs, cutting out a 50% take rate increases your revenue to $200K, multiplying your disposable income 6x, from $20K to $120K.)</p>
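<p>The parenthetical’s multiplier is easy to verify; the key is that costs stay fixed while the take rate scales with sales:</p>
<pre><code># The take-rate multiplier from the example above: because costs are
# fixed, cutting the intermediary's cut multiplies disposable income
# far more than it multiplies revenue.
def disposable_income(gross_sales, take_rate, costs):
    return gross_sales * (1 - take_rate) - costs

gross_sales, costs = 200_000, 80_000
before = disposable_income(gross_sales, take_rate=0.50, costs=costs)  # 20,000
after = disposable_income(gross_sales, take_rate=0.00, costs=costs)   # 120,000
print(after / before)  # 6.0
</code></pre>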
<p>The second way NFTs change creator economics is by enabling granular price tiering. In ad-based models, revenue is generated more or less uniformly regardless of the fan’s enthusiasm level. As with Substack, NFTs allow the creator to “cream skim” the most passionate users by offering them special items that cost more. But NFTs go further than non-crypto products in that they are easily sliced and diced into a descending series of pricing tiers. NBA Top Shot cards range from over $100K to a few dollars. Fan of Bitcoin? You can buy as much or as little as you want, down to 8 decimal places, depending on your level of enthusiasm. Crypto’s fine-grained divisibility lets creators capture a much larger area under the demand curve.</p>
<p align="center"><img src="images/pic2.png"/></p>
<p>The third and most important way NFTs change creator economics is by making users owners, thereby reducing customer acquisition costs to near zero. Open any tech S-1 filing and you’ll see massive user/customer acquisition costs, usually going to online ads or sales staff. Crypto, by contrast, has grown to over a trillion dollars in aggregate market capitalization with almost no marketing spend. Bitcoin and Ethereum don’t have organizations behind them let alone marketing budgets, yet are used, owned, and loved by tens of millions of people.</p>
<p>The highest revenue NFT project to date, <a href="https://www.nbatopshot.com/">NBA Top Shot</a>, has generated $200M in gross sales in just the past month while spending very little on marketing. It’s been able to grow so efficiently because users feel like owners — they have skin in the game. It’s true peer-to-peer marketing, fueled by community, <a href="https://twitter.com/ROSGO21/status/1364724500642689027?s=20">excitement</a>, and ownership.</p>
<p align="center"><img src="images/pic3.jpg"/></p>
<p>NFTs are still early and will evolve. Their utility will increase as digital experiences are built around them, including marketplaces, social networks, showcases, games, and virtual worlds. It’s also likely that other consumer-facing crypto products will emerge that pair with NFTs. Modern video games like Fortnite contain sophisticated economies that mix fungible tokens like V-Bucks with NFTs/virtual goods like skins. Someday every internet community might have its own micro-economy, including NFTs and fungible tokens that users can use, own, and collect.</p>
<p>The thousand true fans thesis builds on the original ideals of the internet: users and creators globally connected, unconstrained by intermediaries, sharing ideas and economic upside. Incumbent social media platforms sidetracked this vision by locking creators into a bundle of distribution and monetization. There are, correspondingly, two ways to challenge them: take the users, or take the money. Crypto and NFTs give us a new way to take the money. Let’s make it happen.</p>
<p><em>(Image: CryptoPunks — Larva Labs)</em></p>
]]></content:encoded>
</item>
<item>
<title>Doing old things better vs doing brand new things</title>
<link>https://cdixon.org/2020/10/19/doing-old-things-better-vs-doing-brand-new-things/</link>
<guid>https://cdixon.org/2020/10/19/doing-old-things-better-vs-doing-brand-new-things/</guid>
<pubDate>Mon, 19 Oct 2020 00:00:00 GMT</pubDate>
<description>New technologies enable activities that fall into one of two categories: 1) doing things you could already do but can now do better because they are faster, cheaper, easier, higher ...</description>
<content:encoded><![CDATA[<p>New technologies enable activities that fall into one of two categories: 1) doing things you could already do but can now do better because they are faster, cheaper, easier, higher quality, etc. 2) doing brand new things that you simply couldn’t do before. Early in the development of new technologies, the first category tends to get more attention, but it’s the second that ends up having more impact on the world.</p>
<p>Doing old things better tends to get more attention early on because it’s easier to imagine what to build. Early films were shot like plays — they were effectively plays with a better distribution model — until filmmakers realized that movies had their own visual grammar. The early electrical grid delivered light better than gas and candles. It took decades before we got an electricity “app store” — a rich ecosystem of appliances that connected to the grid. The early web was mostly digital adaptations of pre-internet things like letter writing and mail-order commerce. It wasn’t until the 2000s that entrepreneurs started exploring “internet native” ideas like social networking, crowdfunding, cryptocurrency, crowdsourced knowledge bases, and so on.</p>
<p>The most common mistake people make when evaluating new technologies is to focus too much on the “doing old things better” category. For example, when evaluating the potential of blockchains, people sometimes focus on things like cheaper and faster global payments, which are important and necessary but only the beginning. What’s even more exciting are the new things you simply couldn’t create before, like internet services that are <a href="https://cdixon.org/2019/01/04/how-blockchain-can-wrest-the-internet-from-corporations">owned and operated by their users</a> instead of by companies. Another example is business productivity apps architected as web services. Early products like Salesforce were easier to access and cheaper to maintain than their on-premise counterparts. Modern productivity apps like Google Docs, Figma, and Slack focus on things you simply couldn’t do before, like real-time collaboration and deep integrations with other apps.</p>
<p>Entrepreneurs who create products in the “brand new things” category usually spend many years deeply immersed in the underlying technology before they have their key insights. The products they create often <a href="https://cdixon.org/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy">start out looking toy-like</a>, <a href="https://cdixon.org/2019/01/08/strong-and-weak-technologies">strange, unserious, expensive</a>, and sometimes even dangerous. Over time, the products steadily improve and the world gradually embraces them.</p>
<p>It can take decades for this process to play out. It’s clear that we are early in the development of emerging technologies like cryptocurrencies, machine learning, and virtual reality. It is also possible we are still early in the development of more established technologies like mobile devices, cloud hosting, social networks, and perhaps even the internet itself. If so, new categories of native products built on top of these technologies will continue to be invented in the coming years.</p>
]]></content:encoded>
</item>
<item>
<title>Computers that can make commitments</title>
<link>https://cdixon.org/2020/01/26/computers-that-can-make-commitments/</link>
<guid>https://cdixon.org/2020/01/26/computers-that-can-make-commitments/</guid>
<pubDate>Sun, 26 Jan 2020 00:00:00 GMT</pubDate>
<description>Blockchains are computers that can make commitments. Traditional computers are ultimately controlled by people, either directly in the case of personal computers or indirectly through organizations. Blockchains invert this power ...</description>
<content:encoded><![CDATA[<p>Blockchains are computers that can make commitments. Traditional computers are ultimately controlled by people, either directly in the case of personal computers or indirectly through organizations. Blockchains invert this power relationship, putting the code in charge. A game-theoretic mechanism — a so-called consensus mechanism — makes blockchains resilient to modification of their underlying physical components and, in effect, to human intervention.</p>
<p>As a result, a properly designed blockchain provides strong guarantees that the code it runs will continue to operate as designed. For the first time, a computer system can be truly autonomous: self-governed, by its own code, instead of by people. Autonomous computers can be relied on and trusted in ways that human-governed computers can’t.</p>
<p>Computers that make commitments can be useful in finance. The most famous example of this is Bitcoin, which makes various commitments, including that there will never be more than 21 million bitcoins, a commitment that makes bitcoins scarce and therefore capable of being valuable. Without a blockchain, this commitment could have been made by a person or a business, but it is unlikely that other people would have really trusted that commitment, since people and businesses change their minds all the time. Prior to Bitcoin, besides precious metals which are naturally scarce, the only credible commitments to monetary scarcity came from governments.</p>
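<p>The 21 million cap is not a promise written in prose; it falls out of Bitcoin’s issuance schedule, which is short enough for anyone to recompute:</p>
<pre><code># The block subsidy starts at 50 BTC and halves every 210,000 blocks.
SATOSHIS_PER_BTC = 100_000_000
subsidy = 50 * SATOSHIS_PER_BTC   # subsidies are integer satoshis
total = 0

while subsidy > 0:
    total += subsidy * 210_000    # blocks per halving period
    subsidy //= 2                 # integer halving, as in the protocol

print(total / SATOSHIS_PER_BTC)  # 20999999.9769, just under 21 million
</code></pre>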
<p>Ethereum was the first blockchain to support a general-purpose programming language, allowing for the creation of arbitrarily complex software that makes commitments. Two early applications built on Ethereum are <a href="https://compound.finance/">Compound</a> and <a href="https://makerdao.com/en/">MakerDAO</a>. Compound makes the commitment that it will act as a neutral, low-fee lending protocol. MakerDAO makes a commitment to maintain the price stability of a currency called Dai, which can be used for payments and as a stable store of value. As of today, users have locked up hundreds of millions of dollars in these applications, a testament to the credibility of their commitments.</p>
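<p>As a rough sketch of the MakerDAO-style mechanism (heavily simplified, with illustrative parameters; the real protocol adds price oracles, auctions, and stability fees): Dai is minted only against excess collateral, and the code, not a management team, decides when a position can be liquidated.</p>
<pre><code>MIN_RATIO = 1.5  # required collateral-to-debt ratio (illustrative)

class Vault:
    def __init__(self, eth_amount, eth_price):
        self.eth_amount = eth_amount
        self.eth_price = eth_price
        self.dai_debt = 0.0

    def collateral_value(self):
        return self.eth_amount * self.eth_price

    def mint_dai(self, amount):
        # Minting is refused unless the vault stays overcollateralized.
        if (self.dai_debt + amount) * MIN_RATIO > self.collateral_value():
            raise ValueError("would leave the vault undercollateralized")
        self.dai_debt += amount

    def is_liquidatable(self):
        return self.dai_debt * MIN_RATIO > self.collateral_value()

vault = Vault(eth_amount=10, eth_price=200.0)  # $2,000 of collateral
vault.mint_dai(1_000)           # fine: 200% collateralized
vault.eth_price = 120.0         # collateral value drops to $1,200
print(vault.is_liquidatable())  # True: the coded commitment kicks in
</code></pre>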
<p>Applications like Compound and Maker can do things that pre-blockchain software simply couldn’t, such as hold funds that reside in the code itself, as opposed to traditional payment systems which only hold pointers to offline bank accounts. This removes the need to trust anything other than code, and makes the system end-to-end transparent and extensible. Blockchain applications do this autonomously — every human involved in creating these projects could disappear and the software would go on doing what it does, keeping its commitments, indefinitely.</p>
<p>What else can you do with computers that make commitments? One fertile area being explored is re-architecting popular internet services like social networks and marketplaces so that they make strong, positive commitments to their communities. For example, users can get commitments baked into the code that their data will be kept private and that they won’t get de-platformed without due process. Third-party developers can safely invest in their businesses knowing that the rules are baked into the network and can’t change, protecting them from <a href="https://cdixon.org/2018/02/18/why-decentralization-matters">platform risk</a>. Using the financial features of blockchains, users and developers can receive tokens in order to participate in the upside of the network as it grows.</p>
<p>Blockchains have arrived at an opportune time. Internet services have become central to our economic, political, and cultural lives, yet the trust between users and the people who run these services is breaking down. At the same time, industries like finance that have traditionally depended on trust have resisted modernization. The next few years will be exciting — we are only beginning to explore the <a href="https://cdixon.org/2013/08/04/the-idea-maze">idea maze</a> unlocked by this new kind of computer.</p>
]]></content:encoded>
</item>
<item>
<title>Inside-out vs. outside-in: the adoption of new technologies</title>
<link>https://cdixon.org/2020/01/17/inside-out-vs-outside-in/</link>
<guid>https://cdixon.org/2020/01/17/inside-out-vs-outside-in/</guid>
<pubDate>Fri, 17 Jan 2020 00:00:00 GMT</pubDate>
<description>There are broadly two adoption paths for new computing technologies: inside-out and outside-in. Inside-out technologies are pioneered by established institutions and later proliferate outward to the mainstream. Apple (followed by ...</description>
<content:encoded><![CDATA[<p>There are broadly two adoption paths for new computing technologies: inside-out and outside-in. Inside-out technologies are pioneered by established institutions and later proliferate outward to the mainstream. Apple (followed by Google and others) pioneered the modern touchscreen smartphone, university and corporate research labs pioneered machine learning, and big tech companies like Amazon pioneered cloud computing.</p>
<p>Outside-in technologies, by contrast, start out on the fringes and only later move inward to established institutions. Open-source software started out as a niche anti-copyright movement. The web was invented at a physics lab and then built out by hobbyists and entrepreneurs. Social media began as a movement of idealistic blogging enthusiasts.</p>
<p>Inside-out technologies tend to require significant capital and formally trained technical expertise. They also tend to be technologies that most people would recognize as valuable even before they exist. It wasn’t very hard to imagine that affordable, easy-to-use, internet-connected pocket supercomputers would be popular, or that machines that could learn to behave intelligently could do all sorts of useful tasks.</p>
<p>Outside-in technologies tend to require less capital and less formal technical training, creating a much more level playing field between insiders and outsiders. In many cases the value of outside-in technologies is unclear not only before they’re invented but for many years afterward.</p>
<p>Take the example of social media. Early on, blogs and services like Twitter were mostly used to discuss niche tech topics and share mundane personal events. This led many sophisticated observers to <a href="https://www.nytimes.com/2007/04/22/business/yourmoney/22stream.html">dismiss</a> them as toys or passing fads. At its core, however, social media was about the creation of curated information networks. Today, this is easy to see – billions of people rely on services like Twitter and Facebook for their news – but back then you had to cut through the noise generated by the eccentricities of early adopters. Social media is a technology for creating global media networks that arrived disguised as a way to share what you had for lunch.</p>
<p>Both inside-out and outside-in technologies are important, and in fact they’re often mutually reinforcing. Mobile, social, and cloud powered the growth of computing over the last decade: mobile (inside-out) brought computers to billions of people, social (outside-in) drove usage and monetization, and cloud (inside-out) allowed back-end services to scale. Most likely the next major wave in computing will also be driven by a mutually reinforcing combination of technologies, some developed at established institutions and some developed by enthusiastic and possibly misunderstood outsiders.</p>
]]></content:encoded>
</item>
<item>
<title>Strong and weak technologies</title>
<link>https://cdixon.org/2019/01/08/strong-and-weak-technologies/</link>
<guid>https://cdixon.org/2019/01/08/strong-and-weak-technologies/</guid>
<pubDate>Tue, 08 Jan 2019 00:00:00 GMT</pubDate>
<description>During a media tour in 2007 in which Steve Jobs showed the device to reporters, there was one instance in which a journalist criticized the iPhone’s touch-screen keyboard. “It doesn’t ...</description>
<content:encoded><![CDATA[<blockquote>
<p><em>During a <a href="https://www.businessinsider.com/steve-jobs-reaction-first-iphone-2015-9">media tour</a> in 2007 in which Steve Jobs showed the device to reporters, there was one instance in which a journalist criticized the iPhone’s touch-screen keyboard.</em></p>
<p><em>“It doesn’t work,” the reporter said.</em></p>
<p><em>Jobs stopped for a moment and tilted his head. The reporter said he or she kept making typos and the keys were too small for his or her thumbs.</em></p>
<p><em>Jobs smiled and then replied: “Your thumbs will learn.”</em></p>
</blockquote>
<p>When the iPhone was introduced in 2007, it <a href="https://www.wsj.com/articles/behind-the-rise-and-fall-of-blackberry-1432311912">mystified</a> its competitors, because it wasn’t built for the world as it existed. Wireless networks were too slow. Smartphone users only knew how to use physical keyboards. There were no software developers making apps for touchscreen phones. It frequently dropped phone calls.</p>
<p>But the iPhone was such a remarkable device — fans called it “The Jesus Phone” — that the world adapted to it. Carriers built more wireless capacity. Developers invented new apps and interfaces. Users learned how to rapidly type on touchscreens. Apple kept releasing better versions, fixing problems and adding new capabilities.</p>
<p>Smartphones are a good example of a broader historical pattern: technologies usually arrive in pairs, a strong form and a weak form. Here are some examples:</p>
<table class="comparison-table">
<thead>
<tr><th>Strong</th><th>Weak</th></tr>
</thead>
<tbody>
<tr><td>Public internet</td><td>Private intranets</td></tr>
<tr><td>Consumer web</td><td>Interactive TV</td></tr>
<tr><td>Crowdsourced encyclopedia (Wikipedia)</td><td>Expert-curated encyclopedia (e.g. Nupedia, Encarta)</td></tr>
<tr><td>Crowdsourced video (YouTube)</td><td>Video tech for media companies (e.g. RealPlayer)</td></tr>
<tr><td>Internet video chat (Skype)</td><td>Voice-over-IP (e.g. Vonage)</td></tr>
<tr><td>Streaming music (Spotify)</td><td>MP3 downloads (e.g. iTunes)</td></tr>
<tr><td>Touchscreen smartphones with full operating system and app store (iPhone)</td><td>Limited-app smartphones with physical keyboards (e.g. Blackberry)</td></tr>
<tr><td>Fully electric cars (Tesla)</td><td>Hybrid cars</td></tr>
<tr><td>Permissionless blockchains powered by cryptocurrencies</td><td>Permissioned/private blockchains</td></tr>
<tr><td>Public cloud</td><td>Private / hybrid cloud</td></tr>
<tr><td>App-based media companies (e.g. Netflix)</td><td>Video on demand delivered by cable companies</td></tr>
<tr><td>Virtual reality</td><td>Augmented reality</td></tr>
<tr><td>E-sports</td><td>Traditional sports delivered over the internet</td></tr>
</tbody>
</table>
<p>Strong technologies capture the imaginations of technology enthusiasts. That is why many important technologies start out as weekend hobbies. <a href="https://cdixon.org/2013/03/03/what-the-smartest-people-do-on-the-weekend-is-what-everyone-else-will-do-during-the-week-in-ten-years/">Enthusiasts vote with their time</a>, and, unlike most of the business world, have long-term horizons. They build from first principles, making full use of the available resources to design technologies as they ought to exist. Sometimes these enthusiasts run large companies, in which case they are often, like Steve Jobs, founders who have the gravitas and vision to make big, long-term bets.</p>
<p>The mainstream technology world notices the excitement and wants to join in, but isn’t willing to go all the way and embrace the strong technology. To them, the strong technology appears to be some combination of strange, <a href="https://cdixon.org/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy/">toy-like</a>, unserious, expensive, and sometimes even dangerous. So they embrace the weak form, a compromised version that seems more familiar, productive, serious, and safe.</p>
<p>Strong technologies often develop according to the Perez/Gartner hype cycle:</p>
<p><img src="images/researchmethodology-illustration-hype-cycle.jpg" alt=""></p>
<p>During the trough of disillusionment, entrepreneurs and others who invested in strong technologies sometimes lose faith and switch their focus to weak technologies, because the weak technologies appear nearer to mainstream adoption. This is usually a mistake.</p>
<p>That said, weak forms of technology can be successful. For example, it is very likely that augmented reality will be important, watching traditional sports on the internet will be popular, and so on.</p>
<p>But it’s strong technologies that end up defining new eras. What George Bernard Shaw said about people also applies to technologies:</p>
<blockquote>
<p>The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.</p>
</blockquote>
<p>Weak technologies adapt to the world as it currently exists. Strong technologies adapt the world to themselves. Progress depends on strong technologies. Your thumbs will learn.</p>
]]></content:encoded>
</item>
<item>
<title>Who will control the software that powers the Internet?</title>
<link>https://cdixon.org/2019/01/04/how-blockchain-can-wrest-the-internet-from-corporations/</link>
<guid>https://cdixon.org/2019/01/04/how-blockchain-can-wrest-the-internet-from-corporations/</guid>
<pubDate>Fri, 04 Jan 2019 00:00:00 GMT</pubDate>
<description>Originally published by Wired. As the internet has evolved over its 35-year lifespan, control over its most important services has gradually shifted from open source protocols maintained by non-profit communities ...</description>
<content:encoded><![CDATA[<p><em>Originally published by <a href="https://www.wired.com/story/how-blockchain-can-wrest-the-internet-from-corporations/">Wired</a>.</em></p>
<p>As the internet has evolved over its 35-year lifespan, control over its most important services has gradually shifted from open source protocols maintained by non-profit communities to proprietary services operated by large tech companies. As a result, billions of people got access to amazing, free technologies. But that shift also created serious problems.</p>
<p>Millions of users have had their private data misused or stolen. Creators and businesses that rely on internet platforms are subject to sudden rule changes that take away their audiences and profits. But there is a growing movement—emerging from the blockchain and cryptocurrency world—to build new internet services that combine the power of modern, centralized services with the community-led ethos of the original internet. We should embrace it.</p>
<p>From the 1980s through the early 2000s, the dominant internet services were built on open protocols that the internet community controlled. For example, the Domain Name System, the internet’s “phone book,” is controlled by a distributed network of people and organizations, using rules that are created and administered in the open. This means that anyone who adheres to community standards can own a domain name and establish an internet presence. It also means that the power of companies operating web and email hosting is kept in check—if they misbehave, customers can port their domain names to competing providers.</p>
<p>From the mid 2000s to the present, trust in open protocols was replaced by trust in corporate management teams. As companies like Google, Twitter, and Facebook built software and services that surpassed the capabilities of open protocols, users migrated to these more sophisticated platforms. But their code was proprietary, and their governing principles could change on a whim.</p>
<p>How do social networks decide which users to <a href="https://www.wired.com/story/how-right-wing-social-media-site-gab-got-back-online/">verify</a> or <a href="https://www.wired.com/story/tumblrs-porn-ban-reveals-controls-we-see-online/">ban</a>? How do search engines decide how to rank websites? One minute social networks court media organizations and small businesses, the next minute they de-prioritize their content or change the revenue split. The power of these platforms has created widespread societal tensions, as seen in debates over fake news, state-sponsored bots, privacy laws, and algorithmic biases.</p>
<p>That’s why the pendulum is swinging back to an internet governed by open, community-controlled services. This has only recently become possible, thanks to technologies arising from the blockchain and cryptocurrencies.</p>
<p>There has been a lot of talk in the past few years about blockchains, which are heavily hyped but poorly understood. Blockchains are networks of physical computers that work in concert to form a single virtual computer. The benefit is that, unlike a traditional computer, a blockchain computer can offer strong trust guarantees, rooted in the mathematical and game-theoretic properties of the system. A user or developer can trust that a piece of code running on a blockchain computer will continue to behave as designed, even if individual participants in the network change their motivations or try to subvert the system. This means that the control of a blockchain computer can be placed in the hands of a community.</p>
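<p>One way to picture the “single virtual computer” idea: many machines run the same deterministic program, and the network’s answer is whatever a quorum of them reports. A toy sketch, with a simple majority vote standing in for a real consensus mechanism:</p>
<pre><code>from collections import Counter

def network_result(n_replicas, faulty, transactions):
    reports = []
    for r in range(n_replicas):
        state = sum(transactions)   # every replica runs the same code
        if r in faulty:
            state += 999            # a subverted replica reports a lie
        reports.append(state)
    answer, _ = Counter(reports).most_common(1)[0]
    return answer

# Two of seven machines misbehave; the virtual computer is unaffected.
print(network_result(7, faulty={0, 1}, transactions=[5, 10, -3]))  # 12
</code></pre>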
<p>Users who depend on proprietary platforms, on the other hand, have to worry about data getting stolen or misused, privacy policies changing, intrusive advertising, and more. Proprietary platforms may suddenly change the rules for developers and businesses, the way Facebook <a href="https://venturebeat.com/2016/06/30/facebook-kicked-zynga-to-the-curb-publishers-are-next/">famously did to Zynga</a> and Google <a href="https://www.nytimes.com/2017/07/01/technology/yelp-google-european-union-antitrust.html">did to Yelp</a>.</p>
<p>The idea that corporate-owned services could be replaced by community-owned services may sound far-fetched, but there is a strong historical precedent in the transformation of software over the past twenty years. In the 1990s, computing was dominated by proprietary, closed-source software, most notably Windows. Today, billions of Android phones run on the open source operating system Linux. Much of the software running on an Apple device is open source, as is almost all of the software running in modern cloud data centers, including Amazon’s. The recent acquisitions of <a href="https://www.wired.com/story/microsofts-github-deal-is-its-latest-shift-from-windows/">Github by Microsoft</a> and <a href="https://www.wired.com/story/ibm-buying-open-source-specialist-red-hat-34-billion/">Red Hat by IBM</a> underscore how dominant open source has become.</p>
<p>As open source has grown in importance, technology companies have shifted their business models from selling software to delivering cloud-based services. Google, Facebook, Amazon, and Netflix are all services companies. Even Microsoft is now primarily a services company. This has allowed these companies to outpace the growth of open source software and maintain control of critical internet infrastructure.</p>
<p>A core insight in the design of blockchains is that the open source model can be extended beyond software to cloud-based services by adding financial incentives to the mix. Cryptocurrencies—coins and tokens built into specific blockchains—provide a way to incentivize individuals and groups to participate in, maintain, and build services.</p>
<p>The idea that an internet service could have an associated coin or token may be a novel concept, but the blockchain and cryptocurrencies can do for cloud-based services what open source did for software. It took twenty years for open source software to supplant proprietary software, and it could take just as long for open services to supplant proprietary services. But the benefits of such a shift will be immense. Instead of placing our trust in corporations, we can place our trust in community-owned and -operated software, transforming the internet’s governing principle from “don’t be evil” back to “can’t be evil.”</p>
]]></content:encoded>
</item>
<item>
<title>Why decentralization matters</title>
<link>https://cdixon.org/2018/02/18/why-decentralization-matters/</link>
<guid>https://cdixon.org/2018/02/18/why-decentralization-matters/</guid>
<pubDate>Sun, 18 Feb 2018 00:00:00 GMT</pubDate>
<description>The first two eras of the internet During the first era of the internet — from the 1980s through the early 2000s — internet services were built on open protocols ...</description>
<content:encoded><![CDATA[<h2>The first two eras of the internet</h2>
<p>During the first era of the internet — from the 1980s through the early 2000s — internet services were built on open protocols that were controlled by the internet community. This meant that people or organizations could grow their internet presence knowing the rules of the game wouldn’t change later on. Huge web properties were started during this era, including Yahoo, Google, Amazon, Facebook, LinkedIn, and YouTube. In the process, the importance of centralized platforms like AOL greatly diminished.</p>
<p>During the second era of the internet, from the mid 2000s to the present, for-profit tech companies — most notably Google, Apple, Facebook, and Amazon (GAFA) — built software and services that rapidly outpaced the capabilities of open protocols. The explosive growth of smartphones accelerated this trend as mobile apps became the majority of internet use. Eventually users migrated from open services to these more sophisticated, centralized services. Even when users still accessed open protocols like the web, they would typically do so mediated by GAFA software and services.</p>
<p>The good news is that billions of people got access to amazing technologies, many of which were free to use. The bad news is that it became much harder for startups, creators, and other groups to grow their internet presence without worrying about centralized platforms changing the rules on them, taking away their audiences and profits. This in turn stifled innovation, making the internet less interesting and dynamic. Centralization has also created broader societal tensions, which we see in the debates over subjects like fake news, state-sponsored bots, “no platforming” of users, EU privacy laws, and algorithmic biases. These debates will only intensify in the coming years.</p>
<h2>“Web 3”: the third era of the internet</h2>
<p>One response to this centralization is to impose government regulation on large internet companies. This response assumes that the internet is similar to past communication networks like the phone, radio, and TV networks. But the hardware-based networks of the past are fundamentally different than the internet, a software-based network. Once hardware-based networks are built, they are nearly impossible to rearchitect. Software-based networks can be rearchitected through entrepreneurial innovation and market forces.</p>
<p>The internet is the ultimate software-based network, consisting of a relatively simple <a href="https://en.wikipedia.org/wiki/Internet_Protocol">core layer</a> connecting billions of fully programmable computers at the edge. Software is simply the encoding of human thought, and as such has an almost unbounded design space. Computers connected to the internet are, by and large, free to run whatever software their owners choose. Whatever can be dreamt up, with the right set of incentives, can quickly propagate across the internet. Internet architecture is where technical creativity and incentive design intersect.</p>
<p>The internet is still early in its evolution: the core internet services will likely be almost entirely rearchitected in the coming decades. This will be enabled by crypto-economic networks, a generalization of the ideas first introduced in <a href="https://bitcoin.org/bitcoin.pdf">Bitcoin</a> and further developed in <a href="https://github.com/ethereum/wiki/wiki/White-Paper">Ethereum</a>. Cryptonetworks combine the best features of the first two internet eras: community-governed, decentralized networks with capabilities that will eventually exceed those of the most advanced centralized services.</p>
<h2>Why decentralization?</h2>
<p>Decentralization is a commonly misunderstood concept. For example, it is sometimes said that the reason cryptonetwork advocates favor decentralization is to resist government censorship, or because of libertarian political views. These are not the main reasons decentralization is important.</p>
<p>Let’s look at the problems with centralized platforms. Centralized platforms follow a predictable life cycle. When they start out, they do everything they can to recruit users and 3rd-party complements like developers, businesses, and media organizations. They do this to make their services more valuable, as platforms (by definition) are systems with multi-sided network effects. As platforms move up the adoption S-curve, their power over users and 3rd parties steadily grows.</p>
<p><img src="images/07lrwGIDbAYk6q7zG.png" alt=""></p>
<p>When they hit the top of the S-curve, their relationships with network participants change from positive-sum to zero-sum. The easiest way to continue growing lies in extracting data from users and competing with complements over audiences and profits. Historical examples of this are Microsoft vs. Netscape, Google vs. Yelp, Facebook vs. Zynga, and Twitter vs. its 3rd-party clients. Operating systems like iOS and Android have behaved better, although they still take a healthy 30% tax, reject apps for seemingly arbitrary reasons, and subsume the functionality of 3rd-party apps at will.</p>
<p>For 3rd parties, this transition from cooperation to competition feels like a bait-and-switch. Over time, the best entrepreneurs, developers, and investors have become wary of building on top of centralized platforms. We now have decades of evidence that doing so will end in disappointment. In addition, users give up privacy, control of their data, and become vulnerable to security breaches. These problems with centralized platforms will likely become even more pronounced in the future.</p>
<h2>Enter cryptonetworks</h2>
<p>Cryptonetworks are networks built on top of the internet that 1) use consensus mechanisms such as blockchains to maintain and update state, 2) use cryptocurrencies (coins/tokens) to incentivize consensus participants (miners/validators) and other network participants. Some cryptonetworks, such as Ethereum, are general programming platforms that can be used for almost any purpose. Other cryptonetworks are special purpose, for example Bitcoin is intended primarily for storing value, <a href="https://golem.network/">Golem</a> for performing computations, and <a href="https://filecoin.io/">Filecoin</a> for decentralized file storage.</p>
<p>Early internet protocols were technical specifications created by working groups or non-profit organizations that relied on the alignment of interests in the internet community to gain adoption. This method worked well during the very early stages of the internet, but since the early 1990s very few new protocols have gained widespread adoption. <a href="2017/05/27/crypto-tokens-a-breakthrough-in-open-network-design">Cryptonetworks fix</a> these problems by providing economic incentives to developers, maintainers, and other network participants in the form of tokens. They are also much more technically robust. For example, they are able to keep state and do arbitrary transformations on that state, something past protocols could never do.</p>
<p>Cryptonetworks use multiple mechanisms to ensure that they stay neutral as they grow, preventing the bait-and-switch of centralized platforms. First, the contract between cryptonetworks and their participants is enforced in open source code. Second, they are kept in check through mechanisms for <a href="https://en.wikipedia.org/wiki/Exit,_Voice,_and_Loyalty">“voice” and “exit.”</a> Participants are given voice through community governance, both “on chain” (via the protocol) and “off chain” (via the social structures around the protocol). Participants can exit either by leaving the network and selling their coins, or in the extreme case by forking the protocol.</p>
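<p>“On chain” voice can be as simple as token-weighted voting on a proposed change. A minimal sketch (the balances and the simple-majority threshold are illustrative):</p>
<pre><code># Token-weighted voting: influence is proportional to holdings.
balances = {"alice": 400, "bob": 250, "carol": 350}
votes = {"alice": "yes", "bob": "no", "carol": "yes"}

yes_weight = sum(balances[who] for who, v in votes.items() if v == "yes")
passed = yes_weight * 2 > sum(balances.values())  # simple majority

print(passed)  # True: 750 of 1,000 token-votes in favor
</code></pre>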
<p>In short, cryptonetworks align network participants to work together toward a common goal — the growth of the network and the appreciation of the token. This alignment is one of the main reasons Bitcoin continues to defy skeptics and flourish, even while new cryptonetworks like Ethereum have grown alongside it.</p>
<p>Today’s cryptonetworks suffer from limitations that keep them from seriously challenging centralized incumbents. The most severe limitations are around performance and scalability. The next few years will be about fixing these limitations and building networks that form the infrastructure layer of the crypto stack. After that, most of the energy will turn to building applications on top of that infrastructure.</p>
<h2>How decentralization wins</h2>
<p>It’s one thing to say decentralized networks should win, and another thing to say they will win. Let’s look at specific reasons to be optimistic about this.</p>
<p>Software and web services are built by developers. There are millions of highly skilled developers in the world. Only a small fraction work at large technology companies, and only a small fraction of those work on new product development. Many of the most important software projects in history were created by startups or by communities of independent developers.</p>
<blockquote>
<p>“No matter who you are, most of the smartest people work for someone else.” — <a href="https://en.wikipedia.org/wiki/Joy%27s_law_(management)">Bill Joy</a></p>
</blockquote>
<p>Decentralized networks can win the third era of the internet for the same reason they won the first era: by winning the hearts and minds of entrepreneurs and developers.</p>
<p>An illustrative analogy is the rivalry in the 2000s between Wikipedia and its centralized competitors like Encarta. If you compared the two products in the early 2000s, Encarta was a far better product, with better topic coverage and higher accuracy. But Wikipedia improved at a much faster rate, because it had an active community of volunteer contributors who were attracted to its decentralized, community-governed ethos. By 2005, Wikipedia was the most <a href="https://medium.com/@cdixon/it-s-hard-to-believe-today-but-10-years-ago-wikipedia-was-widely-considered-a-doomed-experiment-a7a0dfd27b8b">popular</a> reference site on the internet. Encarta was shut down in 2009.</p>
<p>The lesson is that when you compare centralized and decentralized systems you need to consider them dynamically, as processes, instead of statically, as rigid products. Centralized systems often start out fully baked, but only get better at the rate at which employees at the sponsoring company improve them. Decentralized systems start out half-baked but, under the right conditions, grow exponentially as they attract new contributors.</p>
<p>In the case of cryptonetworks, there are multiple, compounding feedback loops involving developers of the core protocol, developers of complementary cryptonetworks, developers of 3rd party applications, and service providers who operate the network. These feedback loops are further amplified by the incentives of the associated token, which — as we’ve seen with Bitcoin and Ethereum — can supercharge the rate at which crypto communities develop (and sometimes lead to negative outcomes, as with the excessive electricity consumed by Bitcoin mining).</p>
<p>The question of whether decentralized or centralized systems will win the next era of the internet reduces to who will build the most compelling products, which in turn reduces to who will get more high quality developers and entrepreneurs on their side. GAFA has many advantages, including cash reserves, large user bases, and operational infrastructure. Cryptonetworks have a significantly more attractive value proposition to developers and entrepreneurs. If they can win their hearts and minds, they can mobilize far more resources than GAFA, and rapidly outpace their product development.</p>
<blockquote>
<p>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.” — <a href="http://farmerandfarmer.org/mastery/builder.html">Farmer & Farmer</a></p>
</blockquote>
<p>Centralized platforms often come bundled at launch with compelling apps: Facebook had its core socializing features and the iPhone had a number of key apps. Decentralized platforms, by contrast, often launch half-baked and without clear use cases. As a result, they need to go through two phases of product-market fit: 1) product-market fit between the platform and the developers/entrepreneurs who will finish the platform and build out the ecosystem, and 2) product-market fit between the platform/ecosystem and end users. This two-stage process is what causes many people — including sophisticated technologists — to consistently underestimate the potential of decentralized platforms.</p>
<h2>The next era of the internet</h2>
<p>Decentralized networks aren’t a silver bullet that will fix all the problems on the internet. But they offer a much better approach than centralized systems.</p>
<p>Compare the problem of Twitter spam to the problem of email spam. Since Twitter <a href="https://www.theverge.com/2012/8/23/3263481/twitter-api-third-party-developers">closed</a> its network to 3rd-party developers, the only company working on Twitter spam has been Twitter itself. By contrast, there were hundreds of companies that tried to fight email spam, financed by billions of dollars in venture capital and corporate funding. Email spam isn’t solved, but it’s a lot better now, because 3rd parties knew that the <a href="https://en.wikipedia.org/wiki/Simple_Mail_Transfer_Protocol">email protocol</a> was decentralized, so they could build businesses on top of it without worrying about the rules of the game changing later on.</p>
<p>Or consider the problem of network governance. Today, unaccountable groups of employees at large platforms decide how information gets ranked and filtered, which users get promoted and which get banned, and other important governance decisions. In cryptonetworks, these decisions are made by the community, using open and transparent mechanisms. As we know from the offline world, democratic systems aren’t perfect, but they are a lot better than the alternatives.</p>
<p>Centralized platforms have been dominant for so long that many people have forgotten there is a better way to build internet services. Cryptonetworks are a powerful way to develop community-owned networks and provide a level playing field for 3rd-party developers, creators, and businesses. We saw the value of decentralized systems in the first era of the internet. Hopefully we’ll get to see it again in the next.</p>
<p><em>Originally published on <a href="https://medium.com/s/story/why-decentralization-matters-5e3f79f7638e">Medium</a>.</em></p>
]]></content:encoded>
</item>
<item>
<title>Tokens: A Breakthrough in Open Network Design</title>
<link>https://cdixon.org/2017/05/27/crypto-tokens-a-breakthrough-in-open-network-design/</link>
<guid>https://cdixon.org/2017/05/27/crypto-tokens-a-breakthrough-in-open-network-design/</guid>
<pubDate>Sat, 27 May 2017 00:00:00 GMT</pubDate>
<description>It is a wonderful accident of history that the internet and web were created as open platforms that anyone — users, developers, organizations — could access equally. Among other things, ...</description>
<content:encoded><![CDATA[<p>It is a wonderful accident of history that the internet and web were created as open platforms that anyone — users, developers, organizations — could access equally. Among other things, this allowed independent developers to build products that quickly gained widespread adoption. Google started in a Menlo Park garage and Facebook started in a Harvard dorm room. They competed on a level playing field because they were built on decentralized networks governed by open protocols.</p>
<p>Today, tech companies like Facebook, Google, Amazon, and Apple are <a href="https://medium.com/@cdixon/the-internet-economy-fc43f3eff58a">stronger</a> than ever, whether measured by <a href="http://www.visualcapitalist.com/chart-largest-companies-market-cap-15-years/">market cap</a>, share of top mobile apps, or pretty much any other common measure.</p>
<p><img src="images/11LduvqPVCAVsy-rQ2qlhvg.png" alt="Big 4 tech companies dominate smartphone apps (source); while their market caps continue to rise (source)"></p>
<p>These companies also control massive proprietary developer platforms. The dominant operating systems — iOS and Android — charge 30% payment fees and exert heavy influence over app distribution. The dominant social networks tightly restrict access, hindering the ability of third-party developers to scale. Startups and independent developers are increasingly competing from a disadvantaged position.</p>
<p>A potential way to reverse this trend is <a href="http://continuations.com/post/148098927445/crypto-tokens-and-the-coming-age-of-protocol">crypto tokens</a> — a new way to design open networks that arose from the cryptocurrency movement that began with the introduction of Bitcoin in 2008 and accelerated with the introduction of Ethereum in 2014. Tokens are a breakthrough in open network design that enable: 1) the creation of open, decentralized networks that combine the best architectural properties of open and proprietary networks, and 2) new ways to incentivize open network participants, including users, developers, investors, and service providers. By enabling the development of new open networks, tokens could help reverse the centralization of the internet, thereby keeping it accessible, vibrant and fair, and resulting in greater innovation.</p>
<h2>Crypto tokens: unbundling Bitcoin</h2>
<p>Bitcoin was introduced in 2008 with the publication of <a href="https://en.wikipedia.org/wiki/Satoshi_Nakamoto">Satoshi Nakamoto’s</a> landmark <a href="https://bitcoin.org/bitcoin.pdf">paper</a> that proposed a novel, decentralized payment system built on an underlying technology now known as a <a href="https://en.wikipedia.org/wiki/Blockchain">blockchain</a>. Most fans of Bitcoin (including <a href="/2013/12/31/why-im-interested-in-bitcoin/">me</a>) mistakenly thought Bitcoin was solely a breakthrough in financial technology. (It was easy to make this mistake: Nakamoto himself described it as a “peer-to-peer electronic cash system.”)</p>
<p><img src="images/1MQ68XZTGHQG7E6ut5UimEw.jpeg" alt="2009: Satoshi Nakamoto’s (post) announcing Bitcoin"></p>
<p>In retrospect, Bitcoin was really two innovations: 1) a <a href="https://en.wikipedia.org/wiki/Store_of_value">store of value</a> for people who wanted an alternative to the existing financial system, and 2) a new way to develop open networks. Tokens unbundle the latter innovation from the former, providing a general method for designing and growing open networks.</p>
<p>Networks — computing networks, developer platforms, marketplaces, social networks, etc — have always been a powerful part of the promise of the internet. Tens of thousands of networks have been incubated by developers and entrepreneurs, yet only a very small percentage of those have survived, and most of those were owned and controlled by private companies. The current state of the art of network development is very crude. It often involves raising money (venture capital is a common source of funding) and then spending it on paid marketing and other channels to overcome the “bootstrap problem” — the problem that networks tend to only become useful when they reach a critical mass of users. In the rare cases where networks succeed, the financial returns tend to accrue to the relatively small number of people who own equity in the network. Tokens offer a better way.</p>
<p>Ethereum, introduced in 2014 and launched in 2015, was the first major non-Bitcoin token network. The lead developer, <a href="https://a16z.com/2016/08/28/ethereum/">Vitalik Buterin</a>, had previously tried to create smart contract languages on top of the Bitcoin blockchain. Eventually he realized that (by design, mostly) Bitcoin was too limited, so a new approach was needed.</p>
<p><img src="images/1Crmcqo6mdF1okzHt4Bdp4g.png" alt="2014: Vitalik Buterin’s (forum post) announcing Ethereum"></p>
<p>Ethereum is a network that allows developers to run “smart contracts” — snippets of <a href="https://en.wikipedia.org/wiki/Ethereum#Smart_contracts">code</a> submitted by developers that are executed by a distributed network of computers. Ethereum has a corresponding token called Ether that can be purchased, either to hold for financial purposes or to use by purchasing computing power (known as “<a href="https://ethereum.stackexchange.com/questions/3/what-is-gas-and-transaction-fee-in-ethereum">gas</a>”) on the network. Tokens are also given out to “miners,” the computers on the decentralized network that execute smart contract code (you can think of miners as playing the role of cloud hosting services like <a href="https://en.wikipedia.org/wiki/Amazon_Web_Services">AWS</a>). Third-party developers can write their own <a href="https://dapps.ethercasts.com/">applications</a> that live on the network, and can charge Ether to generate revenue.</p>
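<p>Gas is the metering that makes this workable: every operation a contract executes costs gas, paid for in Ether, so runaway or malicious code simply halts when its budget is exhausted. A toy illustration with invented opcode costs (real EVM costs differ):</p>
<pre><code>GAS_COST = {"ADD": 3, "MUL": 5}  # invented costs, for illustration

def execute(program, gas_limit):
    acc, gas_used = 0, 0
    for op, arg in program:
        gas_used += GAS_COST[op]
        if gas_used > gas_limit:
            raise RuntimeError("out of gas")
        if op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
    return acc, gas_used

print(execute([("ADD", 2), ("MUL", 10)], gas_limit=100))  # (20, 8)
</code></pre>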
<p>Ethereum is inspiring a new wave of token networks. (It also provided a simple way for new token networks to launch on top of the Ethereum network, using a standard known as <a href="https://github.com/ethereum/EIPs/issues/20">ERC20</a>). Developers are building token networks for a wide range of use cases, including distributed <a href="http://filecoin.io/">computing</a> <a href="https://golem.network/">platforms</a>, <a href="https://augur.net/">prediction</a> and financial markets, incentivized <a href="https://steem.io/">content creation networks</a>, and <a href="https://basicattentiontoken.org/">attention and advertising networks</a>. Many more networks will be invented and launched in the coming months and years.</p>
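<p>At its core, an ERC20 token is a small shared interface over a balance ledger. A simplified sketch of that core, transfers plus delegated allowances (the real standard also specifies events, metadata, and other details):</p>
<pre><code>class Token:
    def __init__(self, supply, owner):
        self.balances = {owner: supply}
        self.allowances = {}  # (owner, spender): remaining amount

    def transfer(self, sender, to, amount):
        assert self.balances.get(sender, 0) >= amount, "insufficient balance"
        self.balances[sender] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

    def approve(self, owner, spender, amount):
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, spender, owner, to, amount):
        assert self.allowances.get((owner, spender), 0) >= amount, "not approved"
        self.allowances[(owner, spender)] -= amount
        self.transfer(owner, to, amount)

token = Token(supply=1_000_000, owner="alice")
token.transfer("alice", "bob", 100)
token.approve("alice", "exchange", 500)  # delegate spending rights
token.transfer_from("exchange", "alice", "carol", 200)
print(token.balances)  # {'alice': 999700, 'bob': 100, 'carol': 200}
</code></pre>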
<p>Below I walk through the two main benefits of the token model, the first architectural and the second involving incentives.</p>
<h2>Tokens enable the management and financing of open services</h2>
<p>Proponents of open systems never had an effective way to manage and fund operating services, leading to a significant architectural disadvantage compared to their proprietary counterparts. This was particularly evident during the last internet mega-battle between open and closed networks: the social wars of the late 2000s. As Alexis Madrigal recently <a href="https://www.theatlantic.com/technology/archive/2017/05/a-very-brief-history-of-the-last-10-years-in-technology/526767/?utm_source=atltw">wrote</a>, back in 2007 it looked like open networks would dominate going forward:</p>
<blockquote>
<p>In 2007, the web people were triumphant. Sure, the dot-com boom had busted, but empires were being built out of the remnant swivel chairs and fiber optic cables and unemployed developers. Web 2.0 was not just a temporal description, but an ethos. The web would be open. A myriad of services would be built, communicating through APIs, to provide the overall internet experience.</p>
</blockquote>
<p>But with the launch of the iPhone and the rise of smartphones, proprietary networks quickly won out:</p>
<blockquote>
<p>As that world-historical explosion began, a platform war came with it. The Open Web lost out quickly and decisively. By 2013, Americans spent about as much of their time on their phones <a href="http://www.marketingcharts.com/online/smart-device-users-spend-as-much-time-on-facebook-as-the-mobile-web-28422/">looking at Facebook</a> as they did the whole rest of the open web.</p>
</blockquote>
<p>Why did open social protocols get so decisively defeated by proprietary social networks? The rise of smartphones was only part of the story. Some open protocols — like email and the web — survived the transition to the mobile era. Open protocols relating to social networks were high quality and abundant (e.g. <a href="https://en.wikipedia.org/wiki/RSS">RSS</a>, <a href="http://xmlns.com/foaf/spec/">FOAF</a>, <a href="https://en.wikipedia.org/wiki/XHTML_Friends_Network">XFN</a>, <a href="http://openid.net/">OpenID</a>). What the open side lacked was a mechanism for encapsulating software, databases, and protocols together into easy-to-use services.</p>
<p>For example, in 2007, Wired magazine ran an <a href="https://www.wired.com/2007/08/open-social-net/">article</a> in which they tried to create their own social network using open tools:</p>
<blockquote>
<p>For the last couple of weeks, Wired News tried to roll its own Facebook using free web tools and widgets. We came close, but we ultimately failed. We were able to recreate maybe 90 percent of Facebook’s functionality, but not the most important part — a way to link people and declare the nature of the relationship.</p>
</blockquote>
<p>Some developers <a href="http://bradfitz.com/social-graph-problem/">proposed</a> solving this problem by creating a database of social graphs run by a non-profit organization:</p>
<blockquote>
<p><strong>Establish a non-profit and open source software</strong> (with copyrights held by the non-profit) which collects, merges, and redistributes the graphs from all other social network sites into one global aggregated graph. This is then made available to other sites (or users) via both public APIs (for small/casual users) and downloadable data dumps, with an update stream / APIs, to get iterative updates to the graph (for larger users).</p>
</blockquote>
<p>These open schemes required widespread coordination among standards bodies, server operators, app developers, and sponsoring organizations to mimic the functionality that proprietary services could provide all by themselves. As a result, proprietary services were able to create better user experiences and iterate much faster. This led to faster growth, which in turn led to greater investment and revenue, which then fed back into product development and further growth. Thus began a flywheel that drove the meteoric rise of proprietary social networks like Facebook and Twitter.</p>
<p>Had the token model for network development existed back in 2007, the playing field would have been much more level. First, tokens provide a way not only to define a protocol, but to fund the operating expenses required to host it as a service. Bitcoin and Ethereum have tens of thousands of servers around the world (“miners”) that run their networks. They cover the hosting costs with built-in mechanisms that automatically distribute token rewards to computers on the network (“mining rewards”).</p>
<p><img src="images/1-lu1cuJeeDIFPsDpPPo8lw.png" alt="There are over 20,000 Ethereum nodes around the world (source)"></p>
<p>Second, tokens provide a model for creating shared computing resources (<a href="https://medium.com/@FEhrsam/the-dapp-developer-stack-the-blockchain-industry-barometer-8d55ec1c7d4">including</a> databases, compute, and file storage) while keeping the control of those resources decentralized (and without requiring an organization to maintain them). This is the blockchain technology that has been talked about <a href="https://trends.google.com/trends/explore?q=blockchain">so much</a>. Blockchains would have allowed shared social graphs to be stored on a decentralized network. With the tools available today, it would be easy for the Wired author to create an open social network.</p>
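<p>The underlying data structure is simpler than the buzz suggests: a chain of blocks in which each block commits to its predecessor by including that block’s hash, so editing any historical record invalidates everything after it. A minimal sketch in Python, ignoring mining, consensus, and networking entirely:</p>
<pre><code>import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    """Each new block commits to the previous block via its hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def verify(chain):
    """Recheck every link; editing an old block breaks all later ones."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
append_block(chain, {"user": "alice", "follows": "bob"})  # a social graph entry
append_block(chain, {"user": "bob", "follows": "carol"})
print(verify(chain))                                      # True
chain[0]["data"]["follows"] = "mallory"                   # tamper with history
print(verify(chain))                                      # False
</code></pre>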
<h2>Tokens align incentives among network participants</h2>
<p>Some of the <a href="/2009/09/14/the-inevitable-showdown-between-twitter-and-twitter-apps/">fiercest battles</a> in tech are between <a href="https://en.wikipedia.org/wiki/Complementary_good">complements</a>. There were, for example, hundreds of startups that tried to build businesses on the APIs of social networks only to have the terms change later on, forcing them to pivot or shut down. Microsoft’s battles with complements like Netscape and Intuit are legendary. Battles within ecosystems are so common and drain so much energy that business books are full of frameworks for how one company can squeeze profits from adjacent businesses (e.g. Porter’s <a href="https://en.wikipedia.org/wiki/Porter%27s_five_forces_analysis">five forces</a> model).</p>
<p>Token networks remove this friction by aligning network participants to work together toward a common goal — the growth of the network and the appreciation of the token. This alignment is one of the main reasons Bitcoin continues to defy <a href="https://99bitcoins.com/bitcoinobituaries/">skeptics</a> and flourish, even as new token networks like Ethereum have grown alongside it.</p>
<p>Moreover, well-designed token networks include an efficient mechanism to incentivize network participants to overcome the bootstrap problem that bedevils traditional network development. For example, <a href="https://steemit.com/">Steemit</a> is a decentralized Reddit-like token network that makes payments to users who post and upvote articles. When Steemit launched last year, the community was <a href="https://coinreport.net/social-network-steemit-distributes-1-3-million-first-cryptocurrency-payout-users/">pleasantly surprised</a> by its first significant payout to users.</p>
<p><img src="images/1mi0v6PNlGnjL9QH-AWZxAA.png" alt="Tokens help overcome the bootstrap problem by adding financial utility when application utility is low"></p>
<p>This in turn led to the appreciation of Steemit tokens, which increased future payouts, feeding a <a href="https://www.usv.com/blog/fat-protocols">virtuous cycle</a> in which more users led to more investment, and vice versa. Steemit is still a beta project and has had mixed results since launch, but it was an interesting experiment in how to generalize the mutually reinforcing interaction between users and investors that Bitcoin and Ethereum first demonstrated.</p>
<p>A lot of attention has been paid to token pre-sales (so-called “ICOs”), but they are just one of several ways in which the token model innovates on network incentives. A well-designed token network carefully manages the distribution of tokens across all five groups of network participants (users, core developers, third-party developers, investors, service providers) to maximize the growth of the network.</p>
<p>One way to think about the token model is to imagine if the internet and web hadn’t been funded by governments and universities, but instead by a company that raised money by selling off domain names. People could buy domain names either to use them or as an investment (collectively, domain names are worth tens of billions of dollars today). Similarly, domain names could have been given out as rewards to service providers who agreed to run hosting services, and to third-party developers who supported the network. This would have provided an alternative way to finance and accelerate the development of the internet while also aligning the incentives of the various network participants.</p>
<h2>The open network movement</h2>
<p>The cryptocurrency movement is the spiritual heir to previous open computing movements, including the open source software movement led most visibly by Linux, and the open information movement led most visibly by Wikipedia.</p>
<p><img src="images/1U0B5FlpNVXSXeIcqodktLQ.png" alt="1991: Linus Torvalds’ forum (post) announcing Linux; 2001: the first Wikipedia (page)"></p>
<p>Both of these movements were once niche and <a href="https://medium.com/@cdixon/it-s-hard-to-believe-today-but-10-years-ago-wikipedia-was-widely-considered-a-doomed-experiment-a7a0dfd27b8b">controversial</a>. Today Linux is the dominant worldwide operating system, and Wikipedia is the most popular informational website in the world.</p>
<p>Crypto tokens are currently niche and controversial. If present trends continue, they will soon be seen as a breakthrough in the design and development of open networks, combining the societal benefits of open protocols with the financial and architectural benefits of proprietary networks. They are also an extremely promising development for those hoping to keep the internet accessible to entrepreneurs, developers, and other independent creators.</p>
]]></content:encoded>
</item>
<item>
<title>How Aristotle Created the Computer</title>
<link>https://cdixon.org/2017/02/20/aristotle-computer/</link>
<guid>https://cdixon.org/2017/02/20/aristotle-computer/</guid>
<pubDate>Mon, 20 Feb 2017 00:00:00 GMT</pubDate>
<description>The philosophers he influenced set the stage for the technological revolution that remade our world. Originally published by The Atlantic. The history of computers is often told as a history ...</description>
<content:encoded><![CDATA[<h2>The philosophers he influenced set the stage for the technological revolution that remade our world.</h2>
<p><em>Originally published by <a href="https://www.theatlantic.com/technology/archive/2017/03/aristotle-computer/518697/">The Atlantic</a>.</em></p>
<p>The history of computers is often told as a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II. In fact, it is better understood as a history of ideas, mainly ideas that emerged from mathematical logic, an obscure and cult-like discipline that first developed in the 19th century. Mathematical logic was pioneered by philosopher-mathematicians, most notably George Boole and Gottlob Frege, who were themselves inspired by Leibniz’s dream of a universal “concept language,” and the ancient logical system of Aristotle.</p>
<p>Mathematical logic was initially considered a hopelessly abstract subject with no conceivable applications. As one computer scientist <a href="http://bactra.org/notebooks/mathematical-logic.html">commented</a>: “If, in 1901, a talented and sympathetic outsider had been called upon to survey the sciences and name the branch which would be least fruitful in [the] century ahead, his choice might well have settled upon mathematical logic.” And yet, it would provide the foundation for a field that would have more impact on the modern world than any other.</p>
<p>The evolution of computer science from mathematical logic culminated in the 1930s, with two landmark papers: Claude Shannon’s “<a href="http://www.ccapitalia.net/descarga/docs/1938-shannon-analysis-relay-switching-circuits.pdf">A Symbolic Analysis of Relay and Switching Circuits</a>,” and Alan Turing’s “<a href="http://www.dna.caltech.edu/courses/cs129/caltech_restricted/Turing_1936_IBID.pdf">On Computable Numbers, With an Application to the <em>Entscheidungsproblem</em></a>.” In the history of computer science, Shannon and Turing are towering figures, but the importance of the philosophers and logicians who preceded them is frequently overlooked.</p>
<p>A well-known history of computer science describes Shannon’s paper as “possibly the most important, and also the most noted, master’s thesis of the century.” Shannon wrote it as an electrical engineering student at MIT. His adviser, Vannevar Bush, built a prototype computer known as the <a href="http://www.mit.edu/~klund/analyzer/">Differential Analyzer</a> that could rapidly calculate differential equations. The device was mostly mechanical, with subsystems controlled by electrical relays, which were organized in an ad hoc manner as there was not yet a systematic theory underlying circuit design. Shannon’s thesis topic came about when Bush recommended he try to discover such a theory.</p>
<p>Shannon’s paper is in many ways a typical electrical-engineering paper, filled with equations and diagrams of electrical circuits. What is unusual is that the primary reference was a 90-year-old work of mathematical philosophy, George Boole’s <em>The Laws of Thought</em>.</p>
<p>Today, Boole’s name is well known to computer scientists (many programming languages have a basic data type called a Boolean), but in 1938 he was rarely read outside of philosophy departments. Shannon himself encountered Boole’s work in an undergraduate philosophy class. “It just happened that no one else was familiar with both fields at the same time,” he <a href="http://georgeboole.com/boole/legacy/engineering/">commented</a> later.</p>
<p>Boole is often described as a mathematician, but he saw himself as a philosopher, following in the footsteps of Aristotle. <em>The Laws of Thought</em> begins with a description of his goal: to investigate the fundamental laws of the operation of the human mind:</p>
<blockquote>
<p>The design of the following treatise is to investigate the fundamental laws of those operations of the mind by which reasoning is performed; to give expression to them in the symbolical language of a Calculus, and upon this foundation to establish the science of Logic … and, finally, to collect … some probable intimations concerning the nature and constitution of the human mind.</p>
</blockquote>
<p>He then pays tribute to Aristotle, the inventor of logic, and the primary influence on <a href="http://www.gutenberg.org/files/15114/15114-pdf.pdf">his own work</a>:</p>
<blockquote>
<p>In its ancient and scholastic form, indeed, the subject of Logic stands almost exclusively associated with the great name of Aristotle. As it was presented to ancient Greece in the partly technical, partly metaphysical disquisitions of The Organon, such, with scarcely any essential change, it has continued to the present day.</p>
</blockquote>
<p>Trying to improve on the logical work of Aristotle was an intellectually daring move. Aristotle’s logic, presented in his six-part book <em>The Organon</em>, occupied a central place in the scholarly canon for more than 2,000 years. It was widely believed that Aristotle had written almost all there was to say on the topic. The great philosopher Immanuel Kant <a href="https://books.google.com/books?id=WJVYp0C0taYC&pg=PA36&lpg=PA36&dq=unable+to+take+a+single+step+forward,+and+therefore+seems+to+all+appearance+to+be+finished+and+complete&source=bl&ots=W4Lrt9I80J&sig=KpZlOd-Yc9brgTksIJJZcxUD-Mg&hl=en&sa=X&ved=0ahUKEwjeg8i1iLvQAhVH6IMKHTMXDMgQ6AEIHTAA#v=onepage&q=unable%20to%20take%20a%20single%20step%20forward%2C%20and%20therefore%20seems%20to%20all%20appearance%20to%20be%20finished%20and%20complete&f=false">commented</a> that, since Aristotle, logic had been “unable to take a single step forward, and therefore seems to all appearance to be finished and complete.”</p>
<p>Aristotle’s central observation was that arguments were valid or not based on their logical structure, independent of the non-logical words involved. The most famous argument schema he discussed is known as the syllogism:</p>
<ul>
<li>All men are mortal.</li>
<li>Socrates is a man.</li>
<li>Therefore, Socrates is mortal.</li>
</ul>
<p>You can replace “Socrates” with any other object, and “mortal” with any other predicate, and the argument remains valid. The validity of the argument is determined solely by the logical structure. The logical words — “all,” “is,” “are,” and “therefore” — are doing all the work.</p>
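<p>Treating predicates as sets makes this structural point mechanical: the syllogism holds for any objects and predicates you substitute in. A few lines of Python (a toy illustration, with made-up sets) can check it:</p>
<pre><code># "All X are Y; s is an X; therefore s is a Y" is valid by structure alone.
def syllogism(x, y, s):
    """Given the premises (x is a subset of y, and s is in x),
    the conclusion (s is in y) is guaranteed."""
    assert x.issubset(y) and s in x  # the two premises
    return s in y                    # the conclusion

men = {"Socrates", "Plato"}
mortal = men | {"Fido"}  # every man is mortal; so, as it happens, is the dog
print(syllogism(men, mortal, "Socrates"))  # True

# Substitute any other terms; the logical structure still does the work.
dogs = {"Fido"}
print(syllogism(dogs, mortal, "Fido"))     # True
</code></pre>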
<p>Aristotle also defined a set of basic axioms from which he derived the rest of his logical system:</p>
<ul>
<li>An object is what it is (Law of Identity)</li>
<li>No statement can be both true and false (Law of Non-contradiction)</li>
<li>Every statement is either true or false (Law of the Excluded Middle)</li>
</ul>
<p>These axioms weren’t meant to describe how people actually think (that would be the realm of psychology), but how an idealized, perfectly rational person ought to think.</p>
<p>Aristotle’s axiomatic method influenced an even more famous book, Euclid’s <em>Elements</em>, which is <a href="https://en.wikipedia.org/wiki/Euclid%27s_Elements">estimated</a> to be second only to the Bible in the number of editions printed.</p>
<p><img src="images/2c8ad9d68.png" alt="A fragment of the Elements (Wikimedia Commons)"></p>
<p>Although ostensibly about geometry, the <em>Elements</em> became a standard textbook for teaching rigorous deductive reasoning. (Abraham Lincoln once said that he learned sound legal argumentation from studying Euclid.) In Euclid’s system, geometric ideas were represented as spatial diagrams. Geometry continued to be practiced this way until René Descartes, in the 1630s, showed that geometry could instead be represented as formulas. His <em>Discourse on Method</em> was the <a href="http://www.storyofmathematics.com/17th_descartes.html">first</a> mathematics text in the West to popularize what is now standard algebraic notation — x, y, z for variables, a, b, c for known quantities, and so on.</p>
<p>Descartes’s algebra allowed mathematicians to move beyond spatial intuitions to manipulate symbols using precisely defined formal rules. This shifted the dominant mode of mathematics from diagrams to formulas, leading to, among other things, the development of calculus, invented independently by Isaac Newton and Gottfried Leibniz roughly 30 years later.</p>
<p>Boole’s goal was to do for Aristotelean logic what Descartes had done for Euclidean geometry: free it from the limits of human intuition by giving it a precise algebraic notation. To give a simple example, when Aristotle wrote:</p>
<p>All men are mortal.</p>
<p>Boole replaced the words “men” and “mortal” with variables, and the logical words “all” and “are” with arithmetical operators:</p>
<p><em>x = x * y</em></p>
<p>This can be interpreted as “Everything in the set <em>x</em> is also in the set <em>y</em>.”</p>
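<p>Over the two values 0 and 1, Boole’s multiplication behaves like logical AND, and the equation can be checked by brute force: <em>x = x * y</em> fails only when something is in <em>x</em> but not in <em>y</em>, exactly the counterexample to “all <em>x</em> are <em>y</em>.” A quick check in Python:</p>
<pre><code># Boole's equation x = x*y over the two values {0, 1}: multiplication
# acts as logical AND.
for x in (0, 1):
    for y in (0, 1):
        print("x=%d, y=%d: x = x*y is %s" % (x, y, x == x * y))

# Output: the equation fails only at x=1, y=0 -- the one case where
# something is in x but not in y.
</code></pre>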
<p>The <em>Laws of Thought</em> created a new scholarly field—mathematical logic—which in the following years became one of the most active areas of research for mathematicians and philosophers. Bertrand Russell called the <em>Laws of Thought</em> “the work in which pure mathematics was discovered.”</p>
<p>Shannon’s insight was that Boole’s system could be mapped directly onto electrical circuits. At the time, electrical circuits had no systematic theory governing their design. Shannon realized that the right theory would be “exactly analogous to the calculus of propositions used in the symbolic study of logic.”</p>
<p>He showed the correspondence between electrical circuits and Boolean operations in a simple chart:</p>
<p><img src="images/99df968e4.png" alt="Shannon’s mapping from electrical circuits to symbolic logic (University of Virginia)"></p>
<p>This correspondence allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians. In the second half of his paper, Shannon showed how Boolean logic could be used to create a circuit for adding two binary digits.</p>
<p>By stringing these adder circuits together, arbitrarily complex arithmetical operations could be constructed. These circuits would become the basic building blocks of what are now known as <a href="https://en.wikipedia.org/wiki/Arithmetic_logic_unit">arithmetical logic units</a>, a key component in modern computers.</p>
<p><img src="images/2b88e5a1a.png" alt="Shannon’s adder circuit (University of Virginia)"></p>
<p>Another way to characterize Shannon’s achievement is that he was first to distinguish between the logical and the physical layer of computers. (This distinction has become so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time—a reminder of the adage that “the philosophy of one century is the common sense of the next.”)</p>
<p>Since Shannon’s paper, a vast amount of progress has been made on the physical layer of computers, including the invention of the transistor in 1947 by John Bardeen, Walter Brattain, and William Shockley at Bell Labs. Transistors are dramatically improved versions of Shannon’s electrical relays — the best known way to physically encode Boolean operations. Over the next 70 years, the semiconductor industry packed more and more transistors into smaller spaces. A 2016 iPhone <a href="http://www.macrumors.com/2016/09/12/cpu-improvements-iphone-7-apple-watch/">has</a> about 3.3 billion transistors, each one a “relay switch” like those pictured in Shannon’s diagrams.</p>
<p>While Shannon showed how to map logic onto the physical world, Turing showed how to design computers in the language of mathematical logic. When Turing wrote his paper, in 1936, he was trying to solve “the decision problem,” first identified by the mathematician David Hilbert, who asked whether there was an algorithm that could determine whether an arbitrary mathematical statement is true or false. In contrast to Shannon’s paper, Turing’s paper is highly technical. Its primary historical significance lies not in its answer to the decision problem, but in the template for computer design it provided along the way.</p>
<p>Turing was working in a tradition stretching back to Gottfried Leibniz, the philosophical giant who developed calculus independently of Newton. Among Leibniz’s many contributions to modern thought, one of the most intriguing was the idea of a new language he called the “<a href="https://en.wikipedia.org/wiki/Characteristica_universalis">universal characteristic</a>” that, he imagined, could represent all possible mathematical and scientific knowledge. Inspired in part by the 13th-century religious philosopher <a href="https://en.wikipedia.org/wiki/Ramon_Llull">Ramon Llull</a>, Leibniz postulated that the language would be ideographic like Egyptian hieroglyphics, except characters would correspond to “atomic” concepts of math and science. He argued this language would give humankind an “instrument” that could enhance human reason “to a far greater extent than optical instruments” like the microscope and telescope.</p>
<p>He also <a href="http://publicdomainreview.org/2016/11/10/let-us-calculate-leibniz-llull-and-computational-imagination/">imagined</a> a machine that could process the language, which he called the <em>calculus ratiocinator</em>.</p>
<blockquote>
<p>If controversies were to arise, there would be no more need of disputation between two philosophers than between two accountants. For it would suffice to take their pencils in their hands, and say to each other: Calculemus—Let us calculate.</p>
</blockquote>
<p>Leibniz didn’t get the opportunity to develop his universal language or the corresponding machine (although he did invent a relatively simple calculating machine, the <a href="https://en.wikipedia.org/wiki/Stepped_reckoner">stepped reckoner</a>). The first credible attempt to realize Leibniz’s dream came in 1879, when the German philosopher Gottlob Frege published his landmark logic treatise <em><a href="https://en.wikipedia.org/wiki/Begriffsschrift">Begriffsschrift</a></em>. Inspired by Boole’s attempt to improve Aristotle’s logic, Frege developed a much more advanced logical system. The logic taught in philosophy and computer-science classes today—first-order or predicate logic—is only a slight modification of Frege’s system.</p>
<p>Frege is generally considered one of the most important philosophers of the 19th century. Among other things, he is credited with catalyzing what noted philosopher Richard Rorty called the “<a href="https://en.wikipedia.org/wiki/Linguistic_turn">linguistic turn</a>” in philosophy. As Enlightenment philosophy was obsessed with questions of knowledge, philosophy after Frege became obsessed with questions of language. His disciples included two of the most important philosophers of the 20th century—Bertrand Russell and Ludwig Wittgenstein.</p>
<p>The major innovation of Frege’s logic is that it much more accurately represented the logical structure of ordinary language. Among other things, Frege was the first to use quantifiers (“for every,” “there exists”) and to separate objects from predicates. He was also the first to develop what today are fundamental concepts in computer science like recursive functions and variables with scope and binding.</p>
<p>Frege’s formal language — what he called his “concept-script” — is made up of meaningless symbols that are manipulated by well-defined rules. The language is only given meaning by an interpretation, which is specified separately (this distinction would later come to be called syntax versus semantics). This turned logic into what the eminent computer scientists Allen Newell and Herbert Simon called “the symbol game,” “played with meaningless tokens according to certain purely syntactic rules.”</p>
<blockquote>
<p>All meaning had been purged. One had a mechanical system about which various things could be proved. Thus progress was first made by walking away from all that seemed relevant to meaning and human symbols.</p>
</blockquote>
<p>As Bertrand Russell famously quipped: “Mathematics may be defined as the subject in which we never know what we are talking about, nor whether what we are saying is true.”</p>
<p>An unexpected consequence of Frege’s work was the discovery of weaknesses in the foundations of mathematics. For example, Euclid’s <em>Elements</em> — considered the gold standard of logical rigor for thousands of years — turned out to be full of logical mistakes. Because Euclid used ordinary words like “line” and “point,” he — and centuries of readers — deceived themselves into making assumptions about sentences that contained those words. To give one relatively simple example, in ordinary usage, the word “line” implies that if you are given three distinct points on a line, one point must be between the other two. But when you define “line” using formal logic, it turns out “between-ness” also needs to be defined—something Euclid overlooked. Formal logic makes gaps like this easy to spot.</p>
<p>This realization created a <a href="https://en.wikipedia.org/wiki/Foundations_of_mathematics#Foundational_crisis">crisis</a> in the foundation of mathematics. If the <em>Elements</em> — the bible of mathematics — contained logical mistakes, what other fields of mathematics did too? What about sciences like physics that were built on top of mathematics?</p>
<p>The good news is that the same logical methods used to uncover these errors could also be used to correct them. Mathematicians started rebuilding the foundations of mathematics from the bottom up. In 1889, Giuseppe Peano <a href="https://en.wikipedia.org/wiki/Peano_axioms">developed</a> axioms for arithmetic, and in 1899, David Hilbert <a href="https://en.wikipedia.org/wiki/Hilbert%27s_axioms">did</a> the same for geometry. Hilbert also outlined a program to formalize the remainder of mathematics, with specific requirements that any such attempt should satisfy, including:</p>
<ul>
<li><em>Completeness</em>: There should be a proof that all true mathematical statements can be proved in the formal system.</li>
<li><em>Decidability</em>: There should be an algorithm for deciding the truth or falsity of any mathematical statement. (This is the “<em>Entscheidungsproblem</em>” or “decision problem” referenced in Turing’s paper.)</li>
</ul>
<p>Rebuilding mathematics in a way that satisfied these requirements became known as Hilbert’s program. Up through the 1930s, this was the focus of a core group of logicians including Hilbert, Russell, Kurt Gödel, John Von Neumann, Alonzo Church, and, of course, Alan Turing.</p>
<p>Hilbert’s program proceeded on at least two fronts. On the first front, logicians created logical systems that tried to prove Hilbert’s requirements either satisfiable or not.</p>
<p>On the second front, mathematicians used logical concepts to rebuild classical mathematics. For example, Peano’s system for arithmetic starts with a simple function called the successor function which increases any number by one. He uses the successor function to recursively define <a href="https://en.wikipedia.org/wiki/Peano_axioms#Addition">addition</a>, uses addition to recursively define <a href="https://en.wikipedia.org/wiki/Peano_axioms#Multiplication">multiplication</a>, and so on, until all the operations of number theory are defined. He then uses those definitions, along with formal logic, to prove theorems about arithmetic.</p>
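<p>Peano’s construction translates almost directly into code: start from zero and a successor function, then define addition and multiplication by recursion. A sketch in Python, using ordinary integers as stand-ins for Peano numerals:</p>
<pre><code># Peano-style arithmetic: everything is built from zero and successor.
def succ(n):
    return n + 1  # stand-in for the primitive successor operation

def add(m, n):
    # m + 0 = m;  m + succ(n) = succ(m + n)
    return m if n == 0 else succ(add(m, n - 1))

def mul(m, n):
    # m * 0 = 0;  m * succ(n) = (m * n) + m
    return 0 if n == 0 else add(mul(m, n - 1), m)

print(add(2, 3))  # 5
print(mul(2, 3))  # 6
</code></pre>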
<p>The historian Thomas Kuhn once observed that “in science, novelty emerges only with difficulty.” Logic in the era of Hilbert’s program was a tumultuous process of creation and destruction. One logician would build up an elaborate system and another would tear it down.</p>
<p>The favored tool of destruction was the construction of self-referential, paradoxical statements that showed the axioms from which they were derived to be inconsistent. A simple form of this “liar’s paradox” is the sentence:</p>
<p>This sentence is false.</p>
<p>If it is true then it is false, and if it is false then it is true, leading to an endless loop of self-contradiction.</p>
<p>Russell made the first notable use of the liar’s paradox in mathematical logic. He showed that Frege’s system allowed self-contradicting sets to be derived:</p>
<blockquote>
<p>Let <em>R</em> be the set of all sets that are not members of themselves. If <em>R</em> is not a member of itself, then its definition dictates that it must contain itself, and if it contains itself, then it contradicts its own definition as the set of all sets that are not members of themselves.</p>
</blockquote>
<p>This became known as Russell’s paradox and was seen as a serious flaw in Frege’s achievement. (Frege himself was shocked by this discovery. He replied to Russell: “Your discovery of the contradiction caused me the greatest surprise and, I would almost say, consternation, since it has shaken the basis on which I intended to build my arithmetic.”)</p>
<p>Russell and his colleague Alfred North Whitehead put forth the most ambitious attempt to complete Hilbert’s program with the <em>Principia Mathematica</em>, published in three volumes between 1910 and 1913. The <em>Principia’s</em> method was so detailed that it took over 300 pages to get to the proof that 1+1=2.</p>
<p>Russell and Whitehead tried to resolve the paradox by introducing what they called type theory. The idea was to partition formal languages into multiple levels or types. Each level could make reference to levels below, but not to their own or higher levels. This resolved self-referential paradoxes by, in effect, banning self-reference. (This solution was not popular with logicians, but it did influence computer science — most modern computer languages have features inspired by type theory.)</p>
<p>Self-referential paradoxes ultimately showed that Hilbert’s program could never be successful. The first blow came in 1931, when Gödel published his now famous incompleteness theorem, which proved that any consistent logical system powerful enough to encompass arithmetic must also contain statements that are true but cannot be proven to be true. (Gödel’s incompleteness theorem is one of the few logical results that has been broadly popularized, thanks to books like <a href="https://en.wikipedia.org/wiki/G%C3%B6del,_Escher,_Bach">Gödel, Escher, Bach</a> and <a href="https://www.amazon.com/dp/B00ARGXG7Q/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1">The Emperor’s New Mind</a>).</p>
<p>The final blow came when Turing and Alonzo Church independently proved that no algorithm could exist that determined whether an arbitrary mathematical statement was true or false. (Church did this by inventing an entirely different system called the <a href="https://en.wikipedia.org/wiki/Lambda_calculus">lambda calculus</a>, which would later inspire computer languages like <a href="https://en.wikipedia.org/wiki/Lisp_%28programming_language%29">Lisp</a>.) The answer to the decision problem was negative.</p>
<p>Turing’s key insight came in the first section of his famous 1936 paper, “On Computable Numbers, With an Application to the <em>Entscheidungsproblem</em>.” In order to rigorously formulate the decision problem (the “<em>Entscheidungsproblem</em>”), Turing first created a mathematical model of what it means to be a computer (today, machines that fit this model are known as “universal Turing machines”). As the logician Martin Davis describes it:</p>
<blockquote>
<p>Turing knew that an algorithm is typically specified by a list of rules that a person can follow in a precise mechanical manner, like a recipe in a cookbook. He was able to show that such a person could be limited to a few extremely simple basic actions without changing the final outcome of the computation.</p>
<p>Then, by proving that no machine performing only those basic actions could determine whether or not a given proposed conclusion follows from given premises using Frege’s rules, he was able to conclude that no algorithm for the Entscheidungsproblem exists.</p>
<p>As a byproduct, he found a mathematical model of an all-purpose computing machine.</p>
</blockquote>
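<p>The model Turing arrived at is startlingly small: a tape, a read/write head, and a finite table of rules saying, for each pair of state and symbol, what to write, which way to move, and which state to enter next. A minimal simulator in Python (a toy illustration, here running a machine that flips the bits of its input):</p>
<pre><code># A minimal Turing machine simulator. Rules map (state, symbol) to
# (symbol to write, direction to move, next state).
def run_turing_machine(rules, tape, state="start"):
    cells, head = dict(enumerate(tape)), 0
    while state != "halt":
        symbol = cells.get(head, "_")  # "_" is the blank symbol
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# A machine that flips every bit of its input, halting at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(flip, "10110"))  # 01001_
</code></pre>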
<p>Next, Turing showed how a program could be stored inside a computer alongside the data upon which it operates. In today’s vocabulary, we’d say that he invented the “stored-program” architecture that underlies most modern computers:</p>
<blockquote>
<p>Before Turing, the general supposition was that in dealing with such machines the three categories — machine, program, and data — were entirely separate entities. The machine was a physical object; today we would call it hardware. The program was the plan for doing a computation, perhaps embodied in punched cards or connections of cables in a plugboard. Finally, the data was the numerical input. Turing’s universal machine showed that the distinctness of these three categories is an illusion.</p>
</blockquote>
<p>This was the first rigorous demonstration that any computing logic that could be encoded in hardware could also be encoded in software. The architecture Turing described was later dubbed the “Von Neumann architecture” — but modern historians generally agree it came from Turing, as, apparently, did Von Neumann <a href="https://en.wikipedia.org/wiki/Alan_Turing#cite_note-36">himself</a>.</p>
<p>Although, on a technical level, Hilbert’s program was a failure, the efforts along the way demonstrated that large swaths of mathematics could be constructed from logic. And after Shannon and Turing’s insights—showing the connections between electronics, logic and computing—it was now possible to export this new conceptual machinery over to computer design.</p>
<p>During World War II, this theoretical work was put into practice, when government labs conscripted a number of elite logicians. Von Neumann joined the atomic bomb project at Los Alamos, where he worked on computer design to support physics research. In 1945, he wrote the <a href="http://www.virtualtravelog.net/wp/wp-content/media/2003-08-TheFirstDraft.pdf">specification</a> of the EDVAC—the first stored-program, logic-based computer—which is generally considered the definitive source guide for modern computer design.</p>
<p>Turing joined a secret unit at Bletchley Park, northwest of London, where he helped design computers that were instrumental in breaking German codes. His most enduring contribution to practical computer design was his specification of the ACE, or Automatic Computing Engine.</p>
<p>As the first computers to be based on Boolean logic and stored-program architectures, the ACE and the EDVAC were similar in many ways. But they also had interesting differences, some of which foreshadowed modern debates in computer design. Von Neumann’s favored designs were similar to modern CISC (“complex”) processors, baking rich functionality into hardware. Turing’s design was more like modern RISC (“reduced”) processors, minimizing hardware complexity and pushing more work to software.</p>
<p>Von Neumann thought computer programming would be a tedious, clerical job. Turing, by contrast, said computer programming “should be very fascinating. There need be no real danger of it ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.”</p>
<p>Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.</p>
<p>In the past decade or so, programming has started to change with the growing popularity of machine learning, which involves creating frameworks for machines to learn via statistical inference. This has brought programming closer to the other main branch of logic, inductive logic, which deals with inferring rules from specific instances.</p>
<p>Today’s most promising machine learning techniques use neural networks, which were first <a href="http://www.cse.chalmers.se/~coquand/AUTOMATA/mcp.pdf">invented</a> in the 1940s by Warren McCulloch and Walter Pitts, whose idea was to develop a calculus for neurons that could, like Boolean logic, be used to construct computer circuits. Neural networks remained esoteric until decades later, when they were combined with statistical techniques that allowed them to improve as they were fed more data. Recently, as computers have become increasingly adept at handling large data sets, these techniques have produced remarkable results. Programming in the future will likely mean exposing neural networks to the world and letting them learn.</p>
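<p>The McCulloch-Pitts neuron was itself a logic gate: it sums weighted inputs and fires when the sum reaches a threshold, and with the right weights it computes AND, OR, or NOT. A sketch in Python:</p>
<pre><code># A McCulloch-Pitts-style threshold neuron: fire (output 1) when the
# weighted sum of the inputs meets the threshold.
def neuron(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def AND(a, b): return neuron([a, b], [1, 1], threshold=2)
def OR(a, b):  return neuron([a, b], [1, 1], threshold=1)
def NOT(a):    return neuron([a], [-1], threshold=0)

print([AND(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 0, 0, 1]
print([OR(a, b) for a in (0, 1) for b in (0, 1)])   # [0, 1, 1, 1]
print([NOT(a) for a in (0, 1)])                     # [1, 0]
</code></pre>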
<p>This would be a fitting second act to the story of computers. Logic began as a way to understand the laws of thought. It then helped create machines that could reason according to the rules of deductive logic. Today, deductive and inductive logic are being combined to create machines that both reason and learn. What began, in Boole’s words, with an investigation “concerning the nature and constitution of the human mind,” could result in the creation of new minds—artificial minds—that might someday match or even exceed our own.</p>
]]></content:encoded>
</item>
<item>
<title>Gadgets and Computers</title>
<link>https://cdixon.org/2017/01/16/gadgets-and-computers/</link>
<guid>https://cdixon.org/2017/01/16/gadgets-and-computers/</guid>
<pubDate>Mon, 16 Jan 2017 00:00:00 GMT</pubDate>
<description>From Benedict Evans’ Cars as Feature Phones: This is a common theme in many classes of device: you start with a product that has a few electronic functions added, and ...</description>
<content:encoded><![CDATA[<p>From Benedict Evans’ <a href="http://ben-evans.com/benedictevans/2017/01/10/cars-as-featurephones">Cars as Feature Phones</a>:</p>
<blockquote>
<p>This is a common theme in many classes of device: you start with a product that has a few electronic functions added, and then those functions are delivered with chips, and perhaps they gain an interface and then a screen, and more and more functions (and probably multi-function buttons) — and then, somehow, you’ve built a little weird custom computer without actually meaning to, and all the little silos of features and functions become unmanageable, both at an interface level and also at a fundamental engineering level, and the whole thing gets replaced by a real computer with a real software platform. And this new computer is almost certainly made by a different company.
You could see this problem very clearly at Motorola, which developed as many as two dozen ‘operating systems’ — for phones, pagers, satellite phones, car-control, industrial devices, chip evaluation boards and so on and so on, and picked them for each device out of a metaphorical parts bin just as you’d choose a sensor or battery or any other component. And boy, they really knew how to write operating systems — they had dozens! With, probably, ‘<a href="https://www.technologyreview.com/s/508231/many-cars-have-a-hundred-million-lines-of-code/">millions of lines of code</a>’. This was exactly the right approach in 1995, but in 2005, again, the whole thing collapsed under its own weight, because they needed software as a platform rather than as a one-off component, and instead <a href="http://www.theregister.co.uk/Print/2012/11/29/rockman_on_motorola/">they had a mess</a>.</p>
</blockquote>
<p>The iPhone was the first mainstream cell phone that was also a proper computer. It had a full-fledged operating system and a (mostly) open developer platform. We are likely seeing the same pattern play out across the <a href="https://medium.com/software-is-eating-the-world/what-s-next-in-computing-e54b870b80cc#.bmdmkoc13">next generation of computers</a>: not only cars, but drones, IoT devices, wearables, etc. In the beginning, hardware-focused companies make gadgets with ever increasing laundry lists of features. Then a company with strong software expertise (often a new market entrant) comes along that replaces these feature-packed gadgets with full-fledged computers. These computers have proper (usually Unix-like) operating systems, open developer platforms, and streamlined user interfaces (increasingly, powered by AI).</p>
<p>This process takes time to play out. Apple waited more than a decade from the initial popularity of cell phones to the release of the first iPhone. And sometimes you don’t know the significance of a new computing device until many years later. It wasn’t obvious until around 2012 that iOS and Android smartphones would become the dominant form of computing (recall Facebook’s “<a href="https://techcrunch.com/2012/10/19/facebook-mobile-first/">pivot to mobile</a>” in 2012). Some people (including me) believe we’ve already entered the “computer phase” of consumer IoT with voice assistants like Alexa, but it will probably take years before we understand the enduring mainstream appeal of these devices.</p>
]]></content:encoded>
</item>
<item>
<title>As [Edwin] Land ultimately recognized, the adoption of his [polarized headlight] system was fatally…</title>
<link>https://cdixon.org/2016/09/25/as-edwin-land-ultimately-recognized-the-adoption-of-his-polarized-headlight-system-was-fatally/</link>
<guid>https://cdixon.org/2016/09/25/as-edwin-land-ultimately-recognized-the-adoption-of-his-polarized-headlight-system-was-fatally/</guid>
<pubDate>Sun, 25 Sep 2016 00:00:00 GMT</pubDate>
<description>As [Edwin] Land ultimately recognized, the adoption of his [polarized headlight] system was fatally hampered by the fact that there was no competitive advantage for any car company in using ...</description>
<content:encoded><![CDATA[<p>As [Edwin] Land ultimately recognized, the adoption of his [polarized headlight] system was fatally hampered by the fact that there was no competitive advantage for any car company in using it first. Since all cars needed to incorporate the technology as simultaneously as possible, it was either going to be all, either voluntarily or as directed by the government, or none. No state or federal governmental agency ever stepped in to direct the adoption of the technology in the way that seat belts would be required decades later. Herbert Nichols, a journalist with the Christian Science Monitor who had followed the story, believed that the industry killed the idea even though the demonstrations clearly showed that the system worked. According to Nichols, the industry concluded that it “just didn’t need anything to sell automobiles. They realized they could sell all the automobiles they could make.” Thus, with no economic or competitive incentive, why bother with a system that clearly added costs and admittedly presented implementation issues? After more than two decades, Land reluctantly gave up the fight.</p>
<p><strong>But he learned one very important lesson. “I knew then that I would never go into a commercial field that put a barrier between us and the customer.” Rather than deal with other companies as intermediaries, he would market his innovative products directly to the public. He believed “that the role of industry is to sense a deep human need, then bring science and technology to bear on filling that need. Any market already existing is inherently boring and dull.” Land, like Steve Jobs many decades later, believed that his company should “give people products they do not even know they want.” Fortunately, he already had such a product in mind.</strong></p>
<p>— <em><a href="https://www.amazon.com/dp/B00OHRYYFO/">A Triumph of Genius: Edwin Land, Polaroid, and the Kodak Patent War</a></em></p>
]]></content:encoded>
</item>
<item>
<title>Eleven Reasons To Be Excited About The Future of Technology</title>
<link>https://cdixon.org/2016/08/18/eleven-reasons-to-be-excited-about-the-future-of-technology/</link>
<guid>https://cdixon.org/2016/08/18/eleven-reasons-to-be-excited-about-the-future-of-technology/</guid>
<pubDate>Thu, 18 Aug 2016 00:00:00 GMT</pubDate>
<description>“The strongest force propelling human progress has been the swift advance and wide diffusion of technology.” — The Economist In the year 1820, a person could expect to live less ...</description>
<content:encoded><![CDATA[<blockquote>
<p>“The strongest force propelling human progress has been the swift advance and wide diffusion of technology.” — <a href="http://www.economist.com/node/841842">The Economist</a></p>
</blockquote>
<p>In the year 1820, a person could <a href="https://ourworldindata.org/life-expectancy/">expect to live</a> less than 35 years, 94% of the global population <a href="https://ourworldindata.org/world-poverty/">lived in extreme poverty</a>, and less than 20% of the population was literate. Today, human life expectancy is over 70 years, less than 10% of the global population lives in extreme poverty, and <a href="http://www.oecd.org/statistics/How-was-life.pdf">over 80% of people</a> are literate. These improvements are due mainly to advances in technology, beginning in the industrial age and continuing today in the information age.</p>
<p>There are many exciting new technologies that will continue to transform the world and improve human welfare. Here are eleven of them.</p>
<h2>1. Self-Driving Cars</h2>
<p>Self-driving cars exist today that are safer than human-driven cars in most driving conditions. Over the next 3–5 years they’ll get even safer, and will begin to go mainstream.</p>
<p><img src="images/1_HfoJs9tCyyr6VeLvD45wyQ.gif" alt=""></p>
<p>The <a href="http://www.who.int/mediacentre/factsheets/fs358/en/">World Health Organization estimates</a> that 1.25 million people die from car-related injuries per year. Half of the deaths are pedestrians, bicyclists, and motorcyclists hit by cars. Cars are the leading cause of death for people ages 15–29 years old.</p>
<p><img src="images/1_SNGdeK4GNUhjL6wlh7sfJw.png" alt=""></p>
<p>Just as cars reshaped the world in the 20th century, so will self-driving cars in the 21st century. In most cities, <a href="http://oldurbanist.blogspot.com.es/2011/12/we-are-25-looking-at-street-area.html">between 20–30%</a> of usable space is taken up by parking spaces, and most cars are parked <a href="http://www.reinventingparking.org/2013/02/cars-are-parked-95-of-time-lets-check.html">about 95%</a> of the time. Self-driving cars will be in almost continuous use (most likely hailed from a smartphone app), thereby dramatically reducing the need for parking. Cars will communicate with one another to avoid accidents and traffic jams, and riders will be able to spend commuting time on other activities like work, education, and socializing.</p>
<p><img src="images/1_k6w2wkkREpVeu9_cS2xxtg.png" alt="Source: Tech Insider"></p>
<h2>2. Clean Energy</h2>
<p>Attempts to fight climate change by reducing the demand for energy <a href="https://en.wikipedia.org/wiki/World_energy_consumption">haven’t worked</a>. Fortunately, scientists, engineers, and entrepreneurs have been working hard on the supply side to make clean energy convenient and cost-effective.</p>
<p>Due to steady technological and manufacturing advances, the price of solar cells has <a href="http://www.saskwind.ca/wind-cost-decline/">dropped 99.5% since 1977</a>. Solar will soon be more cost efficient than fossil fuels. The cost of wind energy has also dropped to an all-time low, and in the last decade represented about a <a href="http://energy.gov/articles/top-10-things-you-didnt-know-about-wind-power">third of newly installed</a> US energy capacity.</p>
<p>Forward-thinking organizations are taking advantage of this. For example, in India there is an initiative to convert airports to self-sustaining clean energy.</p>
<p><img src="images/1_idAW1ONI_iIeevzPaUv-pg.png" alt="Airport in Kochi, India (source: Clean Technica)"></p>
<p>Tesla is making high-performance, affordable electric cars, and <a href="http://www.treehugger.com/cars/tesla-built-858-new-charging-stations-us-over-past-12-months.html">installing</a> electric charging stations <a href="http://mashable.com/2016/04/01/tesla-supercharger-expansion/#v93tzyDFl5qR">worldwide</a>.</p>
<p><img src="images/1_YwcTRiWETVn4aXiZhEJtcg.png" alt="Tesla Model 3 and US supercharger locations"></p>
<p>There are hopeful signs that clean energy could soon be reaching a tipping point. For example, in Japan, there are now more electric charging stations than gas stations.</p>
<p><img src="images/1_RNmY6abYWA2n2W6EgP3lcA.png" alt="Source: The Guardian"></p>
<p>And Germany produces so much renewable energy, it sometimes produces even more than it can use.</p>
<p><img src="images/1_wETYiSDThJ5fQYIVWuw8aA.png" alt="Source: Time Magazine"></p>
<h2>3. Virtual and Augmented Reality</h2>
<p>Computer processors only recently became fast enough to power comfortable and convincing virtual and augmented reality experiences. Companies like Facebook, Google, Apple, and Microsoft are investing billions of dollars to make VR and AR more immersive, comfortable, and affordable.</p>
<p><img src="images/1_6cmd8P-bPYRU1olrJHsvfw.gif" alt="Toybox demo from Oculus"></p>
<p>People sometimes think VR and AR will be used only for gaming, but over time they will be used for all sorts of activities. For example, we’ll use them to manipulate 3-D objects:</p>
<p><img src="images/1_q_pqQCTcTETf4G-ARUm00A.jpeg" alt="Augmented reality computer interface (from Iron Man)"></p>
<p>To meet with friends and colleagues from around the world:</p>
<p><img src="images/1_MJcHcqCWEzGxDIVDGpcHcA.jpeg" alt="Augmented reality teleconference (from The Kingsman)"></p>
<p>And even for medical applications, like treating phobias or helping rehabilitate paralysis victims:</p>
<p><img src="images/1_q_J7Ql2iVfdDYc5t6hM98Q.png" alt="Source: New Scientist"></p>
<p>VR and AR have been dreamed about by science fiction fans for decades. In the next few years, they’ll finally become a mainstream reality.</p>
<h2>4. Drones and Flying Cars</h2>
<blockquote>
<p>“Roads? Where we’re going we don’t need… roads.” — Dr. Emmett Brown</p>
</blockquote>
<p>GPS started out as a military technology but is now used to hail taxis, get mapping directions, and hunt Pokémon. Likewise, drones started out as a military technology, but are increasingly being used for a wide range of consumer and commercial applications.</p>
<p>For example, drones are being used to inspect critical infrastructure like bridges and power lines, to survey areas struck by natural disasters, and for many other creative purposes, like fighting animal poaching.</p>
<p><img src="images/1_hLhAdWXECMyNLwrHfad6pA.png" alt="Source: NBC News"></p>
<p>Amazon and Google are building drones to deliver household items.</p>
<p><img src="images/1_s1eQciCtoaD_AaovzJouAA.gif" alt="Amazon delivery drone"></p>
<p>The startup <a href="http://flyzipline.com/product/">Zipline</a> uses drones to deliver medical supplies to remote villages that can’t be accessed by roads.</p>
<p><img src="images/1_BDepNtZOTWXNOi5F4Dk3Dg.png" alt="Source: The Verge"></p>
<p>There is also a new wave of startups working on flying cars (including <a href="http://www.bloomberg.com/news/articles/2016-06-09/welcome-to-larry-page-s-secret-flying-car-factories">two</a> funded by the cofounder of Google, Larry Page).</p>
<p><img src="images/1_FJyVIp3MI_k7mVM5obpSsA.png" alt="The Terrafugia TF-X flying car (source)"></p>
<p>Flying cars use the same advanced technology used in drones but are large enough to carry people. Due to advances in materials, batteries, and software, flying cars will be significantly more affordable and convenient than today’s planes and helicopters.</p>
<h2>5. Artificial Intelligence</h2>
<p><img src="images/1_I2dRn7D8ZZM7nI2IvvMFDw.jpeg" alt=""></p>
<blockquote>
<p>“It may be a hundred years before a computer beats humans at Go — maybe even longer.” — <a href="http://www.nytimes.com/1997/07/29/science/to-test-a-powerful-computer-play-an-ancient-game.html?pagewanted=all">New York Times, 1997</a></p>
<p>“Master of Go Board Game Is Walloped by Google Computer Program” — <a href="http://www.nytimes.com/2016/03/10/world/asia/google-alphago-lee-se-dol.html">New York Times, 2016</a></p>
</blockquote>
<p>Artificial intelligence has made rapid advances in the last decade, due to new algorithms and massive increases in data collection and computing power.</p>
<p>AI can be applied to almost any field. For example, in photography an AI technique called artistic style transfer transforms photographs into the style of a given painter:</p>
<p><img src="images/1_aHFJuj-jhnP4zHY1dD7tRA.png" alt="Source"></p>
<p>Google built an AI system that controls its datacenter power systems, saving hundreds of millions of dollars in energy costs.</p>
<p><img src="images/1_HpTNGOsV1a0PpqjQZNXKEQ.png" alt="Source: Bloomberg"></p>
<p>The broad promise of AI is to liberate people from repetitive mental tasks the same way the industrial revolution liberated people from repetitive physical tasks.</p>
<blockquote>
<p>“If AI can help humans become better chess players, it stands to reason that it can help us become better pilots, better doctors, better judges, better teachers.” — <a href="http://www.wired.com/2014/10/future-of-artificial-intelligence/">Kevin Kelly</a></p>
</blockquote>
<p>Some people worry that AI will destroy jobs. History has shown that while new technology does indeed eliminate jobs, it also creates new and better jobs to replace them. For example, with the advent of the personal computer, the number of typographer jobs dropped, but the increase in graphic designer jobs more than made up for it.</p>
<p><img src="images/1_c_lt2s5TuSoOfmPb_Rv46w.png" alt="Source: Harvard Business Review"></p>
<p>It is much easier to imagine jobs that will go away than new jobs that will be created. Today millions of people work as app developers, ride-sharing drivers, drone operators, and social media marketers — jobs that didn’t exist and would have been difficult to even imagine ten years ago.</p>
<h2>6. Pocket Supercomputers for Everyone</h2>
<p><img src="images/1_5tt6F_Cxnf5n7J5v6Lx0Ug.png" alt=""></p>
<p>By 2020, 80% of adults on earth will have an internet-connected smartphone. An iPhone 6 has about 2 billion transistors, roughly 625 times more transistors than a 1995 Intel Pentium computer. Today’s smartphones are what used to be considered supercomputers.</p>
<p><img src="images/1_vovBLv3ePKce3dPrU3q9Lg.png" alt="Visitors to the pope (source: Business Insider)"></p>
<p>Internet-connected smartphones give ordinary people abilities that, just a short time ago, were only available to an elite few:</p>
<blockquote>
<p>“Right now, a Masai warrior on a mobile phone in the middle of Kenya has better mobile communications than the president did 25 years ago. If he’s on a smart phone using Google, he has access to more information than the U.S. president did just 15 years ago.” — <a href="http://edition.cnn.com/2012/05/06/opinion/diamandis-abundance-innovation/">Peter Diamandis</a></p>
</blockquote>
<h2>7. Cryptocurrencies and Blockchains</h2>
<blockquote>
<p>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.” — <a href="http://farmerandfarmer.org/mastery/builder.html">Farmer & Farmer</a></p>
</blockquote>
<p>Protocols are the plumbing of the internet. Most of the protocols we use today were developed decades ago by academia and government. Since then, protocol development mostly stopped as energy shifted to developing proprietary systems like social networks and messaging apps.</p>
<p>Cryptocurrency and blockchain technologies are <a href="http://avc.com/2016/07/the-golden-age-of-open-protocols/">changing this</a> by providing a new business model for internet protocols. This year alone, <a href="https://medium.com/the-coinbase-blog/app-coins-and-the-dawn-of-the-decentralized-business-model-8b8c951e734f#.2atvp1cxd">hundreds of millions of dollars</a> were raised for a broad range of innovative blockchain-based protocols.</p>
<p>Protocols based on blockchains also have capabilities that previous protocols didn’t. For example, <a href="https://en.wikipedia.org/wiki/Ethereum">Ethereum</a> is a new blockchain-based protocol that can be used to create smart contracts and trusted databases that are immune to corruption and censorship.</p>
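<p>A toy Python sketch of the hash-chaining idea behind these properties: each block commits to the previous block’s hash, so altering any record breaks every later link. (This illustrates the general concept only, not Ethereum’s actual data structures.)</p>
<pre><code>import hashlib, json

def block_hash(block):
    # Hash the block's contents, including the previous block's hash.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain, prev = [], "0" * 64  # genesis marker
for record in ["alice pays bob 5", "bob pays carol 2"]:
    block = {"data": record, "prev": prev}
    chain.append(block)
    prev = block_hash(block)

# Tampering with an earlier record changes its hash, invalidating
# every later "prev" pointer -- which is what makes the log auditable.
chain[0]["data"] = "alice pays bob 500"
print(block_hash(chain[0]) == chain[1]["prev"])  # False
</code></pre>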
<h2>8. High-Quality Online Education</h2>
<p>While college tuition <a href="http://www.cnbc.com/2015/06/16/why-college-costs-are-so-high-and-rising.html">skyrockets</a>, anyone with a smartphone can study almost any topic online, accessing educational content that is mostly free and increasingly high-quality.</p>
<p>Encyclopedia Britannica <a href="http://www.csmonitor.com/Business/Latest-News-Wires/2012/0314/Encyclopaedia-Britannica-After-244-years-in-print-only-digital-copies-sold">used to cost $1,400</a>. Now anyone with a smartphone can instantly access Wikipedia. You used to have to go to school or buy programming books to learn computer programming. Now you can learn from a community of over 40 million programmers at <a href="http://stackoverflow.com">Stack Overflow</a>. YouTube has millions of hours of free tutorials and lectures, many of which are produced by top professors and universities.</p>
<p><img src="images/1_NZTqnqYbOPv6sf7gCVLz8g.png" alt="UC Berkeley Physics on Youtube"></p>
<p>The quality of online education is getting better all the time. For the last 15 years, <a href="http://ocw.mit.edu/index.htm">MIT has been recording lectures</a> and compiling materials that cover over 2000 courses.</p>
<blockquote>
<p>“The idea is simple: to publish all of our course materials online and make them widely available to everyone.” — Dick K.P. Yue, Professor, MIT School of Engineering</p>
</blockquote>
<p>As perhaps the greatest research university in the world, MIT has always been ahead of the trends. Over the next decade, expect many other schools to follow MIT’s lead.</p>
<p><img src="images/1_W-i0QTotXS-K4MU9qbpylQ.png" alt="Source: Futurism"></p>
<h2>9. Better Food through Science</h2>
<p><img src="images/1_O5VQyJRhI2-sHYzZPrHSBQ.png" alt="Source: National Geographic"></p>
<p>Earth is running out of farmable land and fresh water. This is partly because our food production systems are incredibly inefficient. It takes an astounding 1,799 gallons of water to produce 1 pound of beef.</p>
<p>Fortunately, a variety of new technologies are being developed to improve our food system.</p>
<p>For example, entrepreneurs are developing new food products that are tasty and nutritious substitutes for traditional foods but far more environmentally friendly. The startup <a href="http://www.impossiblefoods.com/">Impossible Foods</a> invented meat products that look and taste like the real thing but are actually made of plants.</p>
<p><img src="images/1_bUV4b3Xp0mvvdA8dp1hMtA.png" alt="Impossible Food’s plant-based burger (source: Tech Insider)"></p>
<p>Their burger <a href="http://www.impossiblefoods.com/our-burger">uses</a> 95% less land and 74% less water than traditional burgers, and produces 87% less greenhouse gas emissions. Other startups are creating plant-based replacements for <a href="http://ripplefoods.com/">milk</a>, <a href="https://www.hamptoncreek.com/">eggs</a>, and other common foods. <a href="http://soylent.com/">Soylent</a> is a healthy, inexpensive meal replacement that uses advanced engineered <a href="http://terravia.com/Terravia_Sustainability.pdf">ingredients</a> that are much friendlier to the environment than traditional ingredients.</p>
<p>Some of these products are developed using genetic modification, a powerful scientific technique that has been widely mischaracterized as dangerous. According to a <a href="https://www.geneticliteracyproject.org/2015/01/29/pewaaas-study-scientific-consensus-on-gmo-safety-stronger-than-for-global-warming/">study</a> by the Pew Research Center, 88% of scientists think genetically modified foods are safe.</p>
<p>Another exciting development in food production is automated indoor farming. Due to advances in solar energy, sensors, lighting, robotics, and artificial intelligence, indoor farms have become viable alternatives to traditional outdoor farms.</p>
<p><img src="images/1_0Jyjlgj1KU2yfBqo7quCLQ.png" alt="Aerofarms indoor farm (Source: New York Times)"></p>
<p>Compared to traditional farms, automated indoor farms use roughly one-tenth as much water and land. Crops can be harvested many more times per year, there is no dependency on the weather, and there is no need for pesticides.</p>
<h2>10. Computerized Medicine</h2>
<p>Until recently, computers were at the periphery of medicine, used primarily for research and record keeping. Today, the combination of computer science and medicine is leading to a variety of breakthroughs.</p>
<p><img src="images/1_IjKrWZdlbB2ksis_Dmia5A.png" alt=""></p>
<p>For example, just fifteen years ago, it cost $3B to sequence a human genome. Today, the cost is about a thousand dollars and continues to drop. Genetic sequencing will soon be a routine part of medicine.</p>
<p>Genetic sequencing generates massive amounts of data that can be analyzed using powerful data analysis software. One application is analyzing <a href="http://a16z.com/2016/06/09/freenome/">blood samples</a> for early detection of cancer. Further genetic analysis can help determine the <a href="http://www.businessinsider.com/super-cheap-genome-sequencing-by-2020-2014-10">best course</a> of treatment.</p>
<p>Another application of computers to medicine is in prosthetic limbs. Here a young girl is using prosthetic hands she controls using her upper-arm muscles:</p>
<p><img src="images/1_jVH1wxchOJ5qJzT46s907A.gif" alt="Source: Open Bionics"></p>
<p>Soon we’ll have the technology to control prosthetic limbs with just our thoughts using <a href="http://news.uci.edu/feature/to-walk-again/">brain-to-machine interfaces</a>.</p>
<p>Computers are also becoming increasingly effective at diagnosing diseases. By finding hidden patterns in 20 million cancer records, an artificial intelligence system recently diagnosed a rare disease that human doctors had failed to diagnose.</p>
<p><img src="images/1_OEgWlj9sp2mCV0PrT9yp8A.png" alt="Source: International Business Times"></p>
<h2>11. A New Space Age</h2>
<p>Since the beginning of the space age in the 1950s, the vast majority of space funding has come from governments. But that funding has been in decline: for example, NASA’s budget <a href="https://en.wikipedia.org/wiki/Budget_of_NASA">dropped</a> from about 4.5% of the federal budget in the 1960s to about 0.5% of the federal budget today.</p>
<p><img src="images/1_paniidrx59zPQjq_q6rUHA.png" alt="Source: Fortune"></p>
<p>The good news is that private space companies have started filling the void. These companies provide a wide range of products and services, including rocket launches, scientific research, communications and imaging satellites, and emerging speculative business models like asteroid mining.</p>
<p>The most famous private space company is Elon Musk’s SpaceX, which has successfully launched rockets into space and landed them back on Earth to be reused.</p>
<p><img src="images/1_5iiaQsTBu1tQ_hTy8fupXg.gif" alt="SpaceX Falcon 9 landing"></p>
<p>Perhaps the most intriguing private space company is <a href="http://www.planetaryresources.com/">Planetary Resources</a>, which is trying to pioneer a new industry: mining minerals from asteroids.</p>
<p><img src="images/1_6zvea6z14lJ6inZQsVBsBA.png" alt="Asteroid mining"></p>
<p>If successful, asteroid mining could lead to a new gold rush in outer space. Like previous gold rushes, this could lead to speculative excess, but also dramatically increased funding for new technologies and infrastructure.</p>
<hr>
<p>These are just a few of the amazing technologies we’ll see developed in the coming decades. 2016 is just the beginning of a new age of wonders. As futurist Kevin Kelly <a href="https://www.linkedin.com/pulse/internet-still-beginning-its-kevin-kelly">says</a>:</p>
<blockquote>
<p>If we could climb into a time machine, journey 30 years into the future, and from that vantage look back to today, we’d realize that most of the greatest products running the lives of citizens in 2050 were not invented until after 2016. People in the future will look at their holodecks and wearable virtual reality contact lenses and downloadable avatars and AI interfaces and say, “Oh, you didn’t really have the internet” — or whatever they’ll call it — “back then.”</p>
<p>So, the truth: Right now, today, in 2016 is the best time to start up. There has never been a better day in the whole history of the world to invent something. There has never been a better time with more opportunities, more openings, lower barriers, higher benefit/risk ratios, better returns, greater upside than now. Right now, this minute. This is the moment that folks in the future will look back at and say, “Oh, to have been alive and well back then!”</p>
</blockquote>
]]></content:encoded>
</item>
<item>
<title>“Steve Jobs supposedly said, returning to Apple, that his plan was to stay alive and grab onto the…</title>
<link>https://cdixon.org/2016/08/07/steve-jobs-supposedly-said-returning-to-apple-that-his-plan-was-to-stay-alive-and-grab-onto-the/</link>
<guid>https://cdixon.org/2016/08/07/steve-jobs-supposedly-said-returning-to-apple-that-his-plan-was-to-stay-alive-and-grab-onto-the/</guid>
<pubDate>Sun, 07 Aug 2016 00:00:00 GMT</pubDate>
<description>“Steve Jobs supposedly said, returning to Apple, that his plan was to stay alive and grab onto the next big thing — to listen for the footsteps. He tried video, ...</description>
<content:encoded><![CDATA[<p>“Steve Jobs supposedly said, returning to Apple, that his plan was to stay alive and grab onto the next big thing — to listen for the footsteps. He tried video, and a few other things, but he got there in the end. But he might not have.”</p>
<p>From: <a href="http://ben-evans.com/benedictevans/2016/5/2/inevitability-in-technology">Inevitability in technology</a></p>
]]></content:encoded>
</item>
<item>
<title>“Ether is a necessary element — a fuel — for operating the distributed application platform…</title>
<link>https://cdixon.org/2016/08/07/source-ethereum-org/</link>
<guid>https://cdixon.org/2016/08/07/source-ethereum-org/</guid>
<pubDate>Sun, 07 Aug 2016 00:00:00 GMT</pubDate>
<description>“Ether is a necessary element — a fuel — for operating the distributed application platform Ethereum. It is a form of payment made by the clients of the platform to ...</description>
<content:encoded><![CDATA[<p>“Ether is a necessary element — a fuel — for operating the distributed application platform Ethereum. It is a form of payment made by the clients of the platform to the machines executing the requested operations. To put it another way, ether is the incentive ensuring that developers write quality applications (wasteful code costs more), and that the network remains healthy (people are compensated for their contributed resources).</p>
<p>Ether is to be treated as “crypto-fuel”, a token whose purpose is to pay for computation, and is not intended to be used as or considered a currency, asset, share or anything else.”</p>
<p><em>Source: <a href="https://ethereum.org/ether">ethereum.org</a></em></p>
]]></content:encoded>
</item>
<item>
<title>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they…</title>
<link>https://cdixon.org/2016/07/30/if-you-asked-people-in-1989-what-they-needed-to-make-their-life-better-it-was-unlikely-that-they/</link>
<guid>https://cdixon.org/2016/07/30/if-you-asked-people-in-1989-what-they-needed-to-make-their-life-better-it-was-unlikely-that-they/</guid>
<pubDate>Sat, 30 Jul 2016 00:00:00 GMT</pubDate>
<description>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are ...</description>
<content:encoded><![CDATA[<p>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.”</p>
<p>— <a href="http://farmerandfarmer.org/mastery/builder.html">Farmer & Farmer
</a></p>
]]></content:encoded>
</item>
<item>
<title>“The typical path of how people respond to life-changing inventions</title>
<link>https://cdixon.org/2016/05/11/the-typical-path-of-how-people-respond-to-life-changing-inventions/</link>
<guid>https://cdixon.org/2016/05/11/the-typical-path-of-how-people-respond-to-life-changing-inventions/</guid>
<pubDate>Wed, 11 May 2016 00:00:00 GMT</pubDate>
<description>I’ve never heard of it. I’ve heard of it but don’t understand it. I understand it, but I don’t see how it’s useful. I see how it could be fun ...</description>
<content:encoded><![CDATA[<ol>
<li>
<p>I’ve never heard of it.</p>
</li>
<li>
<p>I’ve heard of it but don’t understand it.</p>
</li>
<li>
<p>I understand it, but I don’t see how it’s useful.</p>
</li>
<li>
<p>I see how it could be fun for rich people, but not me.</p>
</li>
<li>
<p>I use it, but it’s just a toy.</p>
</li>
<li>
<p>It’s becoming more useful for me.</p>
</li>
<li>
<p>I use it all the time.</p>
</li>
<li>
<p>I could not imagine life without it.</p>
</li>
<li>
<p>Seriously, people lived without it?</p>
</li>
<li>
<p>It’s too powerful and needs to be regulated</p>
</li>
</ol>
<p><em>Credits:</em></p>
<p><em>#1–#9 by <a href="http://time.com/author/morgan-housel-the-motley-fool/">Morgan Housel</a>, <a href="http://time.com/money/3940273/innovation-isnt-dead/">Time</a></em></p>
<p><em>#10 by <a href="https://twitter.com/peterpeirce/status/616664561068994560?lang=en">@peterpeirce</a></em></p>
]]></content:encoded>
</item>
<item>
<title>Comma.ai</title>
<link>https://cdixon.org/2016/04/02/comma-ai/</link>
<guid>https://cdixon.org/2016/04/02/comma-ai/</guid>
<pubDate>Sat, 02 Apr 2016 00:00:00 GMT</pubDate>
<description>I wrote a blog post last month highlighting some of the exciting trends in the computing industry. One trend I discussed is the rapid progress in a branch of artificial ...</description>
<content:encoded><![CDATA[<p>I wrote a blog post last month highlighting some of the exciting trends in the computing industry. One trend I discussed is the rapid progress in a branch of artificial intelligence called deep learning. Big tech companies are making significant investments in deep learning, but there are also opportunities for startups:</p>
<blockquote>
<p>Many of the papers, <a href="https://code.google.com/archive/p/word2vec/">data</a> <a href="http://image-net.org/download-images">sets</a>, and <a href="https://www.tensorflow.org/">software</a> <a href="http://deeplearning.net/software/theano/">tools</a> related to deep learning have been open sourced. This has had a democratizing effect, allowing individuals and small organizations to build powerful applications. WhatsApp was able to build a global messaging system that <a href="http://www.wired.com/2015/09/whatsapp-serves-900-million-users-50-engineers/">served 900M users with just 50 engineers</a>, compared to the thousands of engineers that were needed for prior generations of messaging systems. This “<a href="https://twitter.com/cdixon/status/473221599189954562">WhatsApp effect</a>” is now happening in AI. Software tools like <a href="http://deeplearning.net/software/theano/">Theano</a> and <a href="https://www.tensorflow.org/">TensorFlow</a>, combined with cloud data centers for training, and inexpensive GPUs for deployment, allow small teams of engineers to build state-of-the-art AI systems.</p>
</blockquote>
<p>You might have seen <a href="http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/">recent press</a> coverage of a software developer named George Hotz who built his own self-driving car.</p>
<p><img src="images/1U00Hr0kDEBcGUf87W4iPcQ.png" alt=""></p>
<p>I first met George a few months ago, and, like a lot of people who had seen the press coverage, I was skeptical. How could someone build such an advanced system all by himself? After spending time with George, my skepticism turned into enthusiasm. I tested his car, and, along with some of my colleagues and friends with AI expertise, dug into the details of the deep learning system he’d developed.</p>
<p><img src="images/1xJP7l8qL4IbNyJnwHYNwdA.gif" alt="Comma’s self-driving car"></p>
<p>I came away convinced that George’s system is a textbook example of the “WhatsApp effect” happening to AI.</p>
<p><img src="images/1d9qMneOOvDP2WHCxgakQkw.png" alt="George with test car #1"></p>
<p>George is certainly brilliant (he’s a <a href="https://en.wikipedia.org/wiki/George_Hotz">famous hacker</a> for a reason), and he’s no longer alone: he’s now working with a small team of machine learning experts. But he’s also riding a wave of exponentially improving hardware, software, and, most importantly, data. The more his system gets used, the more data it collects, and the smarter it becomes.</p>
<p>Today we are announcing that <a href="http://a16z.com/">a16z</a> is leading a $3.1M investment in George’s company, <a href="http://comma.ai/">Comma.ai</a>. This investment will help them continue to build their team (they’re <a href="http://comma.ai/hiring.html">hiring</a>), and bring their technology to market. Expect more announcements from Comma in the next few months. We are very excited to support George and his team on this ambitious project.</p>
]]></content:encoded>
</item>
<item>
<title>The Internet Economy</title>
<link>https://cdixon.org/2016/03/13/the-internet-economy/</link>
<guid>https://cdixon.org/2016/03/13/the-internet-economy/</guid>
<pubDate>Sun, 13 Mar 2016 00:00:00 GMT</pubDate>
<description>We are living in an era of bundling. The big five consumer tech companies — Google, Apple, Facebook, Amazon, and Microsoft — have moved far beyond their original product lines ...</description>
<content:encoded><![CDATA[<p>We are living in an era of bundling. The big five consumer tech companies — Google, Apple, Facebook, Amazon, and Microsoft — have moved far beyond their original product lines into all sorts of hardware, software, and services that overlap and compete with one another. But their revenues and profits still depend heavily on external technologies that are outside of their control. One way to visualize these external dependencies is to consider the path of a typical internet session, from the user to some revenue-generating action, and then (in some cases) back again to the user:</p>
<p><img src="images/1bUnzLePRb7E25uoUEMYQgA.png" alt=""></p>
<p>When evaluating an internet company’s strategic position (the defensibility of its profit <a href="http://www.investopedia.com/terms/e/economicmoat.asp">moat</a>), you need to consider: 1) how the company generates revenue and profits, and 2) the loop in its entirety, not just the layers in which the company has products.</p>
<p>For example, it might seem counterintuitive that Amazon is a <a href="/2010/05/22/while-google-fights-on-the-edges-amazon-is-attacking-their-core/">major threat</a> to Google’s core search business. But you can see this by following the money through the loop: a <a href="http://www.wordstream.com/articles/google-earnings">significant portion</a> of Google’s revenue comes from search queries for things that can be bought on Amazon, and the buying experience on Amazon (from initial purchasing intent to consumption/unboxing) is significantly better than the buying experience on most non-Amazon e-commerce sites you find via Google searches. After a while, shoppers learn to skip Google and go straight to Amazon.</p>
<p>Think of the internet economic loop as a model train track. Positions in front of you can redirect traffic around you. Positions after you can build new tracks that bypass you. New technologies come along (which often look <a href="/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy/">toy-like</a> and unthreatening at first) that create entirely new tracks that render the previous tracks obsolete.</p>
<p>There are interesting developments happening at each layer of the loop (and there are many smaller, offshoot loops not depicted in the chart above), but at any given time certain layers are industry flash points. The most prominent recent battle was between mobile devices and operating systems. That battle seems to be over, with Android software and iOS devices having won. Possible future flash points include:</p>
<p><strong>The automation of logistics.</strong> Today’s logistics network is a patchwork of ships, planes, trucks, warehouses, and people. Tomorrow’s network will include significantly more automation, from robotic warehouses to autonomous cars, trucks, drones, and <a href="http://fortune.com/2016/04/06/dispatch-carry-delivery-robot/">delivery bots</a>. This transition will happen in stages, depending on the economics of specific goods and customers, along with geographic and regulatory factors. Amazon of course has a huge advantage in logistics. Google has tried repeatedly to get into logistics with <a href="http://recode.net/2015/08/19/google-express-plans-to-shut-down-its-two-delivery-hubs/">little success</a>. On-demand ride-sharing and delivery startups could play an interesting role here. The logistics layer is critical for e-commerce, which in turn is critical for monetizing search. Amazon’s dominance in logistics gives it a very strong strategic moat as e-commerce continues to take market share from traditional retail.</p>
<p><strong>Web vs apps</strong>. The mobile web <a href="/2014/04/07/the-decline-of-the-mobile-web/">is</a> <a href="http://daringfireball.net/2014/04/rethinking_what_we_mean_by_mobile_web">arguably</a> in decline: users are spending more time on mobile devices, and more time in apps instead of web browsers. Apple has joined the app side of this battle (e.g. allowing ad blockers in Safari, encouraging app install <a href="https://developer.apple.com/library/ios/documentation/AppleApplications/Reference/SafariWebContent/PromotingAppswithAppBanners/PromotingAppswithAppBanners.html">smart banners</a> above websites). Facebook has also taken the app side (e.g. encouraging publishers to use <a href="https://instantarticles.fb.com/">Instant Articles</a> instead of web views). Google of course needs a vibrant web for its search engine to remain useful, so has joined the web side of the battle (e.g. <a href="http://techcrunch.com/2015/09/01/death-to-app-install-interstitials/">punishing websites</a> that have interstitial app ads, developing <a href="https://www.ampproject.org/">technologies</a> that reduce website loading times). The realistic danger isn’t that the web disappears, but that it gets marginalized, and that the bulk of monetizable internet activities happen in apps or other interfaces like voice or messaging bots. This shift could have a significant effect on web publishers who rely on older business models like non-native ads, and could make it harder for small startups to grow beyond niche use cases.</p>
<p><strong>Video: from TV to mobile devices.</strong> Internet companies are betting that video consumption will continue to shift from TV to mobile devices. The hope is that this will not only create compelling user experiences, but also unlock access to the tens of billions of ad dollars that are currently spent on TV.</p>
<blockquote>
<p>“I think video is a mega trend, almost as big as mobile.” — <a href="https://twitter.com/cdixon/status/706198805922902018">Mark Zuckerberg</a></p>
</blockquote>
<p>Last decade, the internet won the market for ads that harvest purchasing intent (ads that used to appear in newspapers and yellow pages), with most of the winnings going to Google. The question for the next decade is who will win the market for ads that generate purchasing intent (so far the winner is Facebook, followed by Google). Most likely this will depend on who controls the user flow to video advertising. Today, the biggest video platforms are Facebook and YouTube, but expect video to get embedded into almost every internet service, similar to how the internet transitioned from text-heavy to image-heavy services last decade.</p>
<p><strong>Voice: baking search into the OS.</strong> Voice bots like Siri, Google Now, and Alexa embed search-like capabilities directly into the operating system. Today, the quality of voice interfaces isn’t good enough to replace visual computing interfaces for most activities. However, artificial intelligence is <a href="https://medium.com/software-is-eating-the-world/what-s-next-in-computing-e54b870b80cc#.kyn1qnbvj">improving</a> rapidly. Voice bots should be able to handle much more nuanced and interactive conversations in the near future.</p>
<p>Amazon’s <a href="https://developer.amazon.com/public/solutions/alexa/alexa-voice-service">vision</a> here is the most ambitious: to embed voice services in every possible device, thereby reducing the importance of the device, OS, and application layers (it’s no coincidence that those are also the layers in which Amazon is the weakest). But all the big tech companies are investing heavily in voice and AI. As Google CEO Sundar Pichai recently <a href="https://googleblog.blogspot.com/2016/04/this-years-founders-letter.html">said</a>:</p>
<blockquote>
<p>The next big step will be for the very concept of the “device” to fade away. Over time, the computer itself — whatever its form factor — will be an intelligent assistant helping you through your day. We will move from mobile first to an AI first world.</p>
</blockquote>
<p>This would mean that AI interfaces — which in most cases will mean voice interfaces — could become the master routers of the internet economic loop, rendering many of the other layers interchangeable or irrelevant. Voice is mostly a novelty today, but in technology the <a href="/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy/">next big thing</a> often starts out looking that way.</p>
]]></content:encoded>
</item>
<item>
<title>What’s Next in Computing?</title>
<link>https://cdixon.org/2016/02/21/what-s-next-in-computing/</link>
<guid>https://cdixon.org/2016/02/21/what-s-next-in-computing/</guid>
<pubDate>Sun, 21 Feb 2016 00:00:00 GMT</pubDate>
<description>The computing industry progresses in two mostly independent cycles: financial and product cycles. There has been a lot of handwringing lately about where we are in the financial cycle. Financial ...</description>
<content:encoded><![CDATA[<p>The computing industry progresses in two mostly independent cycles: financial and product cycles. There has been a lot of handwringing lately about where we are in the financial cycle. Financial markets get a lot of attention. They tend to fluctuate unpredictably and sometimes wildly. The product cycle by comparison gets relatively little attention, even though it is what actually drives the computing industry forward. We can try to understand and predict the product cycle by studying the past and extrapolating into the future.</p>
<p><img src="images/1_Gzmn-yCmeOGEVPrrq9esMA.png" alt="New computing eras have occurred every 10–15 years"></p>
<p>Tech product cycles are mutually reinforcing interactions between platforms and applications. New platforms enable new applications, which in turn make the new platforms more valuable, creating a positive feedback loop. Smaller, offshoot tech cycles happen all the time, but every once in a while — historically, about every 10 to 15 years — major new cycles begin that completely reshape the computing landscape.</p>
<p><img src="images/1_oOZjdUvjYRlrFtYUKLIMGg.png" alt="Financial and product cycles evolve mostly independently"></p>
<p>The PC enabled entrepreneurs to create word processors, spreadsheets, and many other desktop applications. The internet enabled search engines, e-commerce, e-mail and messaging, social networking, SaaS business applications, and many other services. Smartphones enabled mobile messaging, mobile social networking, and on-demand services like ride sharing. Today, we are in the middle of the mobile era. It is likely that many more mobile innovations are still to come.</p>
<p>Each product era can be divided into two phases: 1) <em>the gestation phase</em>, when the new platform is first introduced but is expensive, incomplete, and/or difficult to use, and 2) <em>the growth phase</em>, when a new product comes along that solves those problems, kicking off a period of exponential growth.</p>
<p>The Apple II was released in 1977 (and the Altair in 1975), but it was the release of the IBM PC in 1981 that kicked off the PC growth phase.</p>
<p><img src="images/1_vfatwon6YWQGRvYad2ggqw.png" alt="PC sales per year (thousands), source: http://jeremyreimer.com/m-item.lsp?i=137"></p>
<p>The internet’s gestation phase took place in the <a href="https://en.wikipedia.org/wiki/National_Science_Foundation_Network">80s and early 90s</a> when it was mostly a text-based tool used by academia and government. The release of the Mosaic web browser in 1993 started the growth phase, which has continued ever since.</p>
<p><img src="images/1_6jgrfjHpBKlObla1x0NYtg.png" alt="Worldwide internet users, source: http://churchm.ag/numbers-internet-use/"></p>
<p>There were feature phones in the 90s and early smartphones like the Sidekick and Blackberry in the early 2000s, but the smartphone growth phase really started in 2007–8 with the release of the iPhone and then Android. Smartphone adoption has since exploded: about 2B people have smartphones today. By 2020, <a href="http://ben-evans.com/benedictevans/2014/10/28/presentation-mobile-is-eating-the-world">80% of the global population</a> will have one.</p>
<p><img src="images/1_8o0-IQSyDQ0KRxSVV2njdA.png" alt="Worldwide smartphone sales per year (millions)"></p>
<p>If the 10–15 year pattern repeats itself, the next computing era should enter its growth phase in the next few years. In that scenario, we should already be in the gestation phase. There are a number of important trends in both hardware and software that give us a glimpse into what the next era of computing might be. Here I talk about those trends and then make some suggestions about what the future might look like.</p>
<h2>Hardware: small, cheap, and ubiquitous</h2>
<p>In the mainframe era, only large organizations could afford a computer. Minicomputers were affordable for smaller organizations, PCs for homes and offices, and smartphones for individuals.</p>
<p><img src="images/1_gZQE6-shm1dqgJAbmNn6ww.png" alt="Computers are getting steadily smaller, source: http://www.nature.com/news/the-chips-are-down-for-moore-s-law-1.19338"></p>
<p>We are now entering an era in which processors and sensors are getting so small and cheap that there will be many more computers than there are people.</p>
<p>There are two reasons for this. One is the steady progress of the semiconductor industry over the past 50 years (<a href="https://en.wikipedia.org/wiki/Moore%27s_law">Moore’s law</a>). The second is what Chris Anderson <a href="http://foreignpolicy.com/2013/04/29/epiphanies-from-chris-anderson/">calls</a> “the peace dividend of the smartphone war”: the runaway success of smartphones led to massive investments in processors and sensors. If you disassemble a modern drone, VR headset, or IoT device, you’ll find mostly smartphone components.</p>
<p>In the modern semiconductor era, the focus has shifted from standalone CPUs to <a href="https://medium.com/@magicsilicon/how-the-soc-is-displacing-the-cpu-49bc7503edab#.h6wfmbk8n">bundles</a> of specialized chips known as systems-on-a-chip.</p>
<p><img src="images/1_SwUUpb2cjLIPFa3-8U9LzQ.png" alt="Computer prices have been steadily dropping, souce: https://medium.com/@magicsilicon/computing-transitions-22c07b9c457a#.j4cm9m6qu%5C"></p>
<p>Typical systems-on-a-chip bundle energy-efficient ARM CPUs plus specialized chips for graphics processing, communications, power management, video processing, and more.</p>
<p><img src="images/1_Wz-CMXmQFd64yFKWFfHefQ.jpeg" alt="Raspberry Pi Zero: 1 GHz Linux computer for $5"></p>
<p>This new architecture has dropped the price of basic computing systems from about $100 to about $10. The <a href="https://www.raspberrypi.org/blog/raspberry-pi-zero/">Raspberry Pi Zero</a> is a 1 GHz Linux computer that you can buy for $5. For a similar price you can buy a <a href="http://makezine.com/2015/04/01/esp8266-5-microcontroller-wi-fi-now-arduino-compatible/">wifi-enabled microcontroller</a> that runs a version of Python. Soon these chips will cost less than a dollar. It will be cost-effective to embed a computer in almost anything.</p>
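<p>To make the “computer in almost anything” point concrete, here is roughly what it takes to get one of those $5 wifi-enabled microcontrollers online using MicroPython (the credentials are placeholders; the <code>network</code> module is part of MicroPython, not desktop Python):</p>
<pre><code>import network

# Bring up the station (client) wifi interface and join a network.
sta = network.WLAN(network.STA_IF)
sta.active(True)
sta.connect("my-ssid", "my-password")  # placeholder credentials

while not sta.isconnected():
    pass  # in real code you'd time out instead of spinning forever

print(sta.ifconfig())  # (ip, netmask, gateway, dns)
</code></pre>
<p>From there, a few more lines of Python can read a sensor and post the data to a server, which is why embedding connected computing in everyday objects has become so cheap.</p>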
<p>Meanwhile, there are still impressive performance improvements happening in high-end processors. Of particular importance are GPUs (graphics processors), the best of which are made by Nvidia. GPUs are useful not only for traditional graphics processing, but also for machine learning algorithms and virtual/augmented reality devices. Nvidia’s <a href="http://www.extremetech.com/gaming/201417-nvidias-2016-roadmap-shows-huge-performance-gains-from-upcoming-pascal-architecture">roadmap</a> promises significant performance improvements in the coming years.</p>
<p><img src="images/1_jSQ-qKGSVgW4rSwA0dk9ZQ.png" alt="Google’s quantum computer, source: https://www.technologyreview.com/s/544421/googles-quantum-dream-machine/"></p>
<p>A wildcard technology is quantum computing, which today exists mostly in laboratories but if made commercially viable could lead to orders-of-magnitude performance improvements for certain classes of algorithms in fields like biology and artificial intelligence.</p>
<h2>Software: the golden age of AI</h2>
<p>There are many exciting things happening in software today. Distributed systems are one good example. As the number of devices has grown exponentially, it has become increasingly important to 1) parallelize tasks across multiple machines, and 2) communicate and coordinate among devices. Interesting distributed systems technologies include <a href="http://hadoop.apache.org/">Hadoop</a> and <a href="https://amplab.cs.berkeley.edu/projects/spark-lightning-fast-cluster-computing/">Spark</a> for parallelizing big data problems, and Bitcoin/blockchain for securing data and assets.</p>
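<p>As a small illustration of the parallelization pattern these systems enable, here is a hedged PySpark sketch of the classic distributed word count (the input path is a placeholder):</p>
<pre><code>from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount").getOrCreate()

# Spark splits the file into partitions and runs each stage in
# parallel across however many machines the cluster has.
lines = spark.sparkContext.textFile("corpus.txt")  # placeholder path
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))

print(counts.take(10))  # a sample of (word, count) pairs
spark.stop()
</code></pre>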
<p>But perhaps the most exciting software breakthroughs are happening in artificial intelligence (AI). AI has a long history of hype and disappointment. Alan Turing himself <a href="http://loebner.net/Prizef/TuringArticle.html">predicted</a> that machines would be able to successfully imitate humans by the year 2000. However, there are good reasons to think that AI might now finally be entering a golden age.</p>
<blockquote>
<p>“Machine learning is a core, transformative way by which we’re rethinking everything we’re doing.” — Google CEO, Sundar Pichai</p>
</blockquote>
<p>A lot of the excitement in AI has focused on deep learning, a machine learning technique that was <a href="http://www.nytimes.com/2012/06/26/technology/in-a-big-network-of-computers-evidence-of-machine-learning.html?pagewanted=all">popularized</a> by a now famous 2012 Google project that used a giant cluster of computers to learn to identify cats in YouTube videos. Deep learning is a descendant of neural networks, a technology that <a href="https://en.wikipedia.org/wiki/Artificial_neural_network#History">dates back</a> to the 1940s. It was brought back to life by a <a href="http://www.wired.com/2014/10/future-of-artificial-intelligence/">combination</a> of factors, including new algorithms, cheap parallel computation, and the widespread availability of large data sets.</p>
<p><img src="images/1_P4BXse9pJYAUbasCEkQanA.png" alt="ImageNet challenge error rates, souce: http://www.slideshare.net/nervanasys/sd-meetup-12215 (red line = human performance)"></p>
<p>It’s tempting to dismiss deep learning as another Silicon Valley buzzword. The excitement, however, is supported by impressive theoretical and real-world results. For example, the error rates for the winners of the <a href="http://image-net.org/challenges/LSVRC/2015/">ImageNet challenge</a> — a popular machine vision contest — were in the 20–30% range prior to the use of deep learning. Using deep learning, the error rate of the winning algorithms has steadily dropped, and in 2015 fell below the human error rate.</p>
<p>Many of the papers, <a href="https://code.google.com/archive/p/word2vec/">data</a> <a href="http://image-net.org/download-images">sets</a>, and <a href="https://www.tensorflow.org/">software</a> <a href="http://deeplearning.net/software/theano/">tools</a> related to deep learning have been open sourced. This has had a democratizing effect, allowing individuals and small organizations to build powerful applications. WhatsApp was able to build a global messaging system that <a href="http://www.wired.com/2015/09/whatsapp-serves-900-million-users-50-engineers/">served 900M users with just 50 engineers</a>, compared to the thousands of engineers that were needed for prior generations of messaging systems. This “<a href="https://twitter.com/cdixon/status/473221599189954562">WhatsApp effect</a>” is now happening in AI. Software tools like <a href="http://deeplearning.net/software/theano/">Theano</a> and <a href="https://www.tensorflow.org/">TensorFlow</a>, combined with cloud data centers for training, and inexpensive GPUs for deployment, allow small teams of engineers to build state-of-the-art AI systems.</p>
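<p>To show how low the barrier has become, here is a hedged, minimal TensorFlow/Keras example that trains a small image classifier on the bundled MNIST digits dataset; state-of-the-art systems are far larger, but the workflow is the same:</p>
<pre><code>import tensorflow as tf

# Load the MNIST handwritten-digit dataset bundled with Keras.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

# A tiny fully connected network: flatten, one hidden layer, softmax output.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=1)
print(model.evaluate(x_test, y_test))  # [loss, accuracy] on held-out digits
</code></pre>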
<p>For example, here a <a href="http://tinyclouds.org/colorize/">solo programmer</a> working on a side project used TensorFlow to colorize black-and-white photos:</p>
<p><img src="images/1_L6cT-HQMC-mc34kJ450pdA.png" alt="Left: black and white. Middle: automatically colorized. Right: true color. source: http://tinyclouds.org/colorize/"></p>
<p>And here a small startup created a real-time object classifier:</p>
<p><img src="images/1_cAtej8oZh2u80cii--YgTw.gif" alt="Teradeep real-time object classifier, source: https://www.youtube.com/watch?v=_wXHR-lad-Q "></p>
<p>Which of course is reminiscent of a famous scene from a sci-fi movie:</p>
<p><img src="images/1_wiG-xc456HpdBkRTQi84Eg.gif" alt="The Terminator (1984), source: https://www.youtube.com/watch?v=YvRb9jZ9wFk"></p>
<p>One of the first applications of deep learning released by a big tech company is the search function in Google Photos, which is <a href="http://gizmodo.com/google-photos-hands-on-so-good-im-creeped-out-1707566376">shockingly</a> smart.</p>
<p><img src="images/1_N1K_Wv2M-QDMF7FeOmJfcw.gif" alt="User searches photos (w/o metadata) for “big ben”"></p>
<p>We’ll soon see significant upgrades to the intelligence of all sorts of products, including: voice assistants, search engines, <a href="http://www.wired.com/2015/08/how-facebook-m-works/">chat bots</a>, 3D <a href="https://www.google.com/atap/project-tango/">scanners</a>, language translators, automobiles, drones, medical imaging systems, and much more.</p>
<blockquote>
<p>The business plans of the next 10,000 startups are easy to forecast: Take X and add AI. This is a big deal, and now it’s here. — <a href="http://www.wired.com/2014/10/future-of-artificial-intelligence/">Kevin Kelly</a></p>
</blockquote>
<p>Startups building AI products will need to stay laser focused on specific applications to compete against the big tech companies who have made AI a top priority. AI systems get better as more data is collected, which means it’s possible to create a virtuous flywheel of <a href="http://mattturck.com/2016/01/04/the-power-of-data-network-effects/">data network effects</a> (more users → more data → better products → more users). The mapping startup Waze <a href="https://digit.hbs.org/submission/waze-generating-better-maps-through-its-network-of-users/">used</a> data network effects to produce better maps than its vastly better capitalized competitors. Successful AI startups will follow a <a href="/2015/02/01/the-ai-startup-idea-maze/">similar</a> strategy.</p>
<h2>Software + hardware: the new computers</h2>
<p>There are a variety of new computing platforms currently in the gestation phase that will soon get much better — and possibly enter the growth phase — as they incorporate recent advances in hardware and software. Although they are designed and packaged very differently, they share a common theme: they give us new and augmented abilities by embedding a smart virtualization layer on top of the world. Here is a brief overview of some of the new platforms:</p>
<p><strong>Cars</strong>. Big tech companies like Google, Apple, Uber, and Tesla are investing significant resources in autonomous cars. Semi-autonomous cars like the Tesla Model S are already publicly available and will improve quickly. Full autonomy will take longer but is probably not more than 5 years away. There already exist fully autonomous cars that are almost as good as human drivers. However, for cultural and regulatory reasons, fully autonomous cars will likely need to be significantly better than human drivers before they are widely permitted.</p>
<p><img src="images/1_nJjPHXo_qBtzvoH8OLx9hQ.gif" alt="Autonomous car mapping its environment"></p>
<p>Expect to see a lot more investment in autonomous cars. In addition to the big tech companies, the big auto makers <a href="http://www.cnet.com/roadshow/news/gm-new-team-electric-autonomous-cars/">are</a> <a href="http://spectrum.ieee.org/automaton/robotics/industrial-robots/toyota-to-invest-1-billion-in-ai-and-robotics-rd">starting</a> <a href="https://media.ford.com/content/fordmedia/fna/us/en/news/2016/01/05/ford-tripling-autonomous-vehicle-development-fleet--accelerating.html">to</a> take autonomy very seriously. You’ll even see some interesting products made by startups. Deep learning software tools have gotten so good that a <a href="http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/">solo programmer</a> was able to make a semi-autonomous car:</p>
<p><img src="images/1_z442b_u8RfSqBEyI-1AkxQ.gif" alt="Homebrew self-driving car, source: https://www.youtube.com/watch?v=KTrgRYa2wbI"></p>
<p><strong>Drones</strong>. Today’s consumer drones contain modern hardware (mostly smartphone components plus mechanical parts), but relatively simple software. In the near future, we’ll see drones that incorporate advanced computer vision and other AI to make them safer, easier to pilot, and more useful. Recreational videography will continue to be popular, but there will also be important <a href="http://www.airware.com">commercial</a> use cases. There are tens of millions of <a href="http://www.psmag.com/politics-and-law/cell-tower-climbers-die-78374">dangerous</a> jobs that involve climbing buildings, towers, and other structures that can be performed much more safely and effectively using drones.</p>
<p><img src="images/1_N7SlK3WKwkfZ6v50JFLkCg.gif" alt="Fully autonomous drone flight. source: https://www.youtube.com/watch?v=rYhPDn48-Sg"></p>
<p><strong>Internet of Things</strong>. The obvious use cases for IoT devices are energy savings, security, and convenience. <a href="https://nest.com/thermostat/meet-nest-thermostat/">Nest</a> and <a href="https://nest.com/camera/meet-nest-cam/">Dropcam</a> are popular examples of the first two categories. One of the most interesting products in the convenience category is Amazon’s <a href="http://www.amazon.com/Amazon-SK705DI-Echo/dp/B00X4WHP5E">Echo</a>.</p>
<p><img src="images/1_bsxhmUfI-7biIF-dW8a80w.png" alt="Three main uses cases for IoT"></p>
<p>Most people think Echo is a gimmick until they try it, and then they are <a href="http://qz.com/611026/amazon-echo-is-a-sleeper-hit-and-the-rest-of-america-is-about-find-out-about-it-for-the-first-time/">surprised</a> at how useful it is. It’s a great <a href="https://500ish.com/alexa-5f7924bffcf3#.iou9jsaj4">demo</a> of how effective always-on voice can be as a user interface. It will be a while before we have bots with generalized intelligence that can carry on full conversations. But, as Echo shows, voice can succeed today in constrained contexts. Language understanding should improve quickly as recent breakthroughs in deep learning make their way into production devices.</p>
<p>IoT will also be adopted in business contexts. For example, devices with sensors and network connections are extremely <a href="https://www.samsara.com/">useful</a> for monitoring industrial equipment.</p>
<p><strong>Wearables.</strong> Today’s wearable computers are constrained along multiple dimensions, including battery, communications, and processing. The ones that have succeeded have focused on narrow applications like fitness monitoring. As hardware components continue to improve, wearables will support rich applications the way smartphones do, unlocking a wide range of new applications. As with IoT, voice will probably be the main user interface.</p>
<p><img src="images/1__4r-bIpz7jWMYiLnxKFCJQ.gif" alt="Wearable, super intelligent AI earpiece in the movie “Her”"></p>
<p><strong>Virtual Reality.</strong> 2016 is an exciting year for VR: the launch of the <a href="https://www.oculus.com/en-us/rift/">Oculus Rift</a> and HTC/Valve <a href="https://www.htcvive.com/us/">Vive</a> (and, possibly, the Sony PlayStation VR) means that comfortable and immersive VR systems will finally be publicly available. VR systems need to be really good to avoid the “<a href="https://en.wikipedia.org/wiki/Uncanny_valley">uncanny valley</a>” trap. Proper VR requires special screens (high resolution, high refresh rate, low persistence), powerful graphics cards, and the ability to track the precise position of the user (previously released VR systems could only track the rotation of the user’s head). This year, the public will for the first time get to experience what is known as “<a href="http://a16z.com/2015/01/22/virtual-reality/">presence</a>” — when your senses are sufficiently tricked that you feel fully transported into the virtual world.</p>
<p><img src="images/1_bcHvjQwlLxyORwjHFH87Qg.gif" alt="Oculus Rift Toybox demo"></p>
<p>VR headsets will continue to improve and get more affordable. Major areas of research will include: 1) new tools for creating rendered and/or <a href="https://www.lytro.com/">filmed</a> VR content, 2) machine vision for <a href="http://venturebeat.com/2016/02/08/oculus-vr-guru-john-carmack-leads-crucial-position-tracking-development-for-mobile-vr/">tracking</a> and scanning directly from phones and headsets, and 3) distributed back-end <a href="/2015/03/24/improbable-enabling-the-development-of-large-scale-simulated-worlds/">systems</a> for hosting large <a href="https://twitter.com/cdixon/status/662836035508940800">virtual environments</a>.</p>
<p><img src="images/1_Fv9_4fCAOHoEA3dxjMf2jw.gif" alt="3D world creation in room-scale VR"></p>
<p><strong>Augmented Reality</strong>. AR will likely arrive after VR because AR requires most of what VR requires plus additional new technologies. For example, AR requires advanced, low-latency machine vision in order to convincingly combine real and virtual objects in the same interactive scene.</p>
<p><img src="images/1_HpWBUZD_kKAoTa2yuxqnTQ.jpeg" alt="Real and virtual combined (from The Kingsmen)"></p>
<p>That said, AR is probably coming sooner than you think. This demo video was shot directly through <a href="http://www.magicleap.com/#/home">Magic Leap’s</a> AR device:</p>
<p><img src="images/1_7jbz4N1GZTFm0wDzDEmQ1Q.gif" alt="Magic Leap demo: real environment, virtual character"></p>
<h2>What’s next?</h2>
<p>It is possible that the pattern of 10–15 year computing cycles has ended and mobile is the final era. It is also possible the next era won’t arrive for a while, or that only a subset of the new computing categories discussed above will end up being important.</p>
<p>I tend to think we are on the cusp of not one but multiple new eras. The “peace dividend of the smartphone war” created a Cambrian explosion of new devices, and developments in software, especially AI, will make those devices smart and useful. Many of the futuristic technologies discussed above exist today, and will be broadly accessible in the near future.</p>
<p>Observers have noted that many of these new devices are in their “<a href="http://www.nytimes.com/2016/01/07/technology/on-display-at-ces-tech-ideas-in-their-awkward-adolescence.html?_r=0">awkward adolescence</a>.” That is because they are in their gestation phase. Like PCs in the 70s, the internet in the 80s, and smartphones in the early 2000s, we are seeing pieces of a future that isn’t quite here. But the future is coming: markets go up and down, and excitement ebbs and flows, but computing technology marches steadily forward.</p>
]]></content:encoded>
</item>
</channel>
</rss>
{
"access-control-allow-origin": "*",
"cache-control": "public, max-age=0, must-revalidate",
"cf-alt-svc": "{\"http3\":true}",
"cf-as-number": "132892",
"cf-cache-status": "DYNAMIC",
"cf-ray": "9dc5cf9b8bdec424-CMH",
"cf-zone-data": "{\"owner\":\"2994384\",\"zone\":\"1151592827\"}",
"content-length": "169532",
"content-type": "application/xml",
"date": "Sat, 14 Mar 2026 19:51:04 GMT",
"etag": "\"6e4193a917c1ce8b47d7c3a64ad7b9f0\"",
"nel": "{\"report_to\":\"cf-nel\",\"success_fraction\":0.0,\"max_age\":604800}",
"referrer-policy": "strict-origin-when-cross-origin",
"report-to": "{\"group\":\"cf-nel\",\"max_age\":604800,\"endpoints\":[{\"url\":\"https://a.nel.cloudflare.com/report/v4?s=1%2FQdOXRJaXBzAOSEJpxRNJRFT6W05jeRa8b7cjxtw63EZjtyuLhbvd6dSde4SSv25FnPYGswNKLvsjJ%2BJQHrDn%2FJBNBM9GX363V%2B3kxwbHZ7GXLpgy0%3D\"}]}",
"server": "cloudflare",
"x-content-type-options": "nosniff"
}
{
"meta": {
"type": "rss",
"version": "2.0"
},
"language": null,
"title": "cdixon",
"description": "programming, philosophy, history, internet, startups",
"copyright": null,
"url": "https://cdixon.org",
"self": "https://cdixon.org/rss.xml",
"published": null,
"updated": null,
"generator": null,
"image": null,
"authors": [],
"categories": [],
"items": [
{
"id": "https://cdixon.org/2023/06/22/read-write-own/",
"title": "I wrote a book: Read Write Own",
"description": "I wrote a book: Read Write Own I believe blockchains and the software movement around them – typically called crypto or web3 – provide the only plausible path to sustaining ...",
"url": "https://cdixon.org/2023/06/22/read-write-own/",
"published": "2023-06-22T00:00:00.000Z",
"updated": "2023-06-22T00:00:00.000Z",
"content": "<p>I wrote a book: <em>Read Write Own</em></p>\n<p>I believe blockchains and the software movement around them – typically called crypto or web3 – provide the only plausible path to sustaining the original vision of the internet as an open platform that incentivizes creativity and entrepreneurship. I’ve been investing behind this thesis for years, and advocating for it through writing and speaking and by talking to business leaders, journalists, and policymakers both here and around the world.</p>\n<p>Through all that, it became clear that we need a comprehensive book that clearly explains new technologies like blockchains and the services built on top of them; how they fit into the history of the internet; and why they should matter to founders, developers, creators, and anyone interested in the history and evolution of business, technology, and innovation.</p>\n<p>So I wrote that book: <em>Read Write Own: Building the Next Era of the Internet.</em></p>\n<p>My thesis is that seemingly small initial decisions around software and network design can have profound downstream consequences on the control and economics of digital services. The book walks through the history of the internet, showing how it has gone through three major design eras: the first focused on democratizing information (read), the second on democratizing publishing (write), and the third on democratizing ownership (own).</p>\n<p>We are on the cusp of the third era – own – so I explain the key concepts underlying it, including blockchains and digital services built on top of blockchains. The book therefore answers a common question I hear: “<em>What problems do blockchains solve?</em>” Blockchains solve the same problems that other digital services solve, but with better outcomes. They can connect people in social networks, while empowering users over corporate interests. They can underpin marketplaces and payment systems that facilitate commerce, but with persistently lower take rates. They can enable new forms of monetizable media, interoperable and immersive digital worlds, and artificial intelligence services that compensate – rather than cannibalize – creators and communities.</p>\n<p>The book takes controversial questions head on, including policy and regulatory topics, and the harmful “casino” culture that has developed around crypto that hurts public perception and undermines its potential. And I go deeper into intersecting topics like artificial intelligence, social networks, finance, media businesses, collaborative creation, video games, and virtual worlds.</p>\n<p>Inspired by modern tech classics like <em>Zero to One</em> and <em>The Hard Thing About Hard Things</em>, I wrote the book to be succinct, thorough, and accessible. I also distill cutting-edge thinking from technologists and founders to make it useful to practitioners. My goal was to make it accessible without watering it down. The book is meant for a range of audiences, including entrepreneurs, technologists, company leaders, policymakers, journalists, business thinkers, artists, community builders, and people who are simply curious about new technologies, culture, and the future of the internet.</p>\n<p>I love reading books but believe that tech and business topics usually work better in shorter formats, which is why in the past I’ve stuck to blogging and tweeting. But accomplishing all of the above warranted a longer treatment, bringing new and different ideas together in one place. So I spent much of the last year doing this. 
Many of the ideas I’ve thought about for a long time but never took the time to write.</p>\n<p><em>Read Write Own: Building the Next Era of the Internet</em> will be published by Random House on March 12, 2024. You can pre-order it <a href=\"https://readwriteown.com\">here</a>.</p>\n<p>Sign up for more book updates <a href=\"https://cdixon.substack.com\">here</a>.</p>\n<hr>\n<p><a href=\"https://readwriteown.com/terminologyhistory/\">More about the term and title “Read Write Own” here.</a></p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2021/02/27/NFTs-and-a-thousand-true-fans/",
"title": "NFTs and A Thousand True Fans",
"description": "In his classic 2008 essay “1000 True Fans,” Kevin Kelly predicted that the internet would transform the economics of creative activities: To be a successful creator you don’t need millions. ...",
"url": "https://cdixon.org/2021/02/27/NFTs-and-a-thousand-true-fans/",
"published": "2021-02-27T00:00:00.000Z",
"updated": "2021-02-27T00:00:00.000Z",
"content": "<p align=\"center\"><img src=\"images/nfts.png\"/></p> \n<p>In his classic 2008 essay “<a href=\"https://kk.org/thetechnium/1000-true-fans/\">1000 True Fans</a>,” Kevin Kelly predicted that the internet would transform the economics of creative activities:</p>\n<blockquote>\n<p>To be a successful creator you don’t need millions. You don’t need millions of dollars or millions of customers, millions of clients or millions of fans. To make a living as a craftsperson, photographer, musician, designer, author, animator, app maker, entrepreneur, or inventor you need only thousands of true fans.</p>\n</blockquote>\n<blockquote>\n<p>A true fan is defined as a fan that will buy anything you produce. These diehard fans will drive 200 miles to see you sing; they will buy the hardback and paperback and audible versions of your book; they will purchase your next figurine sight unseen; they will pay for the “best-of” DVD version of your free YouTube channel; they will come to your chef’s table once a month.</p>\n</blockquote>\n<p>Kelly’s vision was that the internet was the ultimate matchmaker, enabling 21st century patronage. Creators, no matter how seemingly niche, could now discover their true fans, who would in turn demonstrate their enthusiasm through direct financial support.</p>\n<p>But the internet took a detour. Centralized social platforms became the dominant way for creators and fans to connect. The platforms used this power to become the new intermediaries — inserting ads and algorithmic recommendations between creators and users while keeping most of the revenue for themselves.</p>\n<p>The good news is that the internet is trending back to Kelly’s vision. For example, many top writers on Substack earn far more than they did at salaried jobs. The economics of low take rates plus enthusiastic fandom does wonders. On Substack, 1,000 newsletter subscribers paying $10/month nets over $100K/year to the writer.</p>\n<p>Crypto, and specifically <a href=\"https://variant.mirror.xyz/T8kdtZRIgy_srXB5B06L8vBqFHYlEBcv6ae2zR6Y_eo\">NFTs</a> (non-fungible tokens), can accelerate the trend of creators monetizing directly with their fans. Social platforms will continue to be useful for building audiences (although these too should probably be replaced with superior <a href=\"https://cdixon.org/2018/02/18/why-decentralization-matters\">decentralized</a> alternatives), but creators can increasingly rely on other methods including NFTs and crypto-enabled economies to make money.</p>\n<p>NFTs are blockchain-based records that uniquely represent pieces of media. The media can be anything digital, including art, videos, music, gifs, games, text, memes, and code. NFTs contain highly trustworthy documentation of their history and origin, and can have code attached to do almost anything programmers dream up (one popular feature is code that ensures that the original creator receives royalties from secondary sales). NFTs are secured by the same technology that enabled Bitcoin to be owned by hundreds of millions of people around the world and represent hundreds of billions of dollars of value.</p>\n<p>NFTs have received a lot of attention lately because of high sales volumes. 
In the past 30 days there has been over <a href=\"http://cryptoslam.io\">$300M</a> in NFT sales:</p>\n<p align=\"center\"><img src=\"images/pic1.png\"/></p> \n<p>Crypto has a history of boom and bust cycles, and it’s very possible NFTs will have their own ups and downs.</p>\n<p>That said, there are three important reasons why NFTs offer fundamentally better economics for creators. The first, already alluded to above, is the removal of rent-seeking intermediaries. The logic of blockchains is that once you purchase an NFT, it is yours to fully control, just like when you buy books or sneakers in the real world. There are and will continue to be NFT platforms and marketplaces, but they will be constrained in what they can charge because blockchain-based ownership shifts the power back to creators and users — you can shop around and force the marketplace to earn its fees. (Note that lowering the intermediary fees can have a multiplier effect on creator disposable income. For example, if you make $100K in revenue and have $80K in costs, cutting out a 50% take rate increases your revenue to $200K, multiplying your disposable income 6x, from $20K to $120K.)</p>\n<p>The second way NFTs change creator economics is by enabling granular price tiering. In ad-based models, revenue is generated more or less uniformly regardless of the fan’s enthusiasm level. As with Substack, NFTs allow the creator to “cream skim” the most passionate users by offering them special items that cost more. But NFTs go further than non-crypto products in that they are easily sliced and diced into a descending series of pricing tiers. NBA Top Shot cards range from over $100K to a few dollars. Fan of Bitcoin? You can buy as much or as little as you want, down to 8 decimal places, depending on your level of enthusiasm. Crypto’s fine granularity lets creators capture a much larger area under the demand curve.</p>\n<p align=\"center\"><img src=\"images/pic2.png\"/></p> \n<p>The third and most important way NFTs change creator economics is by making users owners, thereby reducing customer acquisition costs to near zero. Open any tech S-1 filing and you’ll see massive user/customer acquisition costs, usually going to online ads or sales staff. Crypto, by contrast, has grown to over a trillion dollars in aggregate market capitalization with almost no marketing spend. Bitcoin and Ethereum don’t have organizations behind them, let alone marketing budgets, yet they are used, owned, and loved by tens of millions of people.</p>\n<p>The highest-revenue NFT project to date, <a href=\"https://www.nbatopshot.com/\">NBA Top Shot</a>, has generated $200M in gross sales in just the past month while spending very little on marketing. It’s been able to grow so efficiently because users feel like owners — they have skin in the game. It’s true peer-to-peer marketing, fueled by community, <a href=\"https://twitter.com/ROSGO21/status/1364724500642689027?s=20\">excitement</a>, and ownership.</p>\n<p align=\"center\"><img src=\"images/pic3.jpg\"/></p> \n<p>NFTs are still early and will evolve. Their utility will increase as digital experiences are built around them, including marketplaces, social networks, showcases, games, and virtual worlds. It’s also likely that other consumer-facing crypto products will emerge that pair with NFTs. Modern video games like Fortnite contain sophisticated economies that mix fungible tokens like V-Bucks with NFTs/virtual goods like skins. 
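In code, an economy like that is not complicated. Here is a minimal, purely illustrative sketch in Python (hypothetical names throughout; a real version would be written in a contract language and run on a blockchain) pairing a fungible balance with unique, royalty-bearing items:</p>\n<pre><code># Toy sketch of a community micro-economy (hypothetical, illustrative only):
# a fungible balance ('coins') plus unique, royalty-bearing items.

class MicroEconomy:
    def __init__(self, royalty_pct=10):
        self.coins = {}   # user -> fungible balance
        self.items = {}   # item_id -> {'owner': ..., 'creator': ...}
        self.royalty_pct = royalty_pct

    def grant(self, user, amount):
        self.coins[user] = self.coins.get(user, 0) + amount

    def mint_item(self, creator, item_id):
        assert item_id not in self.items, 'item id already taken'
        self.items[item_id] = {'owner': creator, 'creator': creator}

    def buy_item(self, buyer, item_id, price):
        item = self.items[item_id]
        assert self.coins.get(buyer, 0) >= price, 'insufficient funds'
        royalty = price * self.royalty_pct // 100  # creator's cut on every resale
        self.grant(item['creator'], royalty)
        self.grant(item['owner'], price - royalty)
        self.grant(buyer, -price)
        item['owner'] = buyer</code></pre>\n<p>The buy_item step is where the secondary-sale royalty described earlier would be enforced. 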
Someday every internet community might have its own micro-economy, including NFTs and fungible tokens that users can use, own, and collect.</p>\n<p>The thousand true fans thesis builds on the original ideals of the internet: users and creators globally connected, unconstrained by intermediaries, sharing ideas and economic upside. Incumbent social media platforms sidetracked this vision by locking creators into a bundle of distribution and monetization. There are, correspondingly, two ways to challenge them: take the users, or take the money. Crypto and NFTs give us a new way to take the money. Let’s make it happen.</p>\n<p><em>(Image: CryptoPunks — Larva Labs)</em></p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2020/10/19/doing-old-things-better-vs-doing-brand-new-things/",
"title": "Doing old things better vs doing brand new things",
"description": "New technologies enable activities that fall into one of two categories: 1) doing things you could already do but can now do better because they are faster, cheaper, easier, higher ...",
"url": "https://cdixon.org/2020/10/19/doing-old-things-better-vs-doing-brand-new-things/",
"published": "2020-10-19T00:00:00.000Z",
"updated": "2020-10-19T00:00:00.000Z",
"content": "<p>New technologies enable activities that fall into one of two categories: 1) doing things you could already do but can now do better because they are faster, cheaper, easier, higher quality, etc. 2) doing brand new things that you simply couldn’t do before. Early in the development of new technologies, the first category tends to get more attention, but it’s the second that ends up having more impact on the world.</p>\n<p>Doing old things better tends to get more attention early on because it’s easier to imagine what to build. Early films were shot like plays — they were effectively plays with a better distribution model — until filmmakers realized that movies had their own visual grammar. The early electrical grid delivered light better than gas and candles. It took decades before we got an electricity “app store” — a rich ecosystem of appliances that connected to the grid. The early web was mostly digital adaptations of pre-internet things like letter writing and mail-order commerce. It wasn’t until the 2000s that entrepreneurs started exploring “internet native” ideas like social networking, crowdfunding, cryptocurrency, crowdsourced knowledge bases, and so on.</p>\n<p>The most common mistake people make when evaluating new technologies is to focus too much on the “doing old things better” category. For example, when evaluating the potential of blockchains, people sometimes focus on things like cheaper and faster global payments, which are important and necessary but only the beginning. What’s even more exciting are the new things you simply couldn’t create before, like internet services that are <a href=\"https://cdixon.org/2019/01/04/how-blockchain-can-wrest-the-internet-from-corporations\">owned and operated by their users</a> instead of by companies. Another example is business productivity apps architected as web services. Early products like Salesforce were easier to access and cheaper to maintain than their on-premise counterparts. Modern productivity apps like Google Docs, Figma, and Slack focus on things you simply couldn’t do before, like real-time collaboration and deep integrations with other apps.</p>\n<p>Entrepreneurs who create products in the “brand new things” category usually spend many years deeply immersed in the underlying technology before they have their key insights. The products they create often <a href=\"https://cdixon.org/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy\">start out looking toy-like</a>, <a href=\"https://cdixon.org/2019/01/08/strong-and-weak-technologies\">strange, unserious, expensive</a>, and sometimes even dangerous. Over time, the products steadily improve and the world gradually embraces them.</p>\n<p>It can take decades for this process to play out. It’s clear that we are early in the development of emerging technologies like cryptocurrencies, machine learning, and virtual reality. It is also possible we are still early in the development of more established technologies like mobile devices, cloud hosting, social networks, and perhaps even the internet itself. If so, new categories of native products built on top of these technologies will continue to be invented in the coming years.</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2020/01/26/computers-that-can-make-commitments/",
"title": "Computers that can make commitments",
"description": "Blockchains are computers that can make commitments. Traditional computers are ultimately controlled by people, either directly in the case of personal computers or indirectly through organizations. Blockchains invert this power ...",
"url": "https://cdixon.org/2020/01/26/computers-that-can-make-commitments/",
"published": "2020-01-26T00:00:00.000Z",
"updated": "2020-01-26T00:00:00.000Z",
"content": "<p>Blockchains are computers that can make commitments. Traditional computers are ultimately controlled by people, either directly in the case of personal computers or indirectly through organizations. Blockchains invert this power relationship, putting the code in charge. A game theoretic mechanism — a so-called consensus mechanism — makes blockchains resilient to modifications to their underlying physical components, effectively making them resilient to human intervention.</p>\n<p>As a result, a properly designed blockchain provides strong guarantees that the code it runs will continue to operate as designed. For the first time, a computer system can be truly autonomous: self-governed, by its own code, instead of by people. Autonomous computers can be relied on and trusted in ways that human-governed computers can’t.</p>\n<p>Computers that make commitments can be useful in finance. The most famous example of this is Bitcoin, which makes various commitments, including that there will never be more than 21 million bitcoins, a commitment that makes bitcoins scarce and therefore capable of being valuable. Without a blockchain, this commitment could have been made by a person or a business, but it is unlikely that other people would have really trusted that commitment, since people and businesses change their minds all the time. Prior to Bitcoin, besides precious metals which are naturally scarce, the only credible commitments to monetary scarcity came from governments.</p>\n<p>Ethereum was the first blockchain to support a general-purpose programming language, allowing for the creation of arbitrarily complex software that makes commitments. Two early applications built on Ethereum are <a href=\"https://compound.finance/\">Compound</a> and <a href=\"https://makerdao.com/en/\">Maker Dao</a>. Compound makes the commitment that it will act as a neutral, low-fee lending protocol. Maker Dao makes a commitment to maintain the price stability of a currency called Dai that can be used for stable payments and value store. As of today, users have locked up hundreds of millions of dollars in these applications, a testament to the credibility of their commitments.</p>\n<p>Applications like Compound and Maker can do things that pre-blockchain software simply couldn’t, such as hold funds that reside in the code itself, as opposed to traditional payment systems which only hold pointers to offline bank accounts. This removes the need to trust anything other than code, and makes the system end-to-end transparent and extensible. Blockchain applications do this autonomously — every human involved in creating these projects could disappear and the software would go on doing what it does, keeping its commitments, indefinitely.</p>\n<p>What else can you do with computers that make commitments? One fertile area being explored is re-architecting popular internet services like social networks and marketplaces so that they make strong, positive commitments to their communities. For example, users can get commitments baked into the code that their data will be kept private and that they won’t get de-platformed without due process. Third-party developers can safely invest in their businesses knowing that the rules are baked into the network and can’t change, protecting them from <a href=\"https://cdixon.org/2018/02/18/why-decentralization-matters\">platform risk</a>. 
Using the financial features of blockchains, users and developers can receive tokens in order to participate in the upside of the network as it grows.</p>\n<p>Blockchains have arrived at an opportune time. Internet services have become central to our economic, political, and cultural lives, yet the trust between users and the people who run these services is breaking down. At the same time, industries like finance that have traditionally depended on trust have resisted modernization. The next few years will be exciting — we are only beginning to explore the <a href=\"https://cdixon.org/2013/08/04/the-idea-maze\">idea maze</a> unlocked by this new kind of computer.</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2020/01/17/inside-out-vs-outside-in/",
"title": "Inside-out vs. outside-in: the adoption of new technologies",
"description": "There are broadly two adoption paths for new computing technologies: inside-out and outside-in. Inside-out technologies are pioneered by established institutions and later proliferate outward to the mainstream. Apple (followed by ...",
"url": "https://cdixon.org/2020/01/17/inside-out-vs-outside-in/",
"published": "2020-01-17T00:00:00.000Z",
"updated": "2020-01-17T00:00:00.000Z",
"content": "<p>There are broadly two adoption paths for new computing technologies: inside-out and outside-in. Inside-out technologies are pioneered by established institutions and later proliferate outward to the mainstream. Apple (followed by Google and others) pioneered the modern touchscreen smartphone, university and corporate research labs pioneered machine learning, and big tech companies like Amazon pioneered cloud computing.</p>\n<p>Outside-in technologies, by contrast, start out on the fringes and only later move inward to established institutions. Open-source software started out as a niche anti-copyright movement. The web was invented at a physics lab and then built out by hobbyists and entrepreneurs. Social media began as a movement of idealistic blogging enthusiasts.</p>\n<p>Inside-out technologies tend to require significant capital and formally trained technical expertise. They also tend to be technologies that most people would recognize as valuable even before they exist. It wasn’t very hard to imagine that affordable, easy-to-use, internet-connected pocket supercomputers would be popular, or that machines that could learn to behave intelligently could do all sorts of useful tasks.</p>\n<p>Outside-in technologies tend to require less capital and less formally trained technical skills, creating a much more level playing field between insiders and outsiders. In many cases the value of outside-in technologies is not only unclear before they’re invented, but remains unclear for many years after they’re invented.</p>\n<p>Take the example of social media. Early on, blogs and services like Twitter were mostly used to discuss niche tech topics and share mundane personal events. This led many sophisticated observers to <a href=\"https://www.nytimes.com/2007/04/22/business/yourmoney/22stream.html\">dismiss</a> them as toys or passing fads. At its core, however, social media was about the creation of curated information networks. Today, this is easy to see – billions of people rely on services like Twitter and Facebook for their news – but back then you had to cut through the noise generated by the eccentricities of early adopters. Social media is a technology for creating global media networks that arrived disguised as a way to share what you had for lunch.</p>\n<p>Both inside-out and outside-in technologies are important, and in fact they’re often mutually reinforcing. Mobile, social, and cloud powered the growth of computing over the last decade: mobile (inside-out) brought computers to billions of people, social (outside-in) drove usage and monetization, and cloud (inside-out) allowed back-end services to scale. Most likely the next major wave in computing will also be driven by a mutually reinforcing combination of technologies, some developed at established institutions and some developed by enthusiastic and possibly misunderstood outsiders.</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2019/01/08/strong-and-weak-technologies/",
"title": "Strong and weak technologies",
"description": "During a media tour in 2007 in which Steve Jobs showed the device to reporters, there was one instance in which a journalist criticized the iPhone’s touch-screen keyboard. “It doesn’t ...",
"url": "https://cdixon.org/2019/01/08/strong-and-weak-technologies/",
"published": "2019-01-08T00:00:00.000Z",
"updated": "2019-01-08T00:00:00.000Z",
"content": "<blockquote>\n<p><em>During a <a href=\"https://www.businessinsider.com/steve-jobs-reaction-first-iphone-2015-9\">media tour</a> in 2007 in which Steve Jobs showed the device to reporters, there was one instance in which a journalist criticized the iPhone’s touch-screen keyboard.</em></p>\n<p><em>“It doesn’t work,” the reporter said.</em></p>\n<p><em>Jobs stopped for a moment and tilted his head. The reporter said he or she kept making typos and the keys were too small for his or her thumbs.</em></p>\n<p><em>Jobs smiled and then replied: “Your thumbs will learn.”</em></p>\n</blockquote>\n<p>When the iPhone was introduced in 2007, it <a href=\"https://www.wsj.com/articles/behind-the-rise-and-fall-of-blackberry-1432311912\">mystified</a> its competitors, because it wasn’t built for the world as it existed. Wireless networks were too slow. Smartphone users only knew how to use physical keyboards. There were no software developers making apps for touchscreen phones. It frequently dropped phone calls.</p>\n<p>But the iPhone was such a remarkable device — fans called it “The Jesus Phone” — that the world adapted to it. Carriers built more wireless capacity. Developers invented new apps and interfaces. Users learned how to rapidly type on touchscreens. Apple kept releasing better versions, fixing problems and adding new capabilities.</p>\n<p>Smartphones are a good example of a broader historical pattern: technologies usually arrive in pairs, a strong form and a weak form. Here are some examples:</p>\n<table class=\"comparison-table\">\n<thead>\n<tr><th>Strong</th><th>Weak</th></tr>\n</thead>\n<tbody>\n<tr><td>Public internet</td><td>Private intranets</td></tr>\n<tr><td>Consumer web</td><td>Interactive TV</td></tr>\n<tr><td>Crowdsourced encyclopedia (Wikipedia)</td><td>Expert-curated encyclopedia (e.g. Nupedia, Encarta)</td></tr>\n<tr><td>Crowdsourced video (YouTube)</td><td>Video tech for media companies (e.g. RealPlayer)</td></tr>\n<tr><td>Internet video chat (Skype)</td><td>Voice-over-IP (e.g. Vonage)</td></tr>\n<tr><td>Streaming music (Spotify)</td><td>MP3 downloads (e.g. iTunes)</td></tr>\n<tr><td>Touchscreen smartphones with full operating system and app store (iPhone)</td><td>Limited-app smartphones with physical keyboards (e.g. Blackberry)</td></tr>\n<tr><td>Fully electric cars (Tesla)</td><td>Hybrid cars</td></tr>\n<tr><td>Permissionless blockchains powered by cryptocurrencies</td><td>Permissioned/private blockchains</td></tr>\n<tr><td>Public cloud</td><td>Private / hybrid cloud</td></tr>\n<tr><td>App-based media companies (e.g. Netflix)</td><td>Video on demand delivered by cable companies</td></tr>\n<tr><td>Virtual realty</td><td>Augmented reality</td></tr>\n<tr><td>E-sports</td><td>Traditional sports delivered over the internet</td></tr>\n</tbody>\n</table>\n<p>Strong technologies capture the imaginations of technology enthusiasts. That is why many important technologies start out as weekend hobbies. <a href=\"https://cdixon.org/2013/03/03/what-the-smartest-people-do-on-the-weekend-is-what-everyone-else-will-do-during-the-week-in-ten-years/\">Enthusiasts vote with their time</a>, and, unlike most of the business world, have long-term horizons. They build from first principles, making full use of the available resources to design technologies as they ought to exist. 
Sometimes these enthusiasts run large companies, in which case they are often, like Steve Jobs, founders who have the gravitas and vision to make big, long-term bets.</p>\n<p>The mainstream technology world notices the excitement and wants to join in, but isn’t willing to go all the way and embrace the strong technology. To them, the strong technology appears to be some combination of strange, <a href=\"https://cdixon.org/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy/\">toy-like</a>, unserious, expensive, and sometimes even dangerous. So they embrace the weak form, a compromised version that seems more familiar, productive, serious, and safe.</p>\n<p>Strong technologies often develop according to the Perez/Gartner hype cycle:</p>\n<p><img src=\"images/researchmethodology-illustration-hype-cycle.jpg\" alt=\"\"></p>\n<p>During the trough of disillusionment, entrepreneurs and others who invested in strong technologies sometimes lose faith and switch their focus to weak technologies, because the weak technologies appear nearer to mainstream adoption. This is usually a mistake.</p>\n<p>That said, weak forms of technology can be successful. For example, it is very likely that augmented reality will be important, watching traditional sports on the internet will be popular, and so on.</p>\n<p>But it’s strong technologies that end up defining new eras. What George Bernard Shaw said about people also applies to technologies:</p>\n<blockquote>\n<p>The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.</p>\n</blockquote>\n<p>Weak technologies adapt to the world as it currently exists. Strong technologies adapt the world to themselves. Progress depends on strong technologies. Your thumbs will learn.</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2019/01/04/how-blockchain-can-wrest-the-internet-from-corporations/",
"title": "Who will control the software that powers the Internet?",
"description": "Originally published by Wired. As the internet has evolved over its 35-year lifespan, control over its most important services has gradually shifted from open source protocols maintained by non-profit communities ...",
"url": "https://cdixon.org/2019/01/04/how-blockchain-can-wrest-the-internet-from-corporations/",
"published": "2019-01-04T00:00:00.000Z",
"updated": "2019-01-04T00:00:00.000Z",
"content": "<p><em>Originally published by <a href=\"https://www.wired.com/story/how-blockchain-can-wrest-the-internet-from-corporations/\">Wired</a>.</em></p>\n<p>As the internet has evolved over its 35-year lifespan, control over its most important services has gradually shifted from open source protocols maintained by non-profit communities to proprietary services operated by large tech companies. As a result, billions of people got access to amazing, free technologies. But that shift also created serious problems.</p>\n<p>Millions of users have had their private data misused or stolen. Creators and businesses that rely on internet platforms are subject to sudden rule changes that take away their audiences and profits. But there is a growing movement—emerging from the blockchain and cryptocurrency world—to build new internet services that combine the power of modern, centralized services with the community-led ethos of the original internet. We should embrace it.</p>\n<p>From the 1980s through the early 2000s, the dominant internet services were built on open protocols that the internet community controlled. For example, the Domain Name System, the internet’s “phone book,” is controlled by a distributed network of people and organizations, using rules that are created and administered in the open. This means that anyone who adheres to community standards can own a domain name and establish an internet presence. It also means that the power of companies operating web and email hosting is kept in check—if they misbehave, customers can port their domain names to competing providers.</p>\n<p>From the mid 2000s to the present, trust in open protocols was replaced by trust in corporate management teams. As companies like Google, Twitter, and Facebook built software and services that surpassed the capabilities of open protocols, users migrated to these more sophisticated platforms. But their code was proprietary, and their governing principles could change on a whim.</p>\n<p>How do social networks decide which users to <a href=\"https://www.wired.com/story/how-right-wing-social-media-site-gab-got-back-online/\">verify</a> or <a href=\"https://www.wired.com/story/tumblrs-porn-ban-reveals-controls-we-see-online/\">ban</a>? How do search engines decide how to rank websites? One minute social networks court media organizations and small businesses, the next minute they de-prioritize their content or change the revenue split. The power of these platforms has created widespread societal tensions, as seen in debates over fake news, state-sponsored bots, privacy laws, and algorithmic biases.</p>\n<p>That’s why the pendulum is swinging back to an internet governed by open, community-controlled services. This has only recently become possible, thanks to technologies arising from the blockchain and cryptocurrencies.</p>\n<p>There has been a lot of talk in the past few years about blockchains, which are heavily hyped but poorly understood. Blockchains are networks of physical computers that work together in concert to form a single virtual computer. The benefit is that, unlike a traditional computer, a blockchain computer can offer strong trust guarantees, rooted in the mathematical and game-theoretic properties of the system. A user or developer can trust that a piece of code running on a blockchain computer will continue to behave as designed, even if individual participants in the network change their motivations or try to subvert the system. 
This means that the control of a blockchain computer can be placed in the hands of a community.</p>\n<p>Users who depend on proprietary platforms, on the other hand, have to worry about data getting stolen or misused, privacy policies changing, intrusive advertising, and more. Proprietary platforms may suddenly change the rules for developers and businesses, the way Facebook <a href=\"https://venturebeat.com/2016/06/30/facebook-kicked-zynga-to-the-curb-publishers-are-next/\">famously did to Zynga</a> and Google <a href=\"https://www.nytimes.com/2017/07/01/technology/yelp-google-european-union-antitrust.html\">did to Yelp</a>.</p>\n<p>The idea that corporate-owned services could be replaced by community-owned services may sound far-fetched, but there is a strong historical precedent in the transformation of software over the past twenty years. In the 1990s, computing was dominated by proprietary, closed-source software, most notably Windows. Today, billions of Android phones run on the open source operating system Linux. Much of the software running on an Apple device is open source, as is the software running in almost all modern cloud data centers, including Amazon’s. The recent acquisitions of <a href=\"https://www.wired.com/story/microsofts-github-deal-is-its-latest-shift-from-windows/\">GitHub by Microsoft</a> and <a href=\"https://www.wired.com/story/ibm-buying-open-source-specialist-red-hat-34-billion/\">Red Hat by IBM</a> underscore how dominant open source has become.</p>\n<p>As open source has grown in importance, technology companies have shifted their business models from selling software to delivering cloud-based services. Google, Facebook, Amazon, and Netflix are all services companies. Even Microsoft is now primarily a services company. This has allowed these companies to outpace the growth of open source software and maintain control of critical internet infrastructure.</p>\n<p>A core insight in the design of blockchains is that the open source model can be extended beyond software to cloud-based services by adding financial incentives to the mix. Cryptocurrencies—coins and tokens built into specific blockchains—provide a way to incentivize individuals and groups to participate in, maintain, and build services.</p>\n<p>The idea that an internet service could have an associated coin or token may be a novel concept, but the blockchain and cryptocurrencies can do for cloud-based services what open source did for software. It took twenty years for open source software to supplant proprietary software, and it could take just as long for open services to supplant proprietary services. But the benefits of such a shift will be immense. Instead of placing our trust in corporations, we can place our trust in community-owned and -operated software, transforming the internet’s governing principle from “don’t be evil” back to “can’t be evil.”</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2018/02/18/why-decentralization-matters/",
"title": "Why decentralization matters",
"description": "The first two eras of the internet During the first era of the internet — from the 1980s through the early 2000s — internet services were built on open protocols ...",
"url": "https://cdixon.org/2018/02/18/why-decentralization-matters/",
"published": "2018-02-18T00:00:00.000Z",
"updated": "2018-02-18T00:00:00.000Z",
"content": "<h2>The first two eras of the internet</h2>\n<p>During the first era of the internet — from the 1980s through the early 2000s — internet services were built on open protocols that were controlled by the internet community. This meant that people or organizations could grow their internet presence knowing the rules of the game wouldn’t change later on. Huge web properties were started during this era including Yahoo, Google, Amazon, Facebook, LinkedIn, and YouTube. In the process, the importance of centralized platforms like AOL greatly diminished.</p>\n<p>During the second era of the internet, from the mid 2000s to the present, for-profit tech companies — most notably Google, Apple, Facebook, and Amazon (GAFA) — built software and services that rapidly outpaced the capabilities of open protocols. The explosive growth of smartphones accelerated this trend as mobile apps became the majority of internet use. Eventually users migrated from open services to these more sophisticated, centralized services. Even when users still accessed open protocols like the web, they would typically do so mediated by GAFA software and services.</p>\n<p>The good news is that billions of people got access to amazing technologies, many of which were free to use. The bad news is that it became much harder for startups, creators, and other groups to grow their internet presence without worrying about centralized platforms changing the rules on them, taking away their audiences and profits. This in turn stifled innovation, making the internet less interesting and dynamic. Centralization has also created broader societal tensions, which we see in the debates over subjects like fake news, state sponsored bots, “no platforming” of users, EU privacy laws, and algorithmic biases. These debates will only intensify in the coming years.</p>\n<h2>“Web 3”: the third era of the internet</h2>\n<p>One response to this centralization is to impose government regulation on large internet companies. This response assumes that the internet is similar to past communication networks like the phone, radio, and TV networks. But the hardware-based networks of the past are fundamentally different than the internet, a software-based network. Once hardware-based networks are built, they are nearly impossible to rearchitect. Software-based networks can be rearchitected through entrepreneurial innovation and market forces.</p>\n<p>The internet is the ultimate software-based network, consisting of a relatively simple <a href=\"https://en.wikipedia.org/wiki/Internet_Protocol\">core layer</a> connecting billions of fully programmable computers at the edge. Software is simply the encoding of human thought, and as such has an almost unbounded design space. Computers connected to the internet are, by and large, free to run whatever software their owners choose. Whatever can be dreamt up, with the right set of incentives, can quickly propagate across the internet. Internet architecture is where technical creativity and incentive design intersect.</p>\n<p>The internet is still early in its evolution: the core internet services will likely be almost entirely rearchitected in the coming decades. This will be enabled by crypto-economic networks, a generalization of the ideas first introduced in <a href=\"https://bitcoin.org/bitcoin.pdf\">Bitcoin</a> and further developed in <a href=\"https://github.com/ethereum/wiki/wiki/White-Paper\">Ethereum</a>. 
Cryptonetworks combine the best features of the first two internet eras: community-governed, decentralized networks with capabilities that will eventually exceed those of the most advanced centralized services.</p>\n<h2>Why decentralization?</h2>\n<p>Decentralization is a commonly misunderstood concept. For example, it is sometimes said that the reason cryptonetwork advocates favor decentralization is to resist government censorship, or because of libertarian political views. These are not the main reasons decentralization is important.</p>\n<p>Let’s look at the problems with centralized platforms. Centralized platforms follow a predictable life cycle. When they start out, they do everything they can to recruit users and 3rd-party complements like developers, businesses, and media organizations. They do this to make their services more valuable, as platforms (by definition) are systems with multi-sided network effects. As platforms move up the adoption S-curve, their power over users and 3rd parties steadily grows.</p>\n<p><img src=\"images/07lrwGIDbAYk6q7zG.png\" alt=\"\"></p>\n<p>When they hit the top of the S-curve, their relationships with network participants change from positive-sum to zero-sum. The easiest way to continue growing lies in extracting data from users and competing with complements over audiences and profits. Historical examples of this are Microsoft vs. Netscape, Google vs. Yelp, Facebook vs. Zynga, and Twitter vs. its 3rd-party clients. Operating systems like iOS and Android have behaved better, although they still take a healthy 30% tax, reject apps for seemingly arbitrary reasons, and subsume the functionality of 3rd-party apps at will.</p>\n<p>For 3rd parties, this transition from cooperation to competition feels like a bait-and-switch. Over time, the best entrepreneurs, developers, and investors have become wary of building on top of centralized platforms. We now have decades of evidence that doing so will end in disappointment. In addition, users give up privacy and control of their data, and become vulnerable to security breaches. These problems with centralized platforms will likely become even more pronounced in the future.</p>\n<h2>Enter cryptonetworks</h2>\n<p>Cryptonetworks are networks built on top of the internet that 1) use consensus mechanisms such as blockchains to maintain and update state, and 2) use cryptocurrencies (coins/tokens) to incentivize consensus participants (miners/validators) and other network participants. Some cryptonetworks, such as Ethereum, are general programming platforms that can be used for almost any purpose. Other cryptonetworks are special purpose: Bitcoin, for example, is intended primarily for storing value, <a href=\"https://golem.network/\">Golem</a> for performing computations, and <a href=\"https://filecoin.io/\">Filecoin</a> for decentralized file storage.</p>\n<p>Early internet protocols were technical specifications created by working groups or non-profit organizations that relied on the alignment of interests in the internet community to gain adoption. This method worked well during the very early stages of the internet, but since the early 1990s very few new protocols have gained widespread adoption. <a href=\"2017/05/27/crypto-tokens-a-breakthrough-in-open-network-design\">Cryptonetworks fix</a> these problems by providing economic incentives to developers, maintainers, and other network participants in the form of tokens. They are also much more technically robust. 
For example, they are able to keep state and do arbitrary transformations on that state, something past protocols could never do.</p>\n<p>Cryptonetworks use multiple mechanisms to ensure that they stay neutral as they grow, preventing the bait-and-switch of centralized platforms. First, the contract between cryptonetworks and their participants is enforced in open source code. Second, they are kept in check through mechanisms for <a href=\"https://en.wikipedia.org/wiki/Exit,_Voice,_and_Loyalty\">“voice” and “exit.”</a> Participants are given voice through community governance, both “on chain” (via the protocol) and “off chain” (via the social structures around the protocol). Participants can exit either by leaving the network and selling their coins, or in the extreme case by forking the protocol.</p>\n<p>In short, cryptonetworks align network participants to work together toward a common goal — the growth of the network and the appreciation of the token. This alignment is one of the main reasons Bitcoin continues to defy skeptics and flourish, even while new cryptonetworks like Ethereum have grown alongside it.</p>\n<p>Today’s cryptonetworks suffer from limitations that keep them from seriously challenging centralized incumbents. The most severe limitations are around performance and scalability. The next few years will be about fixing these limitations and building networks that form the infrastructure layer of the crypto stack. After that, most of the energy will turn to building applications on top of that infrastructure.</p>\n<h2>How decentralization wins</h2>\n<p>It’s one thing to say decentralized networks should win, and another thing to say they will win. Let’s look at specific reasons to be optimistic about this.</p>\n<p>Software and web services are built by developers. There are millions of highly skilled developers in the world. Only a small fraction work at large technology companies, and only a small fraction of those work on new product development. Many of the most important software projects in history were created by startups or by communities of independent developers.</p>\n<blockquote>\n<p>“No matter who you are, most of the smartest people work for someone else.” — <a href=\"https://en.wikipedia.org/wiki/Joy%27s_law_(management)\">Bill Joy</a></p>\n</blockquote>\n<p>Decentralized networks can win the third era of the internet for the same reason they won the first era: by winning the hearts and minds of entrepreneurs and developers.</p>\n<p>An illustrative analogy is the rivalry in the 2000s between Wikipedia and its centralized competitors like Encarta. If you compared the two products in the early 2000s, Encarta was a far better product, with better topic coverage and higher accuracy. But Wikipedia improved at a much faster rate, because it had an active community of volunteer contributors who were attracted to its decentralized, community-governed ethos. By 2005, Wikipedia was the most <a href=\"https://medium.com/@cdixon/it-s-hard-to-believe-today-but-10-years-ago-wikipedia-was-widely-considered-a-doomed-experiment-a7a0dfd27b8b\">popular</a> reference site on the internet. Encarta was shut down in 2009.</p>\n<p>The lesson is that when you compare centralized and decentralized systems you need to consider them dynamically, as processes, instead of statically, as rigid products. Centralized systems often start out fully baked, but only get better at the rate at which employees at the sponsoring company improve them. 
Decentralized systems start out half-baked but, under the right conditions, grow exponentially as they attract new contributors.</p>\n<p>In the case of cryptonetworks, there are multiple, compounding feedback loops involving developers of the core protocol, developers of complementary cryptonetworks, developers of 3rd party applications, and service providers who operate the network. These feedback loops are further amplified by the incentives of the associated token, which — as we’ve seen with Bitcoin and Ethereum — can supercharge the rate at which crypto communities develop (and sometimes lead to negative outcomes, as with the excessive electricity consumed by Bitcoin mining).</p>\n<p>The question of whether decentralized or centralized systems will win the next era of the internet reduces to who will build the most compelling products, which in turn reduces to who will get more high quality developers and entrepreneurs on their side. GAFA has many advantages, including cash reserves, large user bases, and operational infrastructure. Cryptonetworks have a significantly more attractive value proposition to developers and entrepreneurs. If they can win their hearts and minds, they can mobilize far more resources than GAFA, and rapidly outpace their product development.</p>\n<blockquote>\n<p>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.” — <a href=\"http://farmerandfarmer.org/mastery/builder.html\">Farmer & Farmer</a></p>\n</blockquote>\n<p>Centralized platforms often come bundled at launch with compelling apps: Facebook had its core socializing features and the iPhone had a number of key apps. Decentralized platforms, by contrast, often launch half-baked and without clear use cases. As a result, they need to go through two phases of product-market fit: 1) product-market fit between the platform and the developers/entrepreneurs who will finish the platform and build out the ecosystem, and 2) product-market fit between the platform/ecosystem and end users. This two-stage process is what causes many people — including sophisticated technologists — to consistently underestimate the potential of decentralized platforms.</p>\n<h2>The next era of the internet</h2>\n<p>Decentralized networks aren’t a silver bullet that will fix all the problems on the internet. But they offer a much better approach than centralized systems.</p>\n<p>Compare the problem of Twitter spam to the problem of email spam. Since Twitter <a href=\"https://www.theverge.com/2012/8/23/3263481/twitter-api-third-party-developers\">closed</a> their network to 3rd-party developers, the only company working on Twitter spam has been Twitter itself. By contrast, there were hundreds of companies that tried to fight email spam, financed by billions of dollars in venture capital and corporate funding. Email spam isn’t solved, but it’s a lot better now, because 3rd parties knew that the <a href=\"https://en.wikipedia.org/wiki/Simple_Mail_Transfer_Protocol\">email protocol</a> was decentralized, so they could build businesses on top of it without worrying about the rules of the game changing later on.</p>\n<p>Or consider the problem of network governance. Today, unaccountable groups of employees at large platforms decide how information gets ranked and filtered, which users get promoted and which get banned, and other important governance decisions. 
In cryptonetworks, these decisions are made by the community, using open and transparent mechanisms. As we know from the offline world, democratic systems aren’t perfect, but they are a lot better than the alternatives.</p>\n<p>Centralized platforms have been dominant for so long that many people have forgotten there is a better way to build internet services. Cryptonetworks are a powerful way to develop community-owned networks and provide a level playing field for 3rd-party developers, creators, and businesses. We saw the value of decentralized systems in the first era of the internet. Hopefully we’ll get to see it again in the next.</p>\n<p><em>Originally published on <a href=\"https://medium.com/s/story/why-decentralization-matters-5e3f79f7638e\">Medium</a>.</em></p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2017/05/27/crypto-tokens-a-breakthrough-in-open-network-design/",
"title": "Tokens: A Breakthrough in Open Network Design",
"description": "It is a wonderful accident of history that the internet and web were created as open platforms that anyone — users, developers, organizations — could access equally. Among other things, ...",
"url": "https://cdixon.org/2017/05/27/crypto-tokens-a-breakthrough-in-open-network-design/",
"published": "2017-05-27T00:00:00.000Z",
"updated": "2017-05-27T00:00:00.000Z",
"content": "<p>It is a wonderful accident of history that the internet and web were created as open platforms that anyone — users, developers, organizations — could access equally. Among other things, this allowed independent developers to build products that quickly gained widespread adoption. Google started in a Menlo Park garage and Facebook started in a Harvard dorm room. They competed on a level playing field because they were built on decentralized networks governed by open protocols.</p>\n<p>Today, tech companies like Facebook, Google, Amazon, and Apple are <a href=\"https://medium.com/@cdixon/the-internet-economy-fc43f3eff58a\">stronger</a> than ever, whether measured by <a href=\"http://www.visualcapitalist.com/chart-largest-companies-market-cap-15-years/\">market cap</a>, share of top mobile apps, or pretty much any other common measure.</p>\n<p><img src=\"images/11LduvqPVCAVsy-rQ2qlhvg.png\" alt=\"Big 4 tech companies dominate smartphone apps (source); while their market caps continue to rise (source)\"></p>\n<p>These companies also control massive proprietary developer platforms. The dominant operating systems — iOS and Android — charge 30% payment fees and exert heavy influence over app distribution. The dominant social networks tightly restrict access, hindering the ability of third-party developers to scale. Startups and independent developers are increasingly competing from a disadvantaged position.</p>\n<p>A potential way to reverse this trend are <a href=\"http://continuations.com/post/148098927445/crypto-tokens-and-the-coming-age-of-protocol\">crypto tokens</a> — a new way to design open networks that arose from the cryptocurrency movement that began with the introduction of Bitcoin in 2008 and accelerated with the introduction of Ethereum in 2014. Tokens are a breakthrough in open network design that enable: 1) the creation of open, decentralized networks that combine the best architectural properties of open and proprietary networks, and 2) new ways to incentivize open network participants, including users, developers, investors, and service providers. By enabling the development of new open networks, tokens could help reverse the centralization of the internet, thereby keeping it accessible, vibrant and fair, and resulting in greater innovation.</p>\n<h2>Crypto tokens: unbundling Bitcoin</h2>\n<p>Bitcoin was introduced in 2008 with the publication of <a href=\"https://en.wikipedia.org/wiki/Satoshi_Nakamoto\">Satoshi Nakamoto’s</a> landmark <a href=\"https://bitcoin.org/bitcoin.pdf\">paper</a> that proposed a novel, decentralized payment system built on an underlying technology now known as a <a href=\"https://en.wikipedia.org/wiki/Blockchain\">blockchain</a>. Most fans of Bitcoin (including <a href=\"/2013/12/31/why-im-interested-in-bitcoin/\">me</a>) mistakenly thought Bitcoin was solely a breakthrough in financial technology. (It was easy to make this mistake: Nakamoto himself called it a “p2p payment system.”)</p>\n<p><img src=\"images/1MQ68XZTGHQG7E6ut5UimEw.jpeg\" alt=\"2009: Satoshi Nakamoto’s (post) announcing Bitcoin\"></p>\n<p>In retrospect, Bitcoin was really two innovations: 1) a <a href=\"https://en.wikipedia.org/wiki/Store_of_value\">store of value</a> for people who wanted an alternative to the existing financial system, and 2) a new way to develop open networks. 
Tokens unbundle the latter innovation from the former, providing a general method for designing and growing open networks.</p>\n<p>Networks — computing networks, developer platforms, marketplaces, social networks, etc. — have always been a powerful part of the promise of the internet. Tens of thousands of networks have been incubated by developers and entrepreneurs, yet only a very small percentage of those have survived, and most of those were owned and controlled by private companies. The current state of the art of network development is very crude. It often involves raising money (venture capital is a common source of funding) and then spending it on paid marketing and other channels to overcome the “bootstrap problem” — the problem that networks tend to only become useful when they reach a critical mass of users. In the rare cases where networks succeed, the financial returns tend to accrue to the relatively small number of people who own equity in the network. Tokens offer a better way.</p>\n<p>Ethereum, introduced in 2014 and launched in 2015, was the first major non-Bitcoin token network. The lead developer, <a href=\"https://a16z.com/2016/08/28/ethereum/\">Vitalik Buterin</a>, had previously tried to create smart contract languages on top of the Bitcoin blockchain. Eventually he realized that (by design, mostly) Bitcoin was too limited, so a new approach was needed.</p>\n<p><img src=\"images/1Crmcqo6mdF1okzHt4Bdp4g.png\" alt=\"2014: Vitalik Buterin’s (forum post) announcing Ethereum\"></p>\n<p>Ethereum is a network that allows developers to run “smart contracts” — snippets of <a href=\"https://en.wikipedia.org/wiki/Ethereum#Smart_contracts\">code</a> submitted by developers that are executed by a distributed network of computers. Ethereum has a corresponding token called Ether that can be purchased either to hold for financial purposes or to spend on computing power (known as “<a href=\"https://ethereum.stackexchange.com/questions/3/what-is-gas-and-transaction-fee-in-ethereum\">gas</a>”) on the network. Tokens are also given out to “miners,” which are the computers on the decentralized network that execute smart contract code (you can think of miners as playing the role of cloud hosting services like <a href=\"https://en.wikipedia.org/wiki/Amazon_Web_Services\">AWS</a>). Third-party developers can write their own <a href=\"https://dapps.ethercasts.com/\">applications</a> that live on the network, and can charge Ether to generate revenue.</p>\n<p>Ethereum is inspiring a new wave of token networks. (It also provided a simple way for new token networks to launch on top of the Ethereum network, using a standard known as <a href=\"https://github.com/ethereum/EIPs/issues/20\">ERC20</a>.) Developers are building token networks for a wide range of use cases, including distributed <a href=\"http://filecoin.io/\">computing</a> <a href=\"https://golem.network/\">platforms</a>, <a href=\"https://augur.net/\">prediction</a> and financial markets, incentivized <a href=\"https://steem.io/\">content creation networks</a>, and <a href=\"https://basicattentiontoken.org/\">attention and advertising networks</a>. 
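The ERC20 standard mentioned above shows how small the shared surface of a token is: essentially a balance map plus transfer and delegated-spending rules. A rough sketch of those semantics (Python for readability and entirely hypothetical; real tokens implement functions like totalSupply, balanceOf, transfer, approve, and transferFrom in a contract language such as Solidity):</p>\n<pre><code># Sketch of ERC20-style semantics (hypothetical Python, not the actual
# Solidity interface): balances plus delegated-spending allowances.

class Token:
    def __init__(self, supply, creator):
        self.balances = {creator: supply}
        self.allowances = {}   # (owner, spender) -> approved amount

    def transfer(self, sender, to, amount):
        assert self.balances.get(sender, 0) >= amount, 'insufficient balance'
        self.balances[sender] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

    def approve(self, owner, spender, amount):
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, spender, owner, to, amount):
        # Delegated spending: the hook that lets applications move user funds.
        assert self.allowances.get((owner, spender), 0) >= amount
        self.allowances[(owner, spender)] -= amount
        self.transfer(owner, to, amount)</code></pre>\n<p>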
Many more networks will be invented and launched in the coming months and years.</p>\n<p>Below I walk through the two main benefits of the token model, the first architectural and the second involving incentives.</p>\n<h2>Tokens enable the management and financing of open services</h2>\n<p>Proponents of open systems never had an effective way to manage and fund operating services, leading to a significant architectural disadvantage compared to their proprietary counterparts. This was particularly evident during the last internet mega-battle between open and closed networks: the social wars of the late 2000s. As Alexis Madrigal recently <a href=\"https://www.theatlantic.com/technology/archive/2017/05/a-very-brief-history-of-the-last-10-years-in-technology/526767/?utm_source=atltw\">wrote</a>, back in 2007 it looked like open networks would dominate going forward:</p>\n<blockquote>\n<p>In 2007, the web people were triumphant. Sure, the dot-com boom had busted, but empires were being built out of the remnant swivel chairs and fiber optic cables and unemployed developers. Web 2.0 was not just a temporal description, but an ethos. The web would be open. A myriad of services would be built, communicating through APIs, to provide the overall internet experience.</p>\n</blockquote>\n<p>But with the launch of the iPhone and the rise of smartphones, proprietary networks quickly won out:</p>\n<blockquote>\n<p>As that world-historical explosion began, a platform war came with it. The Open Web lost out quickly and decisively. By 2013, Americans spent about as much of their time on their phones <a href=\"http://www.marketingcharts.com/online/smart-device-users-spend-as-much-time-on-facebook-as-the-mobile-web-28422/\">looking at Facebook</a> as they did the whole rest of the open web.</p>\n</blockquote>\n<p>Why did open social protocols get so decisively defeated by proprietary social networks? The rise of smartphones was only part of the story. Some open protocols — like email and the web — survived the transition to the mobile era. Open protocols relating to social networks were high quality and abundant (e.g. <a href=\"https://en.wikipedia.org/wiki/RSS\">RSS</a>, <a href=\"http://xmlns.com/foaf/spec/\">FOAF</a>, <a href=\"https://en.wikipedia.org/wiki/XHTML_Friends_Network\">XFN</a>, <a href=\"http://openid.net/\">OpenID</a>). What the open side lacked was a mechanism for encapsulating software, databases, and protocols together into easy-to-use services.</p>\n<p>For example, in 2007, Wired magazine ran an <a href=\"https://www.wired.com/2007/08/open-social-net/\">article</a> in which they tried to create their own social network using open tools:</p>\n<blockquote>\n<p>For the last couple of weeks, Wired News tried to roll its own Facebook using free web tools and widgets. We came close, but we ultimately failed. We were able to recreate maybe 90 percent of Facebook’s functionality, but not the most important part — a way to link people and declare the nature of the relationship.</p>\n</blockquote>\n<p>Some developers <a href=\"http://bradfitz.com/social-graph-problem/\">proposed</a> solving this problem by creating a database of social graphs run by a non-profit organization:</p>\n<blockquote>\n<p><strong>Establish a non-profit and open source software</strong> (with copyrights held by the non-profit) which collects, merges, and redistributes the graphs from all other social network sites into one global aggregated graph. 
This is then made available to other sites (or users) via both public APIs (for small/casual users) and downloadable data dumps, with an update stream / APIs, to get iterative updates to the graph (for larger users).</p>\n</blockquote>\n<p>These open schemes required widespread coordination among standards bodies, server operators, app developers, and sponsoring organizations to mimic the functionality that proprietary services could provide all by themselves. As a result, proprietary services were able to create better user experiences and iterate much faster. This led to faster growth, which in turn led to greater investment and revenue, which then fed back into product development and further growth. Thus began a flywheel that drove the meteoric rise of proprietary social networks like Facebook and Twitter.</p>\n<p>Had the token model for network development existed back in 2007, the playing field would have been much more level. First, tokens provide a way not only to define a protocol, but to fund the operating expenses required to host it as a service. Bitcoin and Ethereum have tens of thousands of servers around the world (“miners”) that run their networks. They cover the hosting costs with built-in mechanisms that automatically distribute token rewards to computers on the network (“mining rewards”).</p>\n<p><img src=\"images/1-lu1cuJeeDIFPsDpPPo8lw.png\" alt=\"There are over 20,000 Ethereum nodes around the world (source)\"></p>\n<p>Second, tokens provide a model for creating shared computing resources (<a href=\"https://medium.com/@FEhrsam/the-dapp-developer-stack-the-blockchain-industry-barometer-8d55ec1c7d4\">including</a> databases, compute, and file storage) while keeping the control of those resources decentralized (and without requiring an organization to maintain them). This is the blockchain technology that has been talked about <a href=\"https://trends.google.com/trends/explore?q=blockchain\">so much</a>. Blockchains would have allowed shared social graphs to be stored on a decentralized network. It would have been easy for the Wired author to create an open social network using the tools available today.</p>\n<h2>Tokens align incentives among network participants</h2>\n<p>Some of the <a href=\"/2009/09/14/the-inevitable-showdown-between-twitter-and-twitter-apps/\">fiercest battles</a> in tech are between <a href=\"https://en.wikipedia.org/wiki/Complementary_good\">complements</a>. There were, for example, hundreds of startups that tried to build businesses on the APIs of social networks only to have the terms change later on, forcing them to pivot or shut down. Microsoft’s battles with complements like Netscape and Intuit are legendary. Battles within ecosystems are so common and drain so much energy that business books are full of frameworks for how one company can squeeze profits from adjacent businesses (e.g. Porter’s <a href=\"https://en.wikipedia.org/wiki/Porter%27s_five_forces_analysis\">five forces</a> model).</p>\n<p>Token networks remove this friction by aligning network participants to work together toward a common goal — the growth of the network and the appreciation of the token. 
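Bitcoin’s reward schedule is the canonical example of such an incentive mechanism: the block subsidy started at 50 bitcoins, halves every 210,000 blocks, and is also what enforces the 21 million cap. A sketch of the arithmetic (illustrative Python):</p>\n<pre><code># Bitcoin's block subsidy starts at 50 BTC and halves every 210,000 blocks.
# Summing the geometric series shows why total issuance approaches 21M.

def block_subsidy(height, initial=50.0, interval=210_000):
    return initial / 2 ** (height // interval)

eras = 64
total = sum(block_subsidy(e * 210_000) * 210_000 for e in range(eras))
print(round(total))  # ~21,000,000 (ignoring satoshi-level rounding)</code></pre>\n<p>Schedules like this tie the incentives of miners, users, and developers to the same token. 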
This alignment is one of the main reasons Bitcoin continues to defy <a href=\"https://99bitcoins.com/bitcoinobituaries/\">skeptics</a> and flourish, even while new token networks like Ethereum have grown alongside it.</p>\n<p>Moreover, well-designed token networks include an efficient mechanism to incentivize network participants to overcome the bootstrap problem that bedevils traditional network development. For example, <a href=\"https://steemit.com/\">Steemit</a> is a decentralized Reddit-like token network that makes payments to users who post and upvote articles. When Steemit launched last year, the community was <a href=\"https://coinreport.net/social-network-steemit-distributes-1-3-million-first-cryptocurrency-payout-users/\">pleasantly surprised</a> when they made their first significant payout to users.</p>\n<p><img src=\"images/1mi0v6PNlGnjL9QH-AWZxAA.png\" alt=\"Tokens help overcome the bootstrap problem by adding financial utility when application utility is low\"></p>\n<p>This in turn led to the appreciation of Steemit tokens, which increased future payouts, leading to a <a href=\"https://www.usv.com/blog/fat-protocols\">virtuous cycle</a> where more users led to more investment, and vice versa. Steemit is still a beta project and has since had mixed results, but was an interesting experiment in how to generalize the mutually reinforcing interaction between users and investors that Bitcoin and Ethereum first demonstrated.</p>\n<p>A lot of attention has been paid to token pre-sales (so-called “ICOs”), but they are just one of multiple ways in which the token model innovates on network incentives. A well-designed token network carefully manages the distribution of tokens across all five groups of network participants (users, core developers, third-party developers, investors, service providers) to maximize the growth of the network.</p>\n<p>One way to think about the token model is to imagine if the internet and web hadn’t been funded by governments and universities, but instead by a company that raised money by selling off domain names. People could buy domain names either to use them or as an investment (collectively, domain names are worth tens of billions of dollars today). Similarly, domain names could have been given out as rewards to service providers who agreed to run hosting services, and to third-party developers who supported the network. This would have provided an alternative way to finance and accelerate the development of the internet while also aligning the incentives of the various network participants.</p>\n<h2>The open network movement</h2>\n<p>The cryptocurrency movement is the spiritual heir to previous open computing movements, including the open source software movement led most visibly by Linux, and the open information movement led most visibly by Wikipedia.</p>\n<p><img src=\"images/1U0B5FlpNVXSXeIcqodktLQ.png\" alt=\"1991: Linus Torvalds’ forum (post) announcing Linux; 2001: the first Wikipedia (page)\"></p>\n<p>Both of these movements were once niche and <a href=\"https://medium.com/@cdixon/it-s-hard-to-believe-today-but-10-years-ago-wikipedia-was-widely-considered-a-doomed-experiment-a7a0dfd27b8b\">controversial</a>. Today Linux is the dominant worldwide operating system, and Wikipedia is the most popular informational website in the world.</p>\n<p>Crypto tokens are currently niche and controversial.
If present trends continue, they will soon be seen as a breakthrough in the design and development of open networks, combining the societal benefits of open protocols with the financial and architectural benefits of proprietary networks. They are also an extremely promising development for those hoping to keep the internet accessible to entrepreneurs, developers, and other independent creators.</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2017/02/20/aristotle-computer/",
"title": "How Aristotle Created the Computer",
"description": "The philosophers he influenced set the stage for the technological revolution that remade our world. Originally published by The Atlantic. The history of computers is often told as a history ...",
"url": "https://cdixon.org/2017/02/20/aristotle-computer/",
"published": "2017-02-20T00:00:00.000Z",
"updated": "2017-02-20T00:00:00.000Z",
"content": "<h2>The philosophers he influenced set the stage for the technological revolution that remade our world.</h2>\n<p><em>Originally published by <a href=\"https://www.theatlantic.com/technology/archive/2017/03/aristotle-computer/518697/\">The Atlantic</a>.</em></p>\n<p>The history of computers is often told as a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II. In fact, it is better understood as a history of ideas, mainly ideas that emerged from mathematical logic, an obscure and cult-like discipline that first developed in the 19th century. Mathematical logic was pioneered by philosopher-mathematicians, most notably George Boole and Gottlob Frege, who were themselves inspired by Leibniz’s dream of a universal “concept language,” and the ancient logical system of Aristotle.</p>\n<p>Mathematical logic was initially considered a hopelessly abstract subject with no conceivable applications. As one computer scientist <a href=\"http://bactra.org/notebooks/mathematical-logic.html\">commented</a>: “If, in 1901, a talented and sympathetic outsider had been called upon to survey the sciences and name the branch which would be least fruitful in [the] century ahead, his choice might well have settled upon mathematical logic.” And yet, it would provide the foundation for a field that would have more impact on the modern world than any other.</p>\n<p>The evolution of computer science from mathematical logic culminated in the 1930s, with two landmark papers: Claude Shannon’s “<a href=\"http://www.ccapitalia.net/descarga/docs/1938-shannon-analysis-relay-switching-circuits.pdf\">A Symbolic Analysis of Switching and Relay Circuits</a>,” and Alan Turing’s “<a href=\"http://www.dna.caltech.edu/courses/cs129/caltech_restricted/Turing_1936_IBID.pdf\">On Computable Numbers, With an Application to the <em>Entscheidungsproblem</em></a>.” In the history of computer science, Shannon and Turing are towering figures, but the importance of the philosophers and logicians who preceded them is frequently overlooked.</p>\n<p>A well-known history of computer science describes Shannon’s paper as “possibly the most important, and also the most noted, master’s thesis of the century.” Shannon wrote it as an electrical engineering student at MIT. His adviser, Vannevar Bush, built a prototype computer known as the <a href=\"http://www.mit.edu/~klund/analyzer/\">Differential Analyzer</a> that could rapidly calculate differential equations. The device was mostly mechanical, with subsystems controlled by electrical relays, which were organized in an ad hoc manner as there was not yet a systematic theory underlying circuit design. Shannon’s thesis topic came about when Bush recommended he try to discover such a theory.</p>\n<p>Shannon’s paper is in many ways a typical electrical-engineering paper, filled with equations and diagrams of electrical circuits. What is unusual is that the primary reference was a 90-year-old work of mathematical philosophy, George Boole’s <em>The Laws of Thought</em>.</p>\n<p>Today, Boole’s name is well known to computer scientists (many programming languages have a basic data type called a Boolean), but in 1938 he was rarely read outside of philosophy departments. Shannon himself encountered Boole’s work in an undergraduate philosophy class. 
“It just happened that no one else was familiar with both fields at the same time,” he <a href=\"http://georgeboole.com/boole/legacy/engineering/\">commented</a> later.</p>\n<p>Boole is often described as a mathematician, but he saw himself as a philosopher, following in the footsteps of Aristotle. The Laws of Thought begins with a description of his goals, to investigate the fundamental laws of the operation of the human mind:</p>\n<blockquote>\n<p>The design of the following treatise is to investigate the fundamental laws of those operations of the mind by which reasoning is performed; to give expression to them in the symbolical language of a Calculus, and upon this foundation to establish the science of Logic … and, finally, to collect … some probable intimations concerning the nature and constitution of the human mind.</p>\n</blockquote>\n<p>He then pays tribute to Aristotle, the inventor of logic, and the primary influence on <a href=\"http://www.gutenberg.org/files/15114/15114-pdf.pdf\">his own work</a>:</p>\n<blockquote>\n<p>In its ancient and scholastic form, indeed, the subject of Logic stands almost exclusively associated with the great name of Aristotle. As it was presented to ancient Greece in the partly technical, partly metaphysical disquisitions of The Organon, such, with scarcely any essential change, it has continued to the present day.</p>\n</blockquote>\n<p>Trying to improve on the logical work of Aristotle was an intellectually daring move. Aristotle’s logic, presented in his six-part book <em>The Organon</em>, occupied a central place in the scholarly canon for more than 2,000 years. It was widely believed that Aristotle had written almost all there was to say on the topic. The great philosopher Immanuel Kant <a href=\"https://books.google.com/books?id=WJVYp0C0taYC&pg=PA36&lpg=PA36&dq=unable+to+take+a+single+step+forward,+and+therefore+seems+to+all+appearance+to+be+finished+and+complete&source=bl&ots=W4Lrt9I80J&sig=KpZlOd-Yc9brgTksIJJZcxUD-Mg&hl=en&sa=X&ved=0ahUKEwjeg8i1iLvQAhVH6IMKHTMXDMgQ6AEIHTAA#v=onepage&q=unable%20to%20take%20a%20single%20step%20forward%2C%20and%20therefore%20seems%20to%20all%20appearance%20to%20be%20finished%20and%20complete&f=false\">commented</a> that, since Aristotle, logic had been “unable to take a single step forward, and therefore seems to all appearance to be finished and complete.”</p>\n<p>Aristotle’s central observation was that arguments were valid or not based on their logical structure, independent of the non-logical words involved. The most famous argument schema he discussed is known as the syllogism:</p>\n<ul>\n<li>All men are mortal.</li>\n<li>Socrates is a man.</li>\n<li>Therefore, Socrates is mortal.</li>\n</ul>\n<p>You can replace “Socrates” with any other object, and “mortal” with any other predicate, and the argument remains valid. The validity of the argument is determined solely by the logical structure. 
The logical words — “all,” “is,” “are,” and “therefore” — are doing all the work.</p>\n<p>Aristotle also defined a set of basic axioms from which he derived the rest of his logical system:</p>\n<ul>\n<li>An object is what it is (Law of Identity)</li>\n<li>No statement can be both true and false (Law of Non-contradiction)</li>\n<li>Every statement is either true or false (Law of the Excluded Middle)</li>\n</ul>\n<p>These axioms weren’t meant to describe how people actually think (that would be the realm of psychology), but how an idealized, perfectly rational person ought to think.</p>\n<p>Aristotle’s axiomatic method influenced an even more famous book, Euclid’s <em>Elements</em>, which is <a href=\"https://en.wikipedia.org/wiki/Euclid%27s_Elements\">estimated</a> to be second only to the Bible in the number of editions printed.</p>\n<p><img src=\"images/2c8ad9d68.png\" alt=\"A fragment of the Elements (Wikimedia Commons)\"></p>\n<p>Although ostensibly about geometry, the <em>Elements</em> became a standard textbook for teaching rigorous deductive reasoning. (Abraham Lincoln once said that he learned sound legal argumentation from studying Euclid.) In Euclid’s system, geometric ideas were represented as spatial diagrams. Geometry continued to be practiced this way until René Descartes, in the 1630s, showed that geometry could instead be represented as formulas. His <em>Discourse on Method</em> was the <a href=\"http://www.storyofmathematics.com/17th_descartes.html\">first</a> mathematics text in the West to popularize what is now standard algebraic notation — x, y, z for variables, a, b, c for known quantities, and so on.</p>\n<p>Descartes’s algebra allowed mathematicians to move beyond spatial intuitions to manipulate symbols using precisely defined formal rules. This shifted the dominant mode of mathematics from diagrams to formulas, leading to, among other things, the development of calculus, invented roughly 30 years after Descartes by, independently, Isaac Newton and Gottfried Leibniz.</p>\n<p>Boole’s goal was to do for Aristotelean logic what Descartes had done for Euclidean geometry: free it from the limits of human intuition by giving it a precise algebraic notation. To give a simple example, when Aristotle wrote:</p>\n<p>All men are mortal.</p>\n<p>Boole replaced the words “men” and “mortal” with variables, and the logical words “all” and “are” with arithmetical operators:</p>\n<p><em>x = x * y</em></p>\n<p>Which could be interpreted as “Everything in the set <em>x</em> is also in the set <em>y</em>.”</p>\n<p>The <em>Laws of Thought</em> created a new scholarly field—mathematical logic—which in the following years became one of the most active areas of research for mathematicians and philosophers. Bertrand Russell called the <em>Laws of Thought</em> “the work in which pure mathematics was discovered.”</p>
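<p>Boole’s reading becomes concrete if classes are treated as sets and his multiplication as intersection. Below is a minimal sketch in Python (the sets and names are illustrative, not Boole’s) showing “all x are y” as the testable identity x = x * y:</p>\n<pre><code># Boole's classes rendered as sets; intersection (&) plays the
# role of his multiplication. The names are illustrative.
men = {'Socrates', 'Plato'}
mortals = {'Socrates', 'Plato', 'Fido'}

# 'All men are mortal' becomes the identity x = x * y:
assert men == men & mortals

# The syllogism then follows mechanically for any member of x:
assert 'Socrates' in men and 'Socrates' in mortals
</code></pre>\n<p>Shannon’s insight was that Boole’s system could be mapped directly onto electrical circuits. At the time, electrical circuits had no systematic theory governing their design.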
Shannon realized that the right theory would be “exactly analogous to the calculus of propositions used in the symbolic study of logic.”</p>\n<p>He showed the correspondence between electrical circuits and Boolean operations in a simple chart:</p>\n<p><img src=\"images/99df968e4.png\" alt=\"Shannon’s mapping from electrical circuits to symbolic logic (University of Virginia)\"></p>\n<p>This correspondence allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians. In the second half of his paper, Shannon showed how Boolean logic could be used to create a circuit for adding two binary digits.</p>\n<p>By stringing these adder circuits together, arbitrarily complex arithmetical operations could be constructed. These circuits would become the basic building blocks of what are now known as <a href=\"https://en.wikipedia.org/wiki/Arithmetic_logic_unit\">arithmetical logic units</a>, a key component in modern computers.</p>\n<p><img src=\"images/2b88e5a1a.png\" alt=\"Shannon’s adder circuit (University of Virginia)\"></p>
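<p>The same construction is easy to state in code. Below is a minimal sketch in Python (logical operators stand in for Shannon’s relays; the function names are illustrative) of a binary adder built purely from Boolean operations:</p>\n<pre><code># XOR gives the sum bit, AND the carry: a half adder from pure logic.
def half_adder(a, b):
    return a ^ b, a & b

# Two half adders plus an OR make a full adder with carry-in.
def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

# Stringing full adders together adds whole binary numbers.
def add_binary(x, y, width=8):
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

assert add_binary(23, 19) == 42
</code></pre>\n<p>Another way to characterize Shannon’s achievement is that he was first to distinguish between the logical and the physical layer of computers. (This distinction has become so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time—a reminder of the adage that “the philosophy of one century is the common sense of the next.”)</p>\n<p>Since Shannon’s paper, a vast amount of progress has been made on the physical layer of computers, including the invention of the transistor in 1947 by William Shockley and his colleagues at Bell Labs. Transistors are dramatically improved versions of Shannon’s electrical relays — the best known way to physically encode Boolean operations. Over the next 70 years, the semiconductor industry packed more and more transistors into smaller spaces. A 2016 iPhone <a href=\"http://www.macrumors.com/2016/09/12/cpu-improvements-iphone-7-apple-watch/\">has</a> about 3.3 billion transistors, each one a “relay switch” like those pictured in Shannon’s diagrams.</p>\n<p>While Shannon showed how to map logic onto the physical world, Turing showed how to design computers in the language of mathematical logic. When Turing wrote his paper, in 1936, he was trying to solve “the decision problem,” first identified by the mathematician David Hilbert, who asked whether there was an algorithm that could determine whether an arbitrary mathematical statement is true or false. In contrast to Shannon’s paper, Turing’s paper is highly technical. Its primary historical significance lies not in its answer to the decision problem, but in the template for computer design it provided along the way.</p>\n<p>Turing was working in a tradition stretching back to Gottfried Leibniz, the philosophical giant who developed calculus independently of Newton. Among Leibniz’s many contributions to modern thought, one of the most intriguing was the idea of a new language he called the “<a href=\"https://en.wikipedia.org/wiki/Characteristica_universalis\">universal characteristic</a>” that, he imagined, could represent all possible mathematical and scientific knowledge. Inspired in part by the 13th-century religious philosopher <a href=\"https://en.wikipedia.org/wiki/Ramon_Llull\">Ramon Llull</a>, Leibniz postulated that the language would be ideographic like Egyptian hieroglyphics, except characters would correspond to “atomic” concepts of math and science.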
He argued this language would give humankind an “instrument” that could enhance human reason “to a far greater extent than optical instruments” like the microscope and telescope.</p>\n<p>He also <a href=\"http://publicdomainreview.org/2016/11/10/let-us-calculate-leibniz-llull-and-computational-imagination/\">imagined</a> a machine that could process the language, which he called the calculus ratiocinator.</p>\n<blockquote>\n<p>If controversies were to arise, there would be no more need of disputation between two philosophers than between two accountants. For it would suffice to take their pencils in their hands, and say to each other: Calculemus—Let us calculate.</p>\n</blockquote>\n<p>Leibniz didn’t get the opportunity to develop his universal language or the corresponding machine (although he did invent a relatively simple calculating machine, the <a href=\"https://en.wikipedia.org/wiki/Stepped_reckoner\">stepped reckoner</a>). The first credible attempt to realize Leibniz’s dream came in 1879, when the German philosopher Gottlob Frege published his landmark logic treatise <em><a href=\"https://en.wikipedia.org/wiki/Begriffsschrift\">Begriffsschrift</a></em>. Inspired by Boole’s attempt to improve Aristotle’s logic, Frege developed a much more advanced logical system. The logic taught in philosophy and computer-science classes today—first-order or predicate logic—is only a slight modification of Frege’s system.</p>\n<p>Frege is generally considered one of the most important philosophers of the 19th century. Among other things, he is credited with catalyzing what noted philosopher Richard Rorty called the “<a href=\"https://en.wikipedia.org/wiki/Linguistic_turn\">linguistic turn</a>” in philosophy. As Enlightenment philosophy was obsessed with questions of knowledge, philosophy after Frege became obsessed with questions of language. His disciples included two of the most important philosophers of the 20th century—Bertrand Russell and Ludwig Wittgenstein.</p>\n<p>The major innovation of Frege’s logic is that it much more accurately represented the logical structure of ordinary language. Among other things, Frege was the first to use quantifiers (“for every,” “there exists”) and to separate objects from predicates. He was also the first to develop what today are fundamental concepts in computer science like recursive functions and variables with scope and binding.</p>\n<p>Frege’s formal language — what he called his “concept-script” — is made up of meaningless symbols that are manipulated by well-defined rules. The language is only given meaning by an interpretation, which is specified separately (this distinction would later come to be called syntax versus semantics). This turned logic into what the eminent computer scientists Allen Newell and Herbert Simon called “the symbol game,” “played with meaningless tokens according to certain purely syntactic rules.”</p>\n<blockquote>\n<p>All meaning had been purged. One had a mechanical system about which various things could be proved. Thus progress was first made by walking away from all that seemed relevant to meaning and human symbols.</p>\n</blockquote>\n<p>As Bertrand Russell famously quipped: “Mathematics may be defined as the subject in which we never know what we are talking about, nor whether what we are saying is true.”</p>\n<p>An unexpected consequence of Frege’s work was the discovery of weaknesses in the foundations of mathematics.
For example, Euclid’s <em>Elements</em> — considered the gold standard of logical rigor for thousands of years — turned out to be full of logical mistakes. Because Euclid used ordinary words like “line” and “point,” he — and centuries of readers — deceived themselves into making assumptions about sentences that contained those words. To give one relatively simple example, in ordinary usage, the word “line” implies that if you are given three distinct points on a line, one point must be between the other two. But when you define “line” using formal logic, it turns out “between-ness” also needs to be defined—something Euclid overlooked. Formal logic makes gaps like this easy to spot.</p>\n<p>This realization created a <a href=\"https://en.wikipedia.org/wiki/Foundations_of_mathematics#Foundational_crisis\">crisis</a> in the foundation of mathematics. If the <em>Elements</em> — the bible of mathematics — contained logical mistakes, what other fields of mathematics did too? What about sciences like physics that were built on top of mathematics?</p>\n<p>The good news is that the same logical methods used to uncover these errors could also be used to correct them. Mathematicians started rebuilding the foundations of mathematics from the bottom up. In 1889, Giuseppe Peano <a href=\"https://en.wikipedia.org/wiki/Peano_axioms\">developed</a> axioms for arithmetic, and in 1899, David Hilbert <a href=\"https://en.wikipedia.org/wiki/Hilbert%27s_axioms\">did</a> the same for geometry. Hilbert also outlined a program to formalize the remainder of mathematics, with specific requirements that any such attempt should satisfy, including:</p>\n<ul>\n<li><em>Completeness</em>: There should be a proof that all true mathematical statements can be proved in the formal system.</li>\n<li><em>Decidability</em>: There should be an algorithm for deciding the truth or falsity of any mathematical statement. (This is the “<em>Entscheidungsproblem</em>” or “decision problem” referenced in Turing’s paper.)</li>\n</ul>\n<p>Rebuilding mathematics in a way that satisfied these requirements became known as Hilbert’s program. Up through the 1930s, this was the focus of a core group of logicians including Hilbert, Russell, Kurt Gödel, John Von Neumann, Alonzo Church, and, of course, Alan Turing.</p>\n<p>Hilbert’s program proceeded on at least two fronts. On the first front, logicians built formal logical systems and tried to determine whether Hilbert’s requirements could be shown to hold for them.</p>\n<p>On the second front, mathematicians used logical concepts to rebuild classical mathematics. For example, Peano’s system for arithmetic starts with a simple function called the successor function which increases any number by one. He uses the successor function to recursively define <a href=\"https://en.wikipedia.org/wiki/Peano_axioms#Addition\">addition</a>, uses addition to recursively define <a href=\"https://en.wikipedia.org/wiki/Peano_axioms#Multiplication\">multiplication</a>, and so on, until all the operations of number theory are defined. He then uses those definitions, along with formal logic, to prove theorems about arithmetic.</p>
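<p>Peano’s construction translates almost directly into code. Below is a minimal sketch in Python (the encoding of numbers as nested successors is illustrative, not Peano’s own notation):</p>\n<pre><code># Numbers built from zero and the successor function.
ZERO = ('0',)

def succ(n):                  # the successor function: n + 1
    return ('S', n)

def add(a, b):                # addition, defined recursively
    return a if b == ZERO else succ(add(a, b[1]))

def mul(a, b):                # multiplication, defined from addition
    return ZERO if b == ZERO else add(mul(a, b[1]), a)

def to_int(n):                # decode for display only
    return 0 if n == ZERO else 1 + to_int(n[1])

two, three = succ(succ(ZERO)), succ(succ(succ(ZERO)))
assert to_int(add(two, three)) == 5
assert to_int(mul(two, three)) == 6
</code></pre>\n<p>The historian Thomas Kuhn once observed that “in science, novelty emerges only with difficulty.” Logic in the era of Hilbert’s program was a tumultuous process of creation and destruction.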
One logician would build up an elaborate system and another would tear it down.</p>\n<p>The favored tool of destruction was the construction of self-referential, paradoxical statements that showed the axioms from which they were derived to be inconsistent. A simple form of this “liar’s paradox” is the sentence:</p>\n<p>This sentence is false.</p>\n<p>If it is true then it is false, and if it is false then it is true, leading to an endless loop of self-contradiction.</p>\n<p>Russell made the first notable use of the liar’s paradox in mathematical logic. He showed that Frege’s system allowed self-contradicting sets to be derived:</p>\n<blockquote>\n<p>Let <em>R</em> be the set of all sets that are not members of themselves. If <em>R</em> is not a member of itself, then its definition dictates that it must contain itself, and if it contains itself, then it contradicts its own definition as the set of all sets that are not members of themselves.</p>\n</blockquote>\n<p>This became known as Russell’s paradox and was seen as a serious flaw in Frege’s achievement. (Frege himself was shocked by this discovery. He replied to Russell: “Your discovery of the contradiction caused me the greatest surprise and, I would almost say, consternation, since it has shaken the basis on which I intended to build my arithmetic.”)</p>\n<p>Russell and his colleague Alfred North Whitehead put forth the most ambitious attempt to complete Hilbert’s program with the <em>Principia Mathematica</em>, published in three volumes between 1910 and 1913. The <em>Principia’s</em> method was so detailed that it took over 300 pages to get to the proof that 1+1=2.</p>\n<p>Russell and Whitehead tried to resolve Russell’s paradox by introducing what they called type theory. The idea was to partition formal languages into multiple levels or types. Each level could make reference to the levels below it, but not to its own or higher levels. This resolved self-referential paradoxes by, in effect, banning self-reference. (This solution was not popular with logicians, but it did influence computer science — most modern computer languages have features inspired by type theory.)</p>\n<p>Self-referential paradoxes ultimately showed that Hilbert’s program could never be successful. The first blow came in 1931, when Gödel published his now famous incompleteness theorem, which proved that any consistent logical system powerful enough to encompass arithmetic must also contain statements that are true but cannot be proven to be true. (Gödel’s incompleteness theorem is one of the few logical results that has been broadly popularized, thanks to books like <a href=\"https://en.wikipedia.org/wiki/G%C3%B6del,_Escher,_Bach\">Gödel, Escher, Bach</a> and <a href=\"https://www.amazon.com/dp/B00ARGXG7Q/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1\">The Emperor’s New Mind</a>).</p>\n<p>The final blow came when Turing and Alonzo Church independently proved that no algorithm could exist that determined whether an arbitrary mathematical statement was true or false. (Church did this by inventing an entirely different system called the <a href=\"https://en.wikipedia.org/wiki/Lambda_calculus\">lambda calculus</a>, which would later inspire computer languages like <a href=\"https://en.wikipedia.org/wiki/Lisp_%28programming_language%29\">Lisp</a>.)
The answer to the decision problem was negative.</p>\n<p>Turing’s key insight came in the first section of his famous 1936 paper, “On Computable Numbers, With an Application to the <em>Entscheidungsproblem</em>.” In order to rigorously formulate the decision problem (the “<em>Entscheidungsproblem</em>”), Turing first created a mathematical model of what it means to be a computer (today, machines that fit this model are known as “universal Turing machines”). As the logician Martin Davis describes it:</p>\n<blockquote>\n<p>Turing knew that an algorithm is typically specified by a list of rules that a person can follow in a precise mechanical manner, like a recipe in a cookbook. He was able to show that such a person could be limited to a few extremely simple basic actions without changing the final outcome of the computation.</p>\n<p>Then, by proving that no machine performing only those basic actions could determine whether or not a given proposed conclusion follows from given premises using Frege’s rules, he was able to conclude that no algorithm for the Entscheidungsproblem exists.</p>\n<p>As a byproduct, he found a mathematical model of an all-purpose computing machine.</p>\n</blockquote>
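<p>Those “simple basic actions” (read a symbol, write a symbol, move one cell, change state) are easy to simulate. Below is a minimal sketch in Python of a Turing machine that adds one to a binary number; the rule table is illustrative, not Turing’s own:</p>\n<pre><code># A Turing machine: a finite rule table plus an unbounded tape.
# Rules map (state, symbol) to (symbol to write, head move, next state).
RULES = {
    ('right', '0'): ('0', +1, 'right'),   # scan to the end of the number
    ('right', '1'): ('1', +1, 'right'),
    ('right', '_'): ('_', -1, 'carry'),
    ('carry', '1'): ('0', -1, 'carry'),   # flip trailing 1s, carrying
    ('carry', '0'): ('1',  0, 'done'),
    ('carry', '_'): ('1',  0, 'done'),
}

def run(tape_string):
    tape = dict(enumerate(tape_string))   # sparse tape; '_' is blank
    head, state = 0, 'right'
    while state != 'done':
        symbol, move, state = RULES[(state, tape.get(head, '_'))]
        tape[head] = symbol
        head += move
    return ''.join(tape[i] for i in sorted(tape) if tape[i] != '_')

assert run('1011') == '1100'   # 11 + 1 = 12 in binary
</code></pre>\n<p>Next, Turing showed how a program could be stored inside a computer alongside the data upon which it operates. In today’s vocabulary, we’d say that he invented the “stored-program” architecture that underlies most modern computers:</p>\n<blockquote>\n<p>Before Turing, the general supposition was that in dealing with such machines the three categories — machine, program, and data — were entirely separate entities. The machine was a physical object; today we would call it hardware. The program was the plan for doing a computation, perhaps embodied in punched cards or connections of cables in a plugboard. Finally, the data was the numerical input. Turing’s universal machine showed that the distinctness of these three categories is an illusion.</p>\n</blockquote>\n<p>This was the first rigorous demonstration that any computing logic that could be encoded in hardware could also be encoded in software. The architecture Turing described was later dubbed the “Von Neumann architecture” — but modern historians generally agree it came from Turing, as, apparently, did Von Neumann <a href=\"https://en.wikipedia.org/wiki/Alan_Turing#cite_note-36\">himself</a>.</p>\n<p>Although, on a technical level, Hilbert’s program was a failure, the efforts along the way demonstrated that large swaths of mathematics could be constructed from logic. And after Shannon and Turing’s insights—showing the connections between electronics, logic and computing—it was now possible to export this new conceptual machinery over to computer design.</p>\n<p>During World War II, this theoretical work was put into practice, when government labs conscripted a number of elite logicians. Von Neumann joined the atomic bomb project at Los Alamos, where he worked on computer design to support physics research. In 1945, he wrote the <a href=\"http://www.virtualtravelog.net/wp/wp-content/media/2003-08-TheFirstDraft.pdf\">specification</a> of the EDVAC—the first stored-program, logic-based computer—which is generally considered the definitive source guide for modern computer design.</p>\n<p>Turing joined a secret unit at Bletchley Park, northwest of London, where he helped design computers that were instrumental in breaking German codes.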
His most enduring contribution to practical computer design was his specification of the ACE, or Automatic Computing Engine.</p>\n<p>As the first computers to be based on Boolean logic and stored-program architectures, the ACE and the EDVAC were similar in many ways. But they also had interesting differences, some of which foreshadowed modern debates in computer design. Von Neumann’s favored designs were similar to modern CISC (“complex”) processors, baking rich functionality into hardware. Turing’s design was more like modern RISC (“reduced”) processors, minimizing hardware complexity and pushing more work to software.</p>\n<p>Von Neumann thought computer programming would be a tedious, clerical job. Turing, by contrast, said computer programming “should be very fascinating. There need be no real danger of it ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.”</p>\n<p>Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.</p>\n<p>In the past decade or so, programming has started to change with the growing popularity of machine learning, which involves creating frameworks for machines to learn via statistical inference. This has brought programming closer to the other main branch of logic, inductive logic, which deals with inferring rules from specific instances.</p>\n<p>Today’s most promising machine learning techniques use neural networks, which were first <a href=\"http://www.cse.chalmers.se/~coquand/AUTOMATA/mcp.pdf\">invented</a> in the 1940s by Warren McCulloch and Walter Pitts, whose idea was to develop a calculus for neurons that could, like Boolean logic, be used to construct computer circuits. Neural networks remained esoteric until decades later when they were combined with statistical techniques, which allowed them to improve as they were fed more data. Recently, as computers have become increasingly adept at handling large data sets, these techniques have produced remarkable results. Programming in the future will likely mean exposing neural networks to the world and letting them learn.</p>
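<p>The core of McCulloch and Pitts’s “calculus for neurons” is simple enough to sketch: a unit fires when the weighted sum of its inputs reaches a threshold, and with the right weights such units compute Boolean functions. Below is a minimal illustration in Python (a simplified rendering; the weights and names are illustrative, not theirs):</p>\n<pre><code># A McCulloch-Pitts unit: fires (1) when the weighted input sum
# reaches a threshold. Weights and thresholds below are illustrative.
def mp_neuron(inputs, weights, threshold):
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

# With suitable parameters, single units compute Boolean gates:
AND = lambda a, b: mp_neuron((a, b), (1, 1), 2)
OR  = lambda a, b: mp_neuron((a, b), (1, 1), 1)
NOT = lambda a:    mp_neuron((a,),   (-1,),  0)

assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and NOT(1) == 0
</code></pre>\n<p>This would be a fitting second act to the story of computers. Logic began as a way to understand the laws of thought. It then helped create machines that could reason according to the rules of deductive logic. Today, deductive and inductive logic are being combined to create machines that both reason and learn. What began, in Boole’s words, with an investigation “concerning the nature and constitution of the human mind,” could result in the creation of new minds—artificial minds—that might someday match or even exceed our own.</p>",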
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2017/01/16/gadgets-and-computers/",
"title": "Gadgets and Computers",
"description": "From Benedict Evans’ Cars as Feature Phones: This is a common theme in many classes of device: you start with a product that has a few electronic functions added, and ...",
"url": "https://cdixon.org/2017/01/16/gadgets-and-computers/",
"published": "2017-01-16T00:00:00.000Z",
"updated": "2017-01-16T00:00:00.000Z",
"content": "<p>From Benedict Evans’ <a href=\"http://ben-evans.com/benedictevans/2017/01/10/cars-as-featurephones\">Cars as Feature Phones</a>:</p>\n<blockquote>\n<p>This is a common theme in many classes of device: you start with a product that has a few electronic functions added, and then those functions are delivered with chips, and perhaps they gain an interface and then a screen, and more and more functions (and probably multi-function buttons) — and then, somehow, you’ve built a little weird custom computer without actually meaning to, and all the little silos of features and functions become unmanageable, both at an interface level and also at a fundamental engineering level, and the whole thing gets replaced by a real computer with a real software platform. And this new computer is almost certainly made by a different company.\nYou could see this problem very clearly at Motorola, which developed as many as two dozen ‘operating systems’ — for phones, pagers, satellite phones, car-control, industrial devices, chip evaluation boards and so on and so on, and picked them for each device out of a metaphorical parts bin just as you’d choose a sensor or battery or any other component. And boy, they really knew how to write operating systems — they had dozens! With, probably, ‘<a href=\"https://www.technologyreview.com/s/508231/many-cars-have-a-hundred-million-lines-of-code/\">millions of lines of code</a>’. This was exactly the right approach in 1995, but in 2005, again, the whole thing collapsed under its own weight, because they needed software as a platform rather than as a one-off component, and instead <a href=\"http://www.theregister.co.uk/Print/2012/11/29/rockman_on_motorola/\">they had a mess</a>.</p>\n</blockquote>\n<p>The iPhone was the first mainstream cell phone that was also a proper computer. It had a full-fledged operating system and a (mostly) open developer platform. We are likely seeing the same pattern play out across the <a href=\"https://medium.com/software-is-eating-the-world/what-s-next-in-computing-e54b870b80cc#.bmdmkoc13\">next generation of computers</a>: not only cars, but drones, IoT devices, wearables, etc. In the beginning, hardware-focused companies make gadgets with ever increasing laundry lists of features. Then a company with strong software expertise (often a new market entrant) comes along that replaces these feature-packed gadgets with full-fledged computers. These computers have proper (usually Unix-like) operating systems, open developer platforms, and streamlined user interfaces (increasingly, powered by AI).</p>\n<p>This process takes time to play out. Apple waited more than a decade from the initial popularity of cell phones to the release of the first iPhone. And sometimes you don’t know the significance of a new computing device until many years later. It wasn’t obvious until around 2012 that iOS and Android smartphones would become the dominant form of computing (recall Facebook’s “<a href=\"https://techcrunch.com/2012/10/19/facebook-mobile-first/\">pivot to mobile</a>” in 2012). Some people (including me) believe we’ve already entered the “computer phase” of consumer IoT with voice assistants like Alexa, but it will probably take years before we understand the enduring mainstream appeal of these devices.</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2016/09/25/as-edwin-land-ultimately-recognized-the-adoption-of-his-polarized-headlight-system-was-fatally/",
"title": "As [Edwin] Land ultimately recognized, the adoption of his [polarized headlight] system was fatally…",
"description": "As [Edwin] Land ultimately recognized, the adoption of his [polarized headlight] system was fatally hampered by the fact that there was no competitive advantage for any car company in using ...",
"url": "https://cdixon.org/2016/09/25/as-edwin-land-ultimately-recognized-the-adoption-of-his-polarized-headlight-system-was-fatally/",
"published": "2016-09-25T00:00:00.000Z",
"updated": "2016-09-25T00:00:00.000Z",
"content": "<p>As [Edwin] Land ultimately recognized, the adoption of his [polarized headlight] system was fatally hampered by the fact that there was no competitive advantage for any car company in using it first. Since all cars needed to incorporate the technology as simultaneously as possible, it was either going to be all, either voluntarily or as directed by the government, or none. No state or federal governmental agency ever stepped in to direct the adoption of the technology in the way that seat belts would be required decades later. Herbert Nichols, a journalist with the Christian Science Monitor who had followed the story, believed that the industry killed the idea even though the demonstrations clearly showed that the system worked. According to Nichols, the industry concluded that it “just didn’t need anything to sell automobiles. They realized they could sell all the automobiles they could make.” Thus, with no economic or competitive incentive, why bother with a system that clearly added costs and admittedly presented implementation issues? After more than two decades, Land reluctantly gave up the fight.</p>\n<p><strong>But he learned one very important lesson. “I knew then that I would never go into a commercial field that put a barrier between us and the customer.” Rather than deal with other companies as intermediaries, he would market his innovative products directly to the public. He believed “that the role of industry is to sense a deep human need, then bring science and technology to bear on filling that need. Any market already existing is inherently boring and dull.” Land, like Steve Jobs many decades later, believed that his company should “give people products they do not even know they want.” Fortunately, he already had such a product in mind.</strong></p>\n<p>— <em><a href=\"https://www.amazon.com/dp/B00OHRYYFO/\">A Triumph of Genius: Edwin Land, Polaroid, and the Kodak Patent War</a></em></p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2016/08/18/eleven-reasons-to-be-excited-about-the-future-of-technology/",
"title": "Eleven Reasons To Be Excited About The Future of Technology",
"description": "“The strongest force propelling human progress has been the swift advance and wide diffusion of technology.” — The Economist In the year 1820, a person could expect to live less ...",
"url": "https://cdixon.org/2016/08/18/eleven-reasons-to-be-excited-about-the-future-of-technology/",
"published": "2016-08-18T00:00:00.000Z",
"updated": "2016-08-18T00:00:00.000Z",
"content": "<blockquote>\n<p>“The strongest force propelling human progress has been the swift advance and wide diffusion of technology.” — <a href=\"http://www.economist.com/node/841842\">The Economist</a></p>\n</blockquote>\n<p>In the year 1820, a person could <a href=\"https://ourworldindata.org/life-expectancy/\">expect to live</a> less than 35 years, 94% of the global population <a href=\"https://ourworldindata.org/world-poverty/\">lived in extreme poverty</a>, and less that 20% of the population was literate. Today, human life expectancy is over 70 years, less that 10% of the global population lives in extreme poverty, and <a href=\"http://www.oecd.org/statistics/How-was-life.pdf\">over 80% of people</a> are literate. These improvements are due mainly to advances in technology, beginning in the industrial age and continuing today in the information age.</p>\n<p>There are many exciting new technologies that will continue to transform the world and improve human welfare. Here are eleven of them.</p>\n<h2>1. Self-Driving Cars</h2>\n<p>Self-driving cars exist today that are safer than human-driven cars in most driving conditions. Over the next 3–5 years they‘ll get even safer, and will begin to go mainstream.</p>\n<p><img src=\"images/1_HfoJs9tCyyr6VeLvD45wyQ.gif\" alt=\"\"></p>\n<p>The <a href=\"http://www.who.int/mediacentre/factsheets/fs358/en/\">World Health Organization estimates</a> that 1.25 million people die from car-related injuries per year. Half of the deaths are pedestrians, bicyclists, and motorcyclists hit by cars. Cars are the leading cause of death for people ages 15–29 years old.</p>\n<p><img src=\"images/1_SNGdeK4GNUhjL6wlh7sfJw.png\" alt=\"\"></p>\n<p>Just as cars reshaped the world in the 20th century, so will self-driving cars in the 21st century. In most cities, <a href=\"http://oldurbanist.blogspot.com.es/2011/12/we-are-25-looking-at-street-area.html\">between 20–30%</a> of usable space is taken up by parking spaces, and most cars are parked <a href=\"http://www.reinventingparking.org/2013/02/cars-are-parked-95-of-time-lets-check.html\">about 95%</a> of the time. Self-driving cars will be in almost continuous use (most likely hailed from a smartphone app), thereby dramatically reducing the need for parking. Cars will communicate with one another to avoid accidents and traffic jams, and riders will be able to spend commuting time on other activities like work, education, and socializing.</p>\n<p><img src=\"images/1_k6w2wkkREpVeu9_cS2xxtg.png\" alt=\"Source: Tech Insider\"></p>\n<h2>2. Clean Energy</h2>\n<p>Attempts to fight climate change by reducing the demand for energy <a href=\"https://en.wikipedia.org/wiki/World_energy_consumption\">haven’t worked</a>. Fortunately, scientists, engineers, and entrepreneurs have been working hard on the supply side to make clean energy convenient and cost-effective.</p>\n<p>Due to steady technological and manufacturing advances, the price of solar cells has <a href=\"http://www.saskwind.ca/wind-cost-decline/\">dropped 99.5% since 1977</a>. Solar will soon be more cost efficient than fossil fuels. The cost of wind energy has also dropped to an all-time low, and in the last decade represented about a <a href=\"http://energy.gov/articles/top-10-things-you-didnt-know-about-wind-power\">third of newly installed</a> US energy capacity.</p>\n<p>Forward thinking organizations are taking advantage of this. 
For example, in India there is an initiative to convert airports to self-sustaining clean energy.</p>\n<p><img src=\"images/1_idAW1ONI_iIeevzPaUv-pg.png\" alt=\"Airport in Kochi, India (source: Clean Technica)\"></p>\n<p>Tesla is making high-performance, affordable electric cars, and <a href=\"http://www.treehugger.com/cars/tesla-built-858-new-charging-stations-us-over-past-12-months.html\">installing</a> electric charging stations <a href=\"http://mashable.com/2016/04/01/tesla-supercharger-expansion/#v93tzyDFl5qR\">worldwide</a>.</p>\n<p><img src=\"images/1_YwcTRiWETVn4aXiZhEJtcg.png\" alt=\"Tesla Model 3 and US supercharger locations\"></p>\n<p>There are hopeful signs that clean energy could soon be reaching a tipping point. For example, in Japan, there are now more electric charging stations than gas stations.</p>\n<p><img src=\"images/1_RNmY6abYWA2n2W6EgP3lcA.png\" alt=\"Source: The Guardian\"></p>\n<p>And Germany sometimes produces more renewable energy than it can use.</p>\n<p><img src=\"images/1_wETYiSDThJ5fQYIVWuw8aA.png\" alt=\"Source: Time Magazine\"></p>\n<h2>3. Virtual and Augmented Reality</h2>\n<p>Computer processors only recently became fast enough to power comfortable and convincing virtual and augmented reality experiences. Companies like Facebook, Google, Apple, and Microsoft are investing billions of dollars to make VR and AR more immersive, comfortable, and affordable.</p>\n<p><img src=\"images/1_6cmd8P-bPYRU1olrJHsvfw.gif\" alt=\"Toybox demo from Oculus\"></p>\n<p>People sometimes think VR and AR will be used only for gaming, but over time they will be used for all sorts of activities. For example, we’ll use them to manipulate 3-D objects:</p>\n<p><img src=\"images/1_q_pqQCTcTETf4G-ARUm00A.jpeg\" alt=\"Augmented reality computer interface (from Iron Man)\"></p>\n<p>To meet with friends and colleagues from around the world:</p>\n<p><img src=\"images/1_MJcHcqCWEzGxDIVDGpcHcA.jpeg\" alt=\"Augmented reality teleconference (from The Kingsman)\"></p>\n<p>And even for medical applications, like treating phobias or helping rehabilitate paralysis victims:</p>\n<p><img src=\"images/1_q_J7Ql2iVfdDYc5t6hM98Q.png\" alt=\"Source: New Scientist\"></p>\n<p>VR and AR have been dreamed about by science fiction fans for decades. In the next few years, they’ll finally become a mainstream reality.</p>\n<h2>4. Drones and Flying Cars</h2>\n<blockquote>\n<p>“Roads? Where we’re going we don’t need… roads.” — Dr. Emmett Brown</p>\n</blockquote>\n<p>GPS started out as a military technology but is now used to hail taxis, get mapping directions, and hunt Pokémon.
Likewise, drones started out as a military technology, but are increasingly being used for a wide range of consumer and commercial applications.</p>\n<p>For example, drones are being used to inspect critical infrastructure like bridges and power lines, to survey areas struck by natural disasters, and for many other creative purposes, like fighting animal poaching.</p>\n<p><img src=\"images/1_hLhAdWXECMyNLwrHfad6pA.png\" alt=\"Source: NBC News\"></p>\n<p>Amazon and Google are building drones to deliver household items.</p>\n<p><img src=\"images/1_s1eQciCtoaD_AaovzJouAA.gif\" alt=\"Amazon delivery drone\"></p>\n<p>The startup <a href=\"http://flyzipline.com/product/\">Zipline</a> uses drones to deliver medical supplies to remote villages that can’t be accessed by roads.</p>\n<p><img src=\"images/1_BDepNtZOTWXNOi5F4Dk3Dg.png\" alt=\"Source: The Verge\"></p>\n<p>There is also a new wave of startups working on flying cars (including <a href=\"http://www.bloomberg.com/news/articles/2016-06-09/welcome-to-larry-page-s-secret-flying-car-factories\">two</a> funded by the cofounder of Google, Larry Page).</p>\n<p><img src=\"images/1_FJyVIp3MI_k7mVM5obpSsA.png\" alt=\"The Terrafugia TF-X flying car (source)\"></p>\n<p>Flying cars use the same advanced technology used in drones but are large enough to carry people. Due to advances in materials, batteries, and software, flying cars will be significantly more affordable and convenient than today’s planes and helicopters.</p>\n<h2>5. Artificial Intelligence</h2>\n<p><img src=\"images/1_I2dRn7D8ZZM7nI2IvvMFDw.jpeg\" alt=\"\"></p>\n<blockquote>\n<p>“It may be a hundred years before a computer beats humans at Go — maybe even longer.” — <a href=\"http://www.nytimes.com/1997/07/29/science/to-test-a-powerful-computer-play-an-ancient-game.html?pagewanted=all\">New York Times, 1997</a></p>\n<p>“Master of Go Board Game Is Walloped by Google Computer Program” — <a href=\"http://www.nytimes.com/2016/03/10/world/asia/google-alphago-lee-se-dol.html\">New York Times, 2016</a></p>\n</blockquote>\n<p>Artificial intelligence has made rapid advances in the last decade, due to new algorithms and massive increases in data collection and computing power.</p>\n<p>AI can be applied to almost any field. For example, in photography an AI technique called artistic style transfer transforms photographs into the style of a given painter:</p>\n<p><img src=\"images/1_aHFJuj-jhnP4zHY1dD7tRA.png\" alt=\"Source\"></p>\n<p>Google built an AI system that controls its datacenter power systems, saving hundreds of millions of dollars in energy costs.</p>\n<p><img src=\"images/1_HpTNGOsV1a0PpqjQZNXKEQ.png\" alt=\"Source: Bloomberg\"></p>\n<p>The broad promise of AI is to liberate people from repetitive mental tasks the same way the industrial revolution liberated people from repetitive physical tasks.</p>\n<blockquote>\n<p>“If AI can help humans become better chess players, it stands to reason that it can help us become better pilots, better doctors, better judges, better teachers.” — <a href=\"http://www.wired.com/2014/10/future-of-artificial-intelligence/\">Kevin Kelly</a></p>\n</blockquote>\n<p>Some people worry that AI will destroy jobs. History has shown that while new technology does indeed eliminate jobs, it also creates new and better jobs to replace them.
For example, with the advent of the personal computer, the number of typographer jobs dropped, but the increase in graphic designer jobs more than made up for it.</p>\n<p><img src=\"images/1_c_lt2s5TuSoOfmPb_Rv46w.png\" alt=\"Source: Harvard Business Review\"></p>\n<p>It is much easier to imagine jobs that will go away than new jobs that will be created. Today millions of people work as app developers, ride-sharing drivers, drone operators, and social media marketers — jobs that didn’t exist and would have been difficult to even imagine ten years ago.</p>\n<h2>6. Pocket Supercomputers for Everyone</h2>\n<p><img src=\"images/1_5tt6F_Cxnf5n7J5v6Lx0Ug.png\" alt=\"\"></p>\n<p>By 2020, 80% of adults on earth will have an internet-connected smartphone. An iPhone 6 has about 2 billion transistors, roughly 625 times more transistors than a 1995 Intel Pentium computer. Today’s smartphones are what used to be considered supercomputers.</p>\n<p><img src=\"images/1_vovBLv3ePKce3dPrU3q9Lg.png\" alt=\"Visitors to the pope (source: Business Insider)\"></p>\n<p>Internet-connected smartphones give ordinary people abilities that, just a short time ago, were only available to an elite few:</p>\n<blockquote>\n<p>“Right now, a Masai warrior on a mobile phone in the middle of Kenya has better mobile communications than the president did 25 years ago. If he’s on a smart phone using Google, he has access to more information than the U.S. president did just 15 years ago.” — <a href=\"http://edition.cnn.com/2012/05/06/opinion/diamandis-abundance-innovation/\">Peter Diamandis</a></p>\n</blockquote>\n<h2>7. Cryptocurrencies and Blockchains</h2>\n<blockquote>\n<p>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.” — <a href=\"http://farmerandfarmer.org/mastery/builder.html\">Farmer & Farmer</a></p>\n</blockquote>\n<p>Protocols are the plumbing of the internet. Most of the protocols we use today were developed decades ago by academia and government. Since then, protocol development mostly stopped as energy shifted to developing proprietary systems like social networks and messaging apps.</p>\n<p>Cryptocurrency and blockchain technologies are <a href=\"http://avc.com/2016/07/the-golden-age-of-open-protocols/\">changing this</a> by providing a new business model for internet protocols. This year alone, <a href=\"https://medium.com/the-coinbase-blog/app-coins-and-the-dawn-of-the-decentralized-business-model-8b8c951e734f#.2atvp1cxd\">hundreds of millions of dollars</a> were raised for a broad range of innovative blockchain-based protocols.</p>\n<p>Protocols based on blockchains also have capabilities that previous protocols didn’t. For example, <a href=\"https://en.wikipedia.org/wiki/Ethereum\">Ethereum</a> is a new blockchain-based protocol that can be used to create smart contracts and trusted databases that are immune to corruption and censorship.</p>\n<h2>8. High-Quality Online Education</h2>\n<p>While college tuition <a href=\"http://www.cnbc.com/2015/06/16/why-college-costs-are-so-high-and-rising.html\">skyrockets</a>, anyone with a smartphone can study almost any topic online, accessing educational content that is mostly free and increasingly high-quality.</p>\n<p>Encyclopedia Britannica <a href=\"http://www.csmonitor.com/Business/Latest-News-Wires/2012/0314/Encyclopaedia-Britannica-After-244-years-in-print-only-digital-copies-sold\">used to cost $1,400</a>.
Now anyone with a smartphone can instantly access Wikipedia. You used to have to go to school or buy programming books to learn computer programming. Now you can learn from a community of over 40 million programmers at <a href=\"http://stackoverflow.com\">Stack Overflow</a>. YouTube has millions of hours of free tutorials and lectures, many of which are produced by top professors and universities.</p>\n<p><img src=\"images/1_NZTqnqYbOPv6sf7gCVLz8g.png\" alt=\"UC Berkeley Physics on Youtube\"></p>\n<p>The quality of online education is getting better all the time. For the last 15 years, <a href=\"http://ocw.mit.edu/index.htm\">MIT has been recording lectures</a> and compiling materials that cover over 2000 courses.</p>\n<blockquote>\n<p>“The idea is simple: to publish all of our course materials online and make them widely available to everyone.” — Dick K.P. Yue, Professor, MIT School of Engineering</p>\n</blockquote>\n<p>As perhaps the greatest research university in the world, MIT has always been ahead of the trends. Over the next decade, expect many other schools to follow MIT’s lead.</p>\n<p><img src=\"images/1_W-i0QTotXS-K4MU9qbpylQ.png\" alt=\"Source: Futurism\"></p>\n<h2>9. Better Food through Science</h2>\n<p><img src=\"images/1_O5VQyJRhI2-sHYzZPrHSBQ.png\" alt=\"Source: National Geographic\"></p>\n<p>Earth is running out of farmable land and fresh water. This is partly because our food production systems are incredibly inefficient. It takes an astounding 1,799 gallons of water to produce 1 pound of beef.</p>\n<p>Fortunately, a variety of new technologies are being developed to improve our food system.</p>\n<p>For example, entrepreneurs are developing new food products that are tasty and nutritious substitutes for traditional foods but far more environmentally friendly. The startup <a href=\"http://www.impossiblefoods.com/\">Impossible Foods</a> invented meat products that look and taste like the real thing but are actually made of plants.</p>\n<p><img src=\"images/1_bUV4b3Xp0mvvdA8dp1hMtA.png\" alt=\"Impossible Food’s plant-based burger (source: Tech Insider)\"></p>\n<p>Their burger <a href=\"http://www.impossiblefoods.com/our-burger\">uses</a> 95% less land and 74% less water, and produces 87% less greenhouse gas emissions than traditional burgers. Other startups are creating plant-based replacements for <a href=\"http://ripplefoods.com/\">milk</a>, <a href=\"https://www.hamptoncreek.com/\">eggs</a>, and other common foods. <a href=\"http://soylent.com/\">Soylent</a> is a healthy, inexpensive meal replacement that uses advanced engineered <a href=\"http://terravia.com/Terravia_Sustainability.pdf\">ingredients</a> that are much friendlier to the environment than traditional ingredients.</p>\n<p>Some of these products are developed using genetic modification, a powerful scientific technique that has been widely mischaracterized as dangerous. According to a <a href=\"https://www.geneticliteracyproject.org/2015/01/29/pewaaas-study-scientific-consensus-on-gmo-safety-stronger-than-for-global-warming/\">study</a> by the Pew Organization, 88% of scientists think genetically modified foods are safe.</p>\n<p>Another exciting development in food production is automated indoor farming.
Due to advances in solar energy, sensors, lighting, robotics, and artificial intelligence, indoor farms have become viable alternatives to traditional outdoor farms.</p>\n<p><img src=\"images/1_0Jyjlgj1KU2yfBqo7quCLQ.png\" alt=\"Aerofarms indoor farm (Source: New York Times)\"></p>\n<p>Compared to traditional farms, automated indoor farms use roughly 10 times less water and land. Crops are harvested many more times per year, there is no dependency on weather, and there is no need for pesticides.</p>\n<h2>10. Computerized Medicine</h2>\n<p>Until recently, computers have only been at the periphery of medicine, used primarily for research and record keeping. Today, the combination of computer science and medicine is leading to a variety of breakthroughs.</p>\n<p><img src=\"images/1_IjKrWZdlbB2ksis_Dmia5A.png\" alt=\"\"></p>\n<p>For example, just fifteen years ago, it cost $3B to sequence a human genome. Today, the cost is about a thousand dollars and continues to drop. Genetic sequencing will soon be a routine part of medicine.</p>\n<p>Genetic sequencing generates massive amounts of data that can be analyzed using powerful data analysis software. One application is analyzing <a href=\"http://a16z.com/2016/06/09/freenome/\">blood samples</a> for early detection of cancer. Further genetic analysis can help determine the <a href=\"http://www.businessinsider.com/super-cheap-genome-sequencing-by-2020-2014-10\">best course</a> of treatment.</p>\n<p>Another application of computers to medicine is in prosthetic limbs. Here a young girl is using prosthetic hands that she controls with her upper-arm muscles:</p>\n<p><img src=\"images/1_jVH1wxchOJ5qJzT46s907A.gif\" alt=\"Source: Open Bionics\"></p>\n<p>Soon we’ll have the technology to control prosthetic limbs with just our thoughts using <a href=\"http://news.uci.edu/feature/to-walk-again/\">brain-to-machine interfaces</a>.</p>\n<p>Computers are also becoming increasingly effective at diagnosing diseases. By finding hidden patterns in 20 million cancer records, an artificial intelligence system recently diagnosed a rare disease that human doctors had failed to identify.</p>\n<p><img src=\"images/1_OEgWlj9sp2mCV0PrT9yp8A.png\" alt=\"Source: International Business Times\"></p>\n<h2>11. A New Space Age</h2>\n<p>Since the beginning of the space age in the 1950s, the vast majority of space funding has come from governments. But that funding has been in decline: for example, NASA’s budget <a href=\"https://en.wikipedia.org/wiki/Budget_of_NASA\">dropped</a> from about 4.5% of the federal budget in the 1960s to about 0.5% today.</p>\n<p><img src=\"images/1_paniidrx59zPQjq_q6rUHA.png\" alt=\"Source: Fortune\"></p>\n<p>The good news is that private space companies have started filling the void. 
These companies provide a wide range of products and services, including rocket launches, scientific research, communications and imaging satellites, and emerging speculative business models like asteroid mining.</p>\n<p>The most famous private space company is Elon Musk’s SpaceX, which has successfully launched rockets that can return home to be reused.</p>\n<p><img src=\"images/1_5iiaQsTBu1tQ_hTy8fupXg.gif\" alt=\"SpaceX Falcon 9 landing\"></p>\n<p>Perhaps the most intriguing private space company is <a href=\"http://www.planetaryresources.com/\">Planetary Resources</a>, which is trying to pioneer a new industry: mining minerals from asteroids.</p>\n<p><img src=\"images/1_6zvea6z14lJ6inZQsVBsBA.png\" alt=\"Asteroid mining\"></p>\n<p>If successful, asteroid mining could lead to a new gold rush in outer space. Like previous gold rushes, this could lead to speculative excess, but also to dramatically increased funding for new technologies and infrastructure.</p>\n<hr>\n<p>These are just a few of the amazing technologies we’ll see developed in the coming decades. 2016 is just the beginning of a new age of wonders. As futurist Kevin Kelly <a href=\"https://www.linkedin.com/pulse/internet-still-beginning-its-kevin-kelly\">says</a>:</p>\n<blockquote>\n<p>If we could climb into a time machine, journey 30 years into the future, and from that vantage look back to today, we’d realize that most of the greatest products running the lives of citizens in 2050 were not invented until after 2016. People in the future will look at their holodecks and wearable virtual reality contact lenses and downloadable avatars and AI interfaces and say, “Oh, you didn’t really have the internet” — or whatever they’ll call it — “back then.”</p>\n<p>So, the truth: Right now, today, in 2016 is the best time to start up. There has never been a better day in the whole history of the world to invent something. There has never been a better time with more opportunities, more openings, lower barriers, higher benefit/risk ratios, better returns, greater upside than now. Right now, this minute. This is the moment that folks in the future will look back at and say, “Oh, to have been alive and well back then!”</p>\n</blockquote>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2016/08/07/steve-jobs-supposedly-said-returning-to-apple-that-his-plan-was-to-stay-alive-and-grab-onto-the/",
"title": "“Steve Jobs supposedly said, returning to Apple, that his plan was to stay alive and grab onto the…",
"description": "“Steve Jobs supposedly said, returning to Apple, that his plan was to stay alive and grab onto the next big thing — to listen for the footsteps. He tried video, ...",
"url": "https://cdixon.org/2016/08/07/steve-jobs-supposedly-said-returning-to-apple-that-his-plan-was-to-stay-alive-and-grab-onto-the/",
"published": "2016-08-07T00:00:00.000Z",
"updated": "2016-08-07T00:00:00.000Z",
"content": "<p>“Steve Jobs supposedly said, returning to Apple, that his plan was to stay alive and grab onto the next big thing — to listen for the footsteps. He tried video, and a few other things, but he got there in the end. But he might not have.”</p>\n<p>From: <a href=\"http://ben-evans.com/benedictevans/2016/5/2/inevitability-in-technology\">Inevitability in technology</a></p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2016/08/07/source-ethereum-org/",
"title": "“Ether is a necessary element — a fuel — for operating the distributed application platform…",
"description": "“Ether is a necessary element — a fuel — for operating the distributed application platform Ethereum. It is a form of payment made by the clients of the platform to ...",
"url": "https://cdixon.org/2016/08/07/source-ethereum-org/",
"published": "2016-08-07T00:00:00.000Z",
"updated": "2016-08-07T00:00:00.000Z",
"content": "<p>“Ether is a necessary element — a fuel — for operating the distributed application platform Ethereum. It is a form of payment made by the clients of the platform to the machines executing the requested operations. To put it another way, ether is the incentive ensuring that developers write quality applications (wasteful code costs more), and that the network remains healthy (people are compensated for their contributed resources).</p>\n<p>Ether is to be treated as “crypto-fuel”, a token whose purpose is to pay for computation, and is not intended to be used as or considered a currency, asset, share or anything else.”</p>\n<p><em>Source: <a href=\"https://ethereum.org/ether\">ethereum.org</a></em></p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2016/07/30/if-you-asked-people-in-1989-what-they-needed-to-make-their-life-better-it-was-unlikely-that-they/",
"title": "“If you asked people in 1989 what they needed to make their life better, it was unlikely that they…",
"description": "“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are ...",
"url": "https://cdixon.org/2016/07/30/if-you-asked-people-in-1989-what-they-needed-to-make-their-life-better-it-was-unlikely-that-they/",
"published": "2016-07-30T00:00:00.000Z",
"updated": "2016-07-30T00:00:00.000Z",
"content": "<p>“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.”</p>\n<p>— <a href=\"http://farmerandfarmer.org/mastery/builder.html\">Farmer & Farmer\n</a></p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2016/05/11/the-typical-path-of-how-people-respond-to-life-changing-inventions/",
"title": "“The typical path of how people respond to life-changing inventions",
"description": "I’ve never heard of it. I’ve heard of it but don’t understand it. I understand it, but I don’t see how it’s useful. I see how it could be fun ...",
"url": "https://cdixon.org/2016/05/11/the-typical-path-of-how-people-respond-to-life-changing-inventions/",
"published": "2016-05-11T00:00:00.000Z",
"updated": "2016-05-11T00:00:00.000Z",
"content": "<ol>\n<li>\n<p>I’ve never heard of it.</p>\n</li>\n<li>\n<p>I’ve heard of it but don’t understand it.</p>\n</li>\n<li>\n<p>I understand it, but I don’t see how it’s useful.</p>\n</li>\n<li>\n<p>I see how it could be fun for rich people, but not me.</p>\n</li>\n<li>\n<p>I use it, but it’s just a toy.</p>\n</li>\n<li>\n<p>It’s becoming more useful for me.</p>\n</li>\n<li>\n<p>I use it all the time.</p>\n</li>\n<li>\n<p>I could not imagine life without it.</p>\n</li>\n<li>\n<p>Seriously, people lived without it?</p>\n</li>\n<li>\n<p>It’s too powerful and needs to be regulated”</p>\n</li>\n</ol>\n<p><em>Credits:</em></p>\n<p><em>#1–#9 by <a href=\"http://time.com/author/morgan-housel-the-motley-fool/\">Morgan Housel</a>, <a href=\"http://time.com/money/3940273/innovation-isnt-dead/\">Time</a></em></p>\n<p><em>#10 by <a href=\"https://twitter.com/peterpeirce/status/616664561068994560?lang=en\">@peterpeirce</a></em></p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2016/04/02/comma-ai/",
"title": "Comma.ai",
"description": "I wrote a blog post last month highlighting some of the exciting trends in the computing industry. One trend I discussed is the rapid progress in a branch of artificial ...",
"url": "https://cdixon.org/2016/04/02/comma-ai/",
"published": "2016-04-02T00:00:00.000Z",
"updated": "2016-04-02T00:00:00.000Z",
"content": "<p>I wrote a blog post last month highlighting some of the exciting trends in the computing industry. One trend I discussed is the rapid progress in a branch of artificial intelligence called deep learning. Big tech companies are making significant investments in deep learning, but there are also opportunities for startups:</p>\n<blockquote>\n<p>Many of the papers, <a href=\"https://code.google.com/archive/p/word2vec/\">data</a> <a href=\"http://image-net.org/download-images\">sets</a>, and <a href=\"https://www.tensorflow.org/\">software</a> <a href=\"http://deeplearning.net/software/theano/\">tools</a> related to deep learning have been open sourced. This has had a democratizing effect, allowing individuals and small organizations to build powerful applications. WhatsApp was able to build a global messaging system that <a href=\"http://www.wired.com/2015/09/whatsapp-serves-900-million-users-50-engineers/\">served 900M users with just 50 engineers</a>, compared to the thousands of engineers that were needed for prior generations of messaging systems. This “<a href=\"https://twitter.com/cdixon/status/473221599189954562\">WhatsApp effect</a>” is now happening in AI. Software tools like <a href=\"http://deeplearning.net/software/theano/\">Theano</a> and <a href=\"https://www.tensorflow.org/\">TensorFlow</a>, combined with cloud data centers for training, and inexpensive GPUs for deployment, allow small teams of engineers to build state-of-the-art AI systems.</p>\n</blockquote>\n<p>You might have seen <a href=\"http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/\">recent press</a> coverage of a software developer named George Hotz who built his own self-driving car.</p>\n<p><img src=\"images/1U00Hr0kDEBcGUf87W4iPcQ.png\" alt=\"\"></p>\n<p>I first met George a few months ago, and, like a lot of people who had seen the press coverage, I was skeptical. How could someone build such an advanced system all by himself? After spending time with George, my skepticism turned into enthusiasm. I tested his car, and, along with some of my colleagues and friends with AI expertise, dug into the details of the deep learning system he’d developed.</p>\n<p><img src=\"images/1xJP7l8qL4IbNyJnwHYNwdA.gif\" alt=\"Comma’s self-driving car\"></p>\n<p>I came away convinced that George’s system is a textbook example of the “WhatsApp effect” happening to AI.</p>\n<p><img src=\"images/1d9qMneOOvDP2WHCxgakQkw.png\" alt=\"George with test car #1\"></p>\n<p>George is certainly brilliant (he’s a <a href=\"https://en.wikipedia.org/wiki/George_Hotz\">famous hacker</a> for a reason), and he’s no longer alone: he’s now working with a small team of machine learning experts. But he’s also riding a wave of exponentially improving hardware, software, and, most importantly, data. The more his system gets used, the more data it collects, and the smarter it becomes.</p>\n<p>Today we are announcing that <a href=\"http://a16z.com/\">a16z</a> is leading a $3.1M investment in George’s company, <a href=\"http://comma.ai/\">Comma.ai</a>. This investment will help them continue to build their team (they’re <a href=\"http://comma.ai/hiring.html\">hiring</a>), and bring their technology to market. Expect more announcements from Comma in the next few months. We are very excited to support George and his team on this ambitious project.</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2016/03/13/the-internet-economy/",
"title": "The Internet Economy",
"description": "We are living in an era of bundling. The big five consumer tech companies — Google, Apple, Facebook, Amazon, and Microsoft — have moved far beyond their original product lines ...",
"url": "https://cdixon.org/2016/03/13/the-internet-economy/",
"published": "2016-03-13T00:00:00.000Z",
"updated": "2016-03-13T00:00:00.000Z",
"content": "<p>We are living in an era of bundling. The big five consumer tech companies — Google, Apple, Facebook, Amazon, and Microsoft — have moved far beyond their original product lines into all sorts of hardware, software, and services that overlap and compete with one another. But their revenues and profits still depend heavily on external technologies that are outside of their control. One way to visualize these external dependencies is to consider the path of a typical internet session, from the user to some revenue-generating action, and then (in some cases) back again to the user:</p>\n<p><img src=\"images/1bUnzLePRb7E25uoUEMYQgA.png\" alt=\"\"></p>\n<p>When evaluating an internet company’s strategic position (the defensibility of its profit <a href=\"http://www.investopedia.com/terms/e/economicmoat.asp\">moat</a>), you need to consider: 1) how the company generates revenue and profits, 2) the loop in its entirety, not just the layers in which the company has products.</p>\n<p>For example, it might seem counterintuitive that Amazon is a <a href=\"/2010/05/22/while-google-fights-on-the-edges-amazon-is-attacking-their-core/\">major threat</a> to Google’s core search business. But you can see this by following the money through the loop: a <a href=\"http://www.wordstream.com/articles/google-earnings\">significant portion</a> of Google’s revenue comes from search queries for things that can be bought on Amazon, and the buying experience on Amazon (from initial purchasing intent to consumption/unboxing) is significantly better than the buying experience on most non-Amazon e-commerce sites you find via Google searches. After a while, shoppers learn to skip Google and go straight to Amazon.</p>\n<p>Think of the internet economic loop as a model train track. Positions in front of you can redirect traffic around you. Positions after you can build new tracks that bypass you. New technologies come along (which often look <a href=\"/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy/\">toy-like</a> and unthreatening at first) that create entirely new tracks that render the previous tracks obsolete.</p>\n<p>There are interesting developments happening at each layer of the loop (and there are many smaller, offshoot loops not depicted in the chart above), but at any given time certain layers are industry flash points. The most prominent recent battle was between mobile devices and operating systems. That battle seems to be over, with Android software and iOS devices having won. Possible future flash points include:</p>\n<p><strong>The automation of logistics.</strong> Today’s logistics network is a patchwork of ships, planes, trucks, warehouses, and people. Tomorrow’s network will include significantly more automation, from robotic warehouses to autonomous cars, trucks, drones, and <a href=\"http://fortune.com/2016/04/06/dispatch-carry-delivery-robot/\">delivery bots</a>. This transition will happen in stages, depending on the economics of specific goods and customers, along with geographic and regulatory factors. Amazon of course has a huge advantage in logistics. Google has tried repeatedly to get into logistics with <a href=\"http://recode.net/2015/08/19/google-express-plans-to-shut-down-its-two-delivery-hubs/\">little success</a>. On-demand ride-sharing and delivery startups could play an interesting role here. The logistics layer is critical for e-commerce, which in turn is critical for monetizing search. 
Amazon’s dominance in logistics gives it a very strong strategic moat as e-commerce continues to take market share from traditional retail.</p>\n<p><strong>Web vs apps</strong>. The mobile web <a href=\"/2014/04/07/the-decline-of-the-mobile-web/\">is</a> <a href=\"http://daringfireball.net/2014/04/rethinking_what_we_mean_by_mobile_web\">arguably</a> in decline: users are spending more time on mobile devices, and more time in apps instead of web browsers. Apple has joined the app side of this battle (e.g. allowing ad blockers in Safari, encouraging app install <a href=\"https://developer.apple.com/library/ios/documentation/AppleApplications/Reference/SafariWebContent/PromotingAppswithAppBanners/PromotingAppswithAppBanners.html\">smart banners</a> above websites). Facebook has also taken the app side (e.g. encouraging publishers to use <a href=\"https://instantarticles.fb.com/\">Instant Articles</a> instead of web views). Google of course needs a vibrant web for its search engine to remain useful, so has joined the web side of the battle (e.g. <a href=\"http://techcrunch.com/2015/09/01/death-to-app-install-interstitials/\">punishing websites</a> that have interstitial app ads, developing <a href=\"https://www.ampproject.org/\">technologies</a> that reduce website loading times). The realistic danger isn’t that the web disappears, but that it gets marginalized, and that the bulk of monetizable internet activities happen in apps or other interfaces like voice or messaging bots. This shift could have a significant effect on web publishers who rely on older business models like non-native ads, and could make it harder for small startups to grow beyond niche use cases.</p>\n<p><strong>Video: from TV to mobile devices.</strong> Internet companies are betting that video consumption will continue to shift from TV to mobile devices. The hope is that this will not only create compelling user experiences, but also unlock access to the tens of billions of ad dollars that are currently spent on TV.</p>\n<blockquote>\n<p>“I think video is a mega trend, almost as big as mobile.” — <a href=\"https://twitter.com/cdixon/status/706198805922902018\">Mark Zuckerberg</a></p>\n</blockquote>\n<p>Last decade, the internet won the market for ads that harvest purchasing intent (ads that used to appear in newspapers and yellow pages), with most of the winnings going to Google. The question for the next decade is who will win the market for ads that generate purchasing intent (so far the winner is Facebook, followed by Google). Most likely this will depend on who controls the user flow to video advertising. Today, the biggest video platforms are Facebook and YouTube, but expect video to get embedded into almost every internet service, similar to how the internet transitioned from text-heavy to image-heavy services last decade.</p>\n<p><strong>Voice: baking search into the OS.</strong> Voice bots like Siri, Google Now, and Alexa embed search-like capabilities directly into the operating system. Today, the quality of voice interfaces isn’t good enough to replace visual computing interfaces for most activities. However, artificial intelligence is <a href=\"https://medium.com/software-is-eating-the-world/what-s-next-in-computing-e54b870b80cc#.kyn1qnbvj\">improving</a> rapidly. 
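</p>\n<p>To make concrete how limited today’s simple voice interfaces can be, here is a toy sketch, in Python, of keyword-style intent routing. The intents and keywords below are invented for illustration; real assistants like Siri or Alexa use far richer models than this.</p>\n<pre><code># Toy sketch of keyword-based intent routing (illustrative only).\n# Real voice assistants use far more sophisticated language models.\nINTENTS = {\n    'weather': ['weather', 'rain', 'forecast'],\n    'timer': ['timer', 'alarm', 'remind'],\n    'search': ['who', 'what', 'where'],\n}\n\ndef route(utterance):\n    words = utterance.lower().split()\n    for intent, keywords in INTENTS.items():\n        if any(k in words for k in keywords):\n            return intent\n    return 'fallback'  # anything nuanced ends up here\n\nprint(route('what is the weather today'))  # weather\n</code></pre>\n<p>Anything beyond such canned patterns lands in the fallback bucket, which is why richer conversations depend on the AI progress just described. 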
Voice bots should be able to handle much more nuanced and interactive conversations in the near future.</p>\n<p>Amazon’s <a href=\"https://developer.amazon.com/public/solutions/alexa/alexa-voice-service\">vision</a> here is the most ambitious: to embed voice services in every possible device, thereby reducing the importance of the device, OS, and application layers (it’s no coincidence that those are also the layers in which Amazon is the weakest). But all the big tech companies are investing heavily in voice and AI. As Google CEO Sundar Pichai recently <a href=\"https://googleblog.blogspot.com/2016/04/this-years-founders-letter.html\">said</a>:</p>\n<blockquote>\n<p>The next big step will be for the very concept of the “device” to fade away. Over time, the computer itself — whatever its form factor — will be an intelligent assistant helping you through your day. We will move from mobile first to an AI first world.</p>\n</blockquote>\n<p>This would mean that AI interfaces — which in most cases will mean voice interfaces — could become the master routers of the internet economic loop, rendering many of the other layers interchangeable or irrelevant. Voice is mostly a novelty today, but in technology the <a href=\"/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy/\">next big thing</a> often starts out looking that way.</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
},
{
"id": "https://cdixon.org/2016/02/21/what-s-next-in-computing/",
"title": "What’s Next in Computing?",
"description": "The computing industry progresses in two mostly independent cycles: financial and product cycles. There has been a lot of handwringing lately about where we are in the financial cycle. Financial ...",
"url": "https://cdixon.org/2016/02/21/what-s-next-in-computing/",
"published": "2016-02-21T00:00:00.000Z",
"updated": "2016-02-21T00:00:00.000Z",
"content": "<p>The computing industry progresses in two mostly independent cycles: financial and product cycles. There has been a lot of handwringing lately about where we are in the financial cycle. Financial markets get a lot of attention. They tend to fluctuate unpredictably and sometimes wildly. The product cycle by comparison gets relatively little attention, even though it is what actually drives the computing industry forward. We can try to understand and predict the product cycle by studying the past and extrapolating into the future.</p>\n<p><img src=\"images/1_Gzmn-yCmeOGEVPrrq9esMA.png\" alt=\"New computing eras have occurred every 10–15 years\"></p>\n<p>Tech product cycles are mutually reinforcing interactions between platforms and applications. New platforms enable new applications, which in turn make the new platforms more valuable, creating a positive feedback loop. Smaller, offshoot tech cycles happen all the time, but every once in a while — historically, about every 10 to 15 years — major new cycles begin that completely reshape the computing landscape.</p>\n<p><img src=\"images/1_oOZjdUvjYRlrFtYUKLIMGg.png\" alt=\"Financial and product cycles evolve mostly independently\"></p>\n<p>The PC enabled entrepreneurs to create word processors, spreadsheets, and many other desktop applications. The internet enabled search engines, e-commerce, e-mail and messaging, social networking, SaaS business applications, and many other services. Smartphones enabled mobile messaging, mobile social networking, and on-demand services like ride sharing. Today, we are in the middle of the mobile era. It is likely that many more mobile innovations are still to come.</p>\n<p>Each product era can be divided into two phases: 1) <em>the gestation phase</em>, when the new platform is first introduced but is expensive, incomplete, and/or difficult to use, 2) <em>the growth phase</em>, when a new product comes along that solves those problems, kicking off a period of exponential growth.</p>\n<p>The Apple II was released in 1977 (and the Altair in 1975), but it was the release of the IBM PC in 1981 that kicked off the PC growth phase.</p>\n<p><img src=\"images/1_vfatwon6YWQGRvYad2ggqw.png\" alt=\"PC sales per year (thousands), source: http://jeremyreimer.com/m-item.lsp?i=137\"></p>\n<p>The internet’s gestation phase took place in the <a href=\"https://en.wikipedia.org/wiki/National_Science_Foundation_Network\">80s and early 90s</a> when it was mostly a text-based tool used by academia and government. The release of the Mosaic web browser in 1993 started the growth phase, which has continued ever since.</p>\n<p><img src=\"images/1_6jgrfjHpBKlObla1x0NYtg.png\" alt=\"Worldwide internet users, source: http://churchm.ag/numbers-internet-use/\"></p>\n<p>There were feature phones in the 90s and early smartphones like the Sidekick and Blackberry in the early 2000s, but the smartphone growth phase really started in 2007–8 with the release of the iPhone and then Android. Smartphone adoption has since exploded: about 2B people have smartphones today. By 2020, <a href=\"http://ben-evans.com/benedictevans/2014/10/28/presentation-mobile-is-eating-the-world\">80% of the global population</a> will have one.</p>\n<p><img src=\"images/1_8o0-IQSyDQ0KRxSVV2njdA.png\" alt=\"Worldwide smartphone sales per year (millions)\"></p>\n<p>If the 10–15 year pattern repeats itself, the next computing era should enter its growth phase in the next few years. In that scenario, we should already be in the gestation phase. 
There are a number of important trends in both hardware and software that give us a glimpse into what the next era of computing might be. Here I talk about those trends and then make some suggestions about what the future might look like.</p>\n<h2>Hardware: small, cheap, and ubiquitous</h2>\n<p>In the mainframe era, only large organizations could afford a computer. Minicomputers were affordable for smaller organizations, PCs for homes and offices, and smartphones for individuals.</p>\n<p><img src=\"images/1_gZQE6-shm1dqgJAbmNn6ww.png\" alt=\"Computers are getting steadily smaller, source: http://www.nature.com/news/the-chips-are-down-for-moore-s-law-1.19338\"></p>\n<p>We are now entering an era in which processors and sensors are getting so small and cheap that there will be many more computers than there are people.</p>\n<p>There are two reasons for this. One is the steady progress of the semiconductor industry over the past 50 years (<a href=\"https://en.wikipedia.org/wiki/Moore%27s_law\">Moore’s law</a>). The second is what Chris Anderson <a href=\"http://foreignpolicy.com/2013/04/29/epiphanies-from-chris-anderson/\">calls</a> “the peace dividend of the smartphone war”: the runaway success of smartphones led to massive investments in processors and sensors. If you disassemble a modern drone, VR headset, or IoT device, you’ll find mostly smartphone components.</p>\n<p>In the modern semiconductor era, the focus has shifted from standalone CPUs to <a href=\"https://medium.com/@magicsilicon/how-the-soc-is-displacing-the-cpu-49bc7503edab#.h6wfmbk8n\">bundles</a> of specialized chips known as systems-on-a-chip.</p>\n<p><img src=\"images/1_SwUUpb2cjLIPFa3-8U9LzQ.png\" alt=\"Computer prices have been steadily dropping, source: https://medium.com/@magicsilicon/computing-transitions-22c07b9c457a#.j4cm9m6qu%5C\"></p>\n<p>Typical systems-on-a-chip bundle energy-efficient ARM CPUs plus specialized chips for graphics processing, communications, power management, video processing, and more.</p>\n<p><img src=\"images/1_Wz-CMXmQFd64yFKWFfHefQ.jpeg\" alt=\"Raspberry Pi Zero: 1 GHz Linux computer for $5\"></p>\n<p>This new architecture has dropped the price of basic computing systems from about $100 to about $10. The <a href=\"https://www.raspberrypi.org/blog/raspberry-pi-zero/\">Raspberry Pi Zero</a> is a 1 GHz Linux computer that you can buy for $5. For a similar price you can buy a <a href=\"http://makezine.com/2015/04/01/esp8266-5-microcontroller-wi-fi-now-arduino-compatible/\">wifi-enabled microcontroller</a> that runs a version of Python. Soon these chips will cost less than a dollar. It will be cost-effective to embed a computer in almost anything.</p>\n<p>Meanwhile, there are still impressive performance improvements happening in high-end processors. Of particular importance are GPUs (graphics processors), the best of which are made by Nvidia. GPUs are useful not only for traditional graphics processing, but also for machine learning algorithms and virtual/augmented reality devices. 
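</p>\n<p>To see why GPUs matter for machine learning, here is a minimal sketch using PyTorch (our choice of framework for illustration; it is not one named in this post). The dense matrix multiplication at the heart of deep learning moves from CPU to GPU by changing a single device parameter:</p>\n<pre><code># Minimal sketch: the same matrix multiply on CPU or GPU.\n# PyTorch is an illustrative choice, not one named in the post.\nimport torch\n\ndevice = 'cuda' if torch.cuda.is_available() else 'cpu'\n\na = torch.randn(2048, 2048, device=device)\nb = torch.randn(2048, 2048, device=device)\nc = a @ b  # on a GPU this runs across thousands of parallel cores\nprint(device, c.shape)\n</code></pre>\n<p>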
Nvidia’s <a href=\"http://www.extremetech.com/gaming/201417-nvidias-2016-roadmap-shows-huge-performance-gains-from-upcoming-pascal-architecture\">roadmap</a> promises significant performance improvements in the coming years.</p>\n<p><img src=\"images/1_jSQ-qKGSVgW4rSwA0dk9ZQ.png\" alt=\"Google’s quantum computer, source: https://www.technologyreview.com/s/544421/googles-quantum-dream-machine/\"></p>\n<p>A wildcard technology is quantum computing, which today exists mostly in laboratories but, if made commercially viable, could lead to orders-of-magnitude performance improvements for certain classes of algorithms in fields like biology and artificial intelligence.</p>\n<h2>Software: the golden age of AI</h2>\n<p>There are many exciting things happening in software today. Distributed systems is one good example. As the number of devices has grown exponentially, it has become increasingly important to 1) parallelize tasks across multiple machines, and 2) communicate and coordinate among devices. Interesting distributed systems technologies include systems like <a href=\"http://hadoop.apache.org/\">Hadoop</a> and <a href=\"https://amplab.cs.berkeley.edu/projects/spark-lightning-fast-cluster-computing/\">Spark</a> for parallelizing big data problems, and Bitcoin/blockchain for securing data and assets.</p>\n<p>But perhaps the most exciting software breakthroughs are happening in artificial intelligence (AI). AI has a long history of hype and disappointment. Alan Turing himself <a href=\"http://loebner.net/Prizef/TuringArticle.html\">predicted</a> that machines would be able to successfully imitate humans by the year 2000. However, there are good reasons to think that AI might now finally be entering a golden age.</p>\n<blockquote>\n<p>“Machine learning is a core, transformative way by which we’re rethinking everything we’re doing.” — Google CEO Sundar Pichai</p>\n</blockquote>\n<p>A lot of the excitement in AI has focused on deep learning, a machine learning technique that was <a href=\"http://www.nytimes.com/2012/06/26/technology/in-a-big-network-of-computers-evidence-of-machine-learning.html?pagewanted=all\">popularized</a> by a now famous 2012 Google project that used a giant cluster of computers to learn to identify cats in YouTube videos. Deep learning is a descendant of neural networks, a technology that <a href=\"https://en.wikipedia.org/wiki/Artificial_neural_network#History\">dates back</a> to the 1940s. It was brought back to life by a <a href=\"http://www.wired.com/2014/10/future-of-artificial-intelligence/\">combination</a> of factors, including new algorithms, cheap parallel computation, and the widespread availability of large data sets.</p>\n<p><img src=\"images/1_P4BXse9pJYAUbasCEkQanA.png\" alt=\"ImageNet challenge error rates, source: http://www.slideshare.net/nervanasys/sd-meetup-12215 (red line = human performance)\"></p>\n<p>It’s tempting to dismiss deep learning as another Silicon Valley buzzword. The excitement, however, is supported by impressive theoretical and real-world results. For example, the error rates for the winners of the <a href=\"http://image-net.org/challenges/LSVRC/2015/\">ImageNet challenge</a> — a popular machine vision contest — were in the 20–30% range prior to the use of deep learning. 
Using deep learning, the accuracy of the winning algorithms has steadily improved, and in 2015 surpassed human performance.</p>\n<p>Many of the papers, <a href=\"https://code.google.com/archive/p/word2vec/\">data</a> <a href=\"http://image-net.org/download-images\">sets</a>, and <a href=\"https://www.tensorflow.org/\">software</a> <a href=\"http://deeplearning.net/software/theano/\">tools</a> related to deep learning have been open sourced. This has had a democratizing effect, allowing individuals and small organizations to build powerful applications. WhatsApp was able to build a global messaging system that <a href=\"http://www.wired.com/2015/09/whatsapp-serves-900-million-users-50-engineers/\">served 900M users with just 50 engineers</a>, compared to the thousands of engineers that were needed for prior generations of messaging systems. This “<a href=\"https://twitter.com/cdixon/status/473221599189954562\">WhatsApp effect</a>” is now happening in AI. Software tools like <a href=\"http://deeplearning.net/software/theano/\">Theano</a> and <a href=\"https://www.tensorflow.org/\">TensorFlow</a>, combined with cloud data centers for training, and inexpensive GPUs for deployment, allow small teams of engineers to build state-of-the-art AI systems.</p>\n<p>For example, here a <a href=\"http://tinyclouds.org/colorize/\">solo programmer</a> working on a side project used TensorFlow to colorize black-and-white photos:</p>\n<p><img src=\"images/1_L6cT-HQMC-mc34kJ450pdA.png\" alt=\"Left: black and white. Middle: automatically colorized. Right: true color. source: http://tinyclouds.org/colorize/\"></p>\n<p>And here a small startup created a real-time object classifier:</p>\n<p><img src=\"images/1_cAtej8oZh2u80cii--YgTw.gif\" alt=\"Teradeep real-time object classifier, source: https://www.youtube.com/watch?v=_wXHR-lad-Q \"></p>\n<p>Which of course is reminiscent of a famous scene from a sci-fi movie:</p>\n<p><img src=\"images/1_wiG-xc456HpdBkRTQi84Eg.gif\" alt=\"The Terminator (1984), source: https://www.youtube.com/watch?v=YvRb9jZ9wFk\"></p>\n<p>One of the first applications of deep learning released by a big tech company is the search function in Google Photos, which is <a href=\"http://gizmodo.com/google-photos-hands-on-so-good-im-creeped-out-1707566376\">shockingly</a> smart.</p>\n<p><img src=\"images/1_N1K_Wv2M-QDMF7FeOmJfcw.gif\" alt=\"User searches photos (w/o metadata) for “big ben”\"></p>\n<p>We’ll soon see significant upgrades to the intelligence of all sorts of products, including: voice assistants, search engines, <a href=\"http://www.wired.com/2015/08/how-facebook-m-works/\">chat bots</a>, 3D <a href=\"https://www.google.com/atap/project-tango/\">scanners</a>, language translators, automobiles, drones, medical imaging systems, and much more.</p>\n<blockquote>\n<p>The business plans of the next 10,000 startups are easy to forecast: Take X and add AI. This is a big deal, and now it’s here. — <a href=\"http://www.wired.com/2014/10/future-of-artificial-intelligence/\">Kevin Kelly</a></p>\n</blockquote>\n<p>Startups building AI products will need to stay laser focused on specific applications to compete against the big tech companies who have made AI a top priority. AI systems get better as more data is collected, which means it’s possible to create a virtuous flywheel of <a href=\"http://mattturck.com/2016/01/04/the-power-of-data-network-effects/\">data network effects</a> (more users → more data → better products → more users). 
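</p>\n<p>A toy simulation makes the flywheel explicit. Every constant below is invented for illustration, and nothing here comes from the post; the point is only the shape of the loop:</p>\n<pre><code># Toy model of a data network effect (all constants are made up).\n# More users generate more data; more data improves the product;\n# a better product attracts more users.\nusers, data, quality = 10_000, 0.0, 0.50\n\nfor month in range(12):\n    data += users * 30  # each user contributes ~30 data points a month\n    quality = min(0.99, 0.50 + 0.05 * (data / 1e6))\n    users = int(users * (1 + quality / 10))  # better product, faster growth\n    print(month, users, round(quality, 3))\n</code></pre>\n<p>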
The mapping startup Waze <a href=\"https://digit.hbs.org/submission/waze-generating-better-maps-through-its-network-of-users/\">used</a> data network effects to produce better maps than its vastly better capitalized competitors. Successful AI startups will follow a <a href=\"/2015/02/01/the-ai-startup-idea-maze/\">similar</a> strategy.</p>\n<h2>Software + hardware: the new computers</h2>\n<p>There are a variety of new computing platforms currently in the gestation phase that will soon get much better — and possibly enter the growth phase — as they incorporate recent advances in hardware and software. Although they are designed and packaged very differently, they share a common theme: they give us new and augmented abilities by embedding a smart virtualization layer on top of the world. Here is a brief overview of some of the new platforms:</p>\n<p><strong>Cars</strong>. Big tech companies like Google, Apple, Uber, and Tesla are investing significant resources in autonomous cars. Semi-autonomous cars like the Tesla Model S are already publicly available and will improve quickly. Full autonomy will take longer but is probably not more than 5 years away. There already exist fully autonomous cars that are almost as good as human drivers. However, for cultural and regulatory reasons, fully autonomous cars will likely need to be significantly better than human drivers before they are widely permitted.</p>\n<p><img src=\"images/1_nJjPHXo_qBtzvoH8OLx9hQ.gif\" alt=\"Autonomous car mapping its environment\"></p>\n<p>Expect to see a lot more investment in autonomous cars. In addition to the big tech companies, the big auto makers <a href=\"http://www.cnet.com/roadshow/news/gm-new-team-electric-autonomous-cars/\">are</a> <a href=\"http://spectrum.ieee.org/automaton/robotics/industrial-robots/toyota-to-invest-1-billion-in-ai-and-robotics-rd\">starting</a> <a href=\"https://media.ford.com/content/fordmedia/fna/us/en/news/2016/01/05/ford-tripling-autonomous-vehicle-development-fleet--accelerating.html\">to</a> take autonomy very seriously. You’ll even see some interesting products made by startups. Deep learning software tools have gotten so good that a <a href=\"http://www.bloomberg.com/features/2015-george-hotz-self-driving-car/\">solo programmer</a> was able to make a semi-autonomous car:</p>\n<p><img src=\"images/1_z442b_u8RfSqBEyI-1AkxQ.gif\" alt=\"Homebrew self-driving car, source: https://www.youtube.com/watch?v=KTrgRYa2wbI\"></p>\n<p><strong>Drones</strong>. Today’s consumer drones contain modern hardware (mostly smartphone components plus mechanical parts), but relatively simple software. In the near future, we’ll see drones that incorporate advanced computer vision and other AI to make them safer, easier to pilot, and more useful. Recreational videography will continue to be popular, but there will also be important <a href=\"http://www.airware.com\">commercial</a> use cases. There are tens of millions of <a href=\"http://www.psmag.com/politics-and-law/cell-tower-climbers-die-78374\">dangerous</a> jobs that involve climbing buildings, towers, and other structures that can be performed much more safely and effectively using drones.</p>\n<p><img src=\"images/1_N7SlK3WKwkfZ6v50JFLkCg.gif\" alt=\"Fully autonomous drone flight. source: https://www.youtube.com/watch?v=rYhPDn48-Sg\"></p>\n<p><strong>Internet of Things</strong>. The obvious use cases for IoT devices are energy savings, security, and convenience. 
<a href=\"https://nest.com/thermostat/meet-nest-thermostat/\">Nest</a> and <a href=\"https://nest.com/camera/meet-nest-cam/\">Dropcam</a> are popular examples of the first two categories. One of the most interesting products in the convenience category is Amazon’s <a href=\"http://www.amazon.com/Amazon-SK705DI-Echo/dp/B00X4WHP5E\">Echo</a>.</p>\n<p><img src=\"images/1_bsxhmUfI-7biIF-dW8a80w.png\" alt=\"Three main use cases for IoT\"></p>\n<p>Most people think Echo is a gimmick until they try it, and then they are <a href=\"http://qz.com/611026/amazon-echo-is-a-sleeper-hit-and-the-rest-of-america-is-about-find-out-about-it-for-the-first-time/\">surprised</a> at how useful it is. It’s a great <a href=\"https://500ish.com/alexa-5f7924bffcf3#.iou9jsaj4\">demo</a> of how effective always-on voice can be as a user interface. It will be a while before we have bots with generalized intelligence that can carry on full conversations. But, as Echo shows, voice can succeed today in constrained contexts. Language understanding should improve quickly as recent breakthroughs in deep learning make their way into production devices.</p>\n<p>IoT will also be adopted in business contexts. For example, devices with sensors and network connections are extremely <a href=\"https://www.samsara.com/\">useful</a> for monitoring industrial equipment.</p>\n<p><strong>Wearables.</strong> Today’s wearable computers are constrained along multiple dimensions, including battery, communications, and processing. The ones that have succeeded have focused on narrow applications like fitness monitoring. As hardware components continue to improve, wearables will support rich applications the way smartphones do, unlocking a wide range of new use cases. As with IoT, voice will probably be the main user interface.</p>\n<p><img src=\"images/1__4r-bIpz7jWMYiLnxKFCJQ.gif\" alt=\"Wearable, super intelligent AI earpiece in the movie “Her”\"></p>\n<p><strong>Virtual Reality.</strong> 2016 is an exciting year for VR: the launch of the <a href=\"https://www.oculus.com/en-us/rift/\">Oculus Rift</a> and HTC/Valve <a href=\"https://www.htcvive.com/us/\">Vive</a> (and, possibly, the Sony PlayStation VR) means that comfortable and immersive VR systems will finally be publicly available. VR systems need to be really good to avoid the “<a href=\"https://en.wikipedia.org/wiki/Uncanny_valley\">uncanny valley</a>” trap. Proper VR requires special screens (high resolution, high refresh rate, low persistence), powerful graphics cards, and the ability to track the precise position of the user (previously released VR systems could only track the rotation of the user’s head). This year, the public will for the first time get to experience what is known as “<a href=\"http://a16z.com/2015/01/22/virtual-reality/\">presence</a>” — when your senses are sufficiently tricked that you feel fully transported into the virtual world.</p>\n<p><img src=\"images/1_bcHvjQwlLxyORwjHFH87Qg.gif\" alt=\"Oculus Rift Toybox demo\"></p>\n<p>VR headsets will continue to improve and get more affordable. 
Major areas of research will include: 1) new tools for creating rendered and/or <a href=\"https://www.lytro.com/\">filmed</a> VR content, 2) machine vision for <a href=\"http://venturebeat.com/2016/02/08/oculus-vr-guru-john-carmack-leads-crucial-position-tracking-development-for-mobile-vr/\">tracking</a> and scanning directly from phones and headsets, and 3) distributed back-end <a href=\"/2015/03/24/improbable-enabling-the-development-of-large-scale-simulated-worlds/\">systems</a> for hosting large <a href=\"https://twitter.com/cdixon/status/662836035508940800\">virtual environments</a>.</p>\n<p><img src=\"images/1_Fv9_4fCAOHoEA3dxjMf2jw.gif\" alt=\"3D world creation in room-scale VR\"></p>\n<p><strong>Augmented Reality</strong>. AR will likely arrive after VR because AR requires most of what VR requires plus additional new technologies. For example, AR requires advanced, low-latency machine vision in order to convincingly combine real and virtual objects in the same interactive scene.</p>\n<p><img src=\"images/1_HpWBUZD_kKAoTa2yuxqnTQ.jpeg\" alt=\"Real and virtual combined (from Kingsman: The Secret Service)\"></p>\n<p>That said, AR is probably coming sooner than you think. This demo video was shot directly through <a href=\"http://www.magicleap.com/#/home\">Magic Leap’s</a> AR device:</p>\n<p><img src=\"images/1_7jbz4N1GZTFm0wDzDEmQ1Q.gif\" alt=\"Magic Leap demo: real environment, virtual character\"></p>\n<h2>What’s next?</h2>\n<p>It is possible that the pattern of 10–15 year computing cycles has ended and mobile is the final era. It is also possible the next era won’t arrive for a while, or that only a subset of the new computing categories discussed above will end up being important.</p>\n<p>I tend to think we are on the cusp of not one but multiple new eras. The “peace dividend of the smartphone war” created a Cambrian explosion of new devices, and developments in software, especially AI, will make those devices smart and useful. Many of the futuristic technologies discussed above exist today, and will be broadly accessible in the near future.</p>\n<p>Observers have noted that many of these new devices are in their “<a href=\"http://www.nytimes.com/2016/01/07/technology/on-display-at-ces-tech-ideas-in-their-awkward-adolescence.html?_r=0\">awkward adolescence</a>.” That is because they are in their gestation phase. Like PCs in the 70s, the internet in the 80s, and smartphones in the early 2000s, we are seeing pieces of a future that isn’t quite here. But the future is coming: markets go up and down, and excitement ebbs and flows, but computing technology marches steadily forward.</p>",
"image": null,
"media": [],
"authors": [],
"categories": []
}
]
}