SITALWeek #383
Welcome to Stuff I Thought About Last Week, a personal collection of topics on tech, innovation, science, the digital economic transition, the finance industry, and whatever else made me think last week.
Click HERE to SIGN UP for SITALWeek’s Sunday Email.
In today’s post: robots unloading trucks as humans stand by; off-the-shelf parts continue to fuel the consumerization of military weapons; ChatGPT has reached 100M monthly users as LLMs branch out into new sectors; the value in paying content creators; the regulatory risk of data clean rooms; the simple demographic forces of the next two decades; and, a deeper look at creativity in the AI Age.
Stuff about Innovation and Technology
Stretch ArmBot
Boston Dynamics' Stretch robot is being used by DHL to unload trucks. The pick-and-place robot arm has a vacuum suction “hand” that quickly unloads boxes under the supervision of humans. The marketing video released by Boston Dynamics is filled with DHL workers praising their new helper. One worker anthropomorphizes the robot, describing how it “reacts” when it drops a box. There is some dissonance in knowing that some of the workers in the video will eventually lose their jobs to tools like Stretch. While there is much talk of advancements in AI for pure software applications, the combination of AI and robotics is likely to yield innovations that are just as significant. Regardless of what your job is, you are (or soon will be) training an AI/robot to replace or assist you – knowingly or not.
Weaponization of Consumer Electronics
Garmin’s GPS receiver isn’t designed for military use, but it’s widely used in mass-market drones deployed for military purposes, along with several other chips and components never intended to fuel foreign adversaries’ arsenals. Thanks to such off-the-shelf parts, Turkey’s TB2 drone costs around $5M, far less than the US military’s $28M Predator drone. The consumerization of military weapons is something governments and companies need to collaboratively short-circuit. I wrote about this important and easily solvable issue in more detail in Chip-Fueled War.
Prime Physicians
AI functions well as a sort of pre-filter, or first step, before handing off a more complex task to human experts. Last week, I got an email advertising a new healthcare service called Amazon Clinic. While I haven’t tried it yet, it appears to use a chatbot front end to ask questions and gather information regarding a number of health issues for review by a clinician, who can then design a treatment plan and write prescriptions – no visit or video call with the doctor required. As I discussed in DoctorGPT, Google’s Med-PaLM model can give correct medical answers more than 90% of the time – matching humans – which suggests it won’t be long before the clinician graduates to AI supervisor.
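To make that pre-filter idea concrete, here is a minimal, hypothetical sketch (it does not reflect how Amazon Clinic or Med-PaLM are actually built, and classify_intake is a stand-in for a real model call): the AI summarizes and triages the intake answers, anything uncertain or flagged gets routed straight to a human clinician, and routine cases still end with clinician sign-off.

```python
# Hypothetical sketch of an AI "pre-filter" for clinical intake.
# classify_intake() is a placeholder for a real LLM call (e.g., a Med-PaLM-style model).
from dataclasses import dataclass, field

@dataclass
class Triage:
    summary: str
    confidence: float  # model's self-reported confidence, 0-1
    red_flags: list = field(default_factory=list)  # symptoms that always need a human

def classify_intake(answers: dict) -> Triage:
    """Stand-in for the model call; returns a canned result for illustration."""
    return Triage(summary="mild seasonal allergies, no fever", confidence=0.97)

def route(answers: dict) -> str:
    t = classify_intake(answers)
    if t.red_flags or t.confidence < 0.9:
        return f"Escalate to clinician immediately: {t.summary}"
    return f"Draft plan for clinician sign-off: {t.summary}"

print(route({"symptoms": "sneezing, itchy eyes", "duration_days": 5}))
```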
Polyglot LLMs
I’ve previously described AI tools like ChatGPT as translators that allow us to instantly speak new languages – e.g., we can now have a conversation with data. This advance is possible because LLMs function by taking something (e.g., numbers, images, or structures) and representing it in words. This Ars Technica article is a great overview of the history and technology of transformer models. Generative AI tools also allow us to think in languages unique to particular fields of study. Biology is an especially ripe field for this application, with much current research seeking to discover proteins with novel or enhanced functionality, e.g., to treat various diseases or decompose plastics. The models have to learn to decode and then recreate the functional language of proteins’ complex 3D structures, which dictate their activities. One such example is ProGen, created by Salesforce’s AI research division. Google’s DeepMind is also seeing rapid adoption of AlphaFold (also based on a transformer model, like ProGen), with this site showcasing many of the novel research uses.

By far the biggest use case for LLMs remains ChatGPT, which set a record for the fastest-growing app ever, reaching an estimated 100M monthly users within two months of its public debut. Microsoft is set to integrate the next-gen version, GPT-4, with Bing search, as well as embed ChatGPT within Microsoft Teams for note-taking and meeting recaps. OpenAI founder Sam Altman discussed the potential for ChatGPT to eclipse search – and for artificial general intelligence (AGI) to “break capitalism”. I’ve become accustomed to writing SITALWeek with a ChatGPT window open. I can have several conversations going at once, and the productivity boost is impressive. I can pick conversations back up where I left off, with the same context, for whatever topic I am working on. In a short period of time, it has become hard to remember how I functioned without ChatGPT.
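To put the “conversation with data” idea above into code, here is a minimal sketch – the complete() argument is a placeholder for whatever LLM completion API you happen to use (I’m deliberately not reproducing any specific vendor’s interface). The only trick is the translation step: the table is flattened into plain words so the model can reason about it.

```python
def table_to_text(rows):
    """Represent tabular data in plain words -- the 'translation into language' step."""
    return "\n".join(", ".join(f"{k}: {v}" for k, v in row.items()) for row in rows)

def ask_about_data(rows, question, complete):
    """complete() is a placeholder for any LLM text-completion call."""
    prompt = (
        "Here is a table, one record per line:\n"
        f"{table_to_text(rows)}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return complete(prompt)

# Example usage with a dummy model standing in for the real thing:
rows = [{"region": "EMEA", "revenue": 120}, {"region": "APAC", "revenue": 95}]
print(ask_about_data(rows, "Which region had higher revenue?", complete=lambda p: "(model's answer)"))
```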
Platform-Creator Tension
In #357, I discussed the importance of non-zero-sumness as it relates to social networks. I argued that YouTube, which has long shared revenues with its creators, has an advantage in attracting content creators, thereby increasing content value and platform engagement – and insulating itself from disruption by competitors. Recently, top YouTube creator MrBeast echoed similar comments on Lex Fridman’s podcast. The FT reports on the rapid rise of YouTube Shorts to challenge TikTok and Instagram’s copycat Reels product. Meta has long been opposed to paying creators on its platforms, treating user-generated content as a commodity to be exploited solely for the gain of Zuckerberg and his shareholders. The Information details Meta’s internal debate as to whether or not it should actually pay people for the content, without which Meta would be worthless. Former COO Sheryl Sandberg was apparently the biggest opponent of paying the people who power Meta. To not value the content on your app is to not understand what creates a sustainable, multi-sided platform. The right move today is to compete to pay creators the most. Careful readers may scent some hypocrisy here, given that I have previously argued that content is becoming an infinite commodity. However, these concepts are not mutually exclusive, as we can reward creators based on the value of their content, allowing the gems that delight viewers and power revenues to emerge from the sea of flotsam.
Dirty Clean Rooms
Companies are increasingly using algorithm-driven tools to price their products. Last fall, I wrote about the negative externalities of this practice when applied to the apartment rental market. Recently, several Las Vegas Strip hotels have been accused of price fixing through their common usage of one pricing service provider. Separate from these tools – which seemingly enable behind-the-scenes collusion by having broad access to data across multiple competitors – companies themselves are increasingly utilizing cloud-based “clean room” databases to anonymously share data. It’s been a rising trend in the ad industry as Apple and various regulators crack down on user tracking. Microsoft is rumored to be working on a clean room product for its Azure customers to share data with each other, according to The Information, following similar efforts by AWS and Google Cloud. The story notes that the new Microsoft service “will let two or more entities search the data, apply machine-learning models to the pooled data and get results back, all without being able to see the other parties’ data.” However, AI can in some cases successfully de-anonymize data to identify specific people. You can imagine scenarios where such data are used by insurance companies or financial institutions to price risk in a supposedly anonymous way, while actually linking customers to their personal information. These clean-room machine-learning products likely merit a closer look from regulators, because there are no frameworks governing how data can be used collectively without the risk of collusion and/or discriminatory pricing for products and services.
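For a rough sense of what a clean room promises – and where the risk creeps in – here is a toy sketch (purely illustrative, bearing no resemblance to the actual Azure, AWS, or Google Cloud products): queries against the pooled data only return aggregates over cohorts above a minimum size, because small cohorts are exactly where AI-assisted re-identification becomes feasible.

```python
# Toy illustration of the clean-room guarantee: two parties pool records, but
# queries may only return aggregates over sufficiently large cohorts, so
# neither party ever sees the other's raw rows.
MIN_COHORT = 50

def clean_room_count(party_a_rows, party_b_rows, predicate):
    pooled = list(party_a_rows) + list(party_b_rows)
    matches = sum(1 for row in pooled if predicate(row))
    if matches < MIN_COHORT:
        raise ValueError("cohort too small to release without re-identification risk")
    return matches

# Example: count shared customers aged 65+ without exposing either dataset.
a = [{"age": 70, "zip": "89109"}] * 40
b = [{"age": 68, "zip": "89109"}] * 30
print(clean_room_count(a, b, lambda r: r["age"] >= 65))  # 70 -> releasable
```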
Miscellaneous Stuff
The journey may be weary
But I'll keep marching on
With a fire in my soul
And the will to carry on
–From a ChatGPT prompt: “Write a song lyric in the style of Nick Cave about moving on to the next hard task”
Longtime readers know that I am a huge Nick Cave fan, and I am also fond of quoting from his email Q&A with fans called The Red Hand Files. A couple of weeks ago, Cave responded to multiple inquiries about ChatGPT’s ability to write song lyrics in his style. His reaction was visceral: “this song is bullshit, a grotesque mockery of what it is to be human, and, well, I don’t much like it — although, hang on!, rereading it, there is a line in there that speaks to me — ‘I’ve got the fire of hell in my eyes’ — says the song ‘in the style of Nick Cave’, and that’s kind of true. I have got the fire of hell in my eyes – and it’s ChatGPT.”
Hopefully Cave doesn’t learn about the new MusicLM from Google, which can create an accompanying musical track based on a text description. Cave describes songwriting (and perhaps many other creative endeavors) as “the breathless confrontation with one’s vulnerability, one’s perilousness, one’s smallness, pitted against a sense of sudden shocking discovery; it is the redemptive artistic act that stirs the heart of the listener, where the listener recognizes in the inner workings of the song their own blood, their own struggle, their own suffering. This is what we humble humans can offer, that AI can only mimic, the transcendent journey of the artist that forever grapples with his or her own shortcomings. This is where human genius resides, deeply embedded within, yet reaching beyond, those limitations.”
Now, I am about to say something that will be very unpopular (possibly triggering a few of the hate-unsubscribes that take place every so often here at SITALWeek): I thought the ChatGPT song lyrics, in totality, were not that easy to tell apart from Cave’s recent work without careful study of their contents (the tipoff is that the ChatGPT lyrics make a little bit too much sense and leave little open to interpretation; for similar reasons, the lyric snippet from ChatGPT I posted to start this section is recognizable as an AI fabrication). I feel a little regret saying that AI could replicate Nick Cave, so let me explain. I think the following two statements are both true: 1) Cave is a genius, and 2) Cave’s work can sometimes feel derivative of itself, making it possible for AI to crack it and translate it into new songs. This is not a criticism of Cave — his brand of lyricism is what makes his songs so powerful. However, this question of mimicry is at the heart of the point I have been making about AI for some time now: AI can shine a spotlight on what humans do that is no longer singularly the domain of humans. Rather than rage against the machine in a fit of defensive anger (which likely masks a subconscious knowledge that AI can indeed replace us for most of what we do), another option is to step back and endeavor to see what we might do next that AI won’t crack so quickly. Perhaps it’s a fool’s errand given the pace of progress in AI, but I think it’s worth a shot.

I am reminded of a quote from Kevin Kelly I posted in #372: “Instead of fearing AI, we are better served thinking about what it teaches us. And the most important thing AI image generators teach us is this: Creativity is not some supernatural force. It is something that can be synthesized, amplified, and manipulated. It turns out that we didn’t need to achieve intelligence in order to hatch creativity. Creativity is more elemental than we thought. It is independent of consciousness. We can generate creativity in something as dumb as a deep learning neural net. Massive data plus pattern recognition algorithms seems sufficient to engineer a process that will surprise and aid us without ceasing...For the first time in history, humans can conjure up everyday acts of creativity on demand, in real time, at scale, for cheap. Synthetic creativity is a commodity now. Ancient philosophers will turn in their graves, but it turns out that to make creativity—to generate something new—all you need is the right code.”
I would be on the edge of my seat if Nick and Warren’s next album were a collaboration with AI. At the same time, I appreciate the role of the skeptical artist-curmudgeon. There is room for both paths in this world, but one of them may prove to be far more creative and interesting, while showcasing humans’ incredible adaptability as individuals and as a species. As I mentioned above, I’ve begun to rely on ChatGPT when writing SITALWeek. While ChatGPT is not yet writing the text (as far as you know), I keep several conversations open as I write. Last week’s popular essay on how to ask better questions was uniquely from my own crazy human brain (as far as I know), but I did rely on ChatGPT to refresh my memory on Socrates and the Sophists. I even asked it to have a Socratic-style debate with me, which was delightful. The point is that we need to quickly adapt to using these powerful new tools because they can push us to take our creativity to an entirely new level. When I reflect on Cave’s commentary on songwriting, he seems to place the most value on the emotional (and physical) difficulty of the journey – to be a good song, it must metaphorically contain your blood, sweat, and tears. However, just because something gets easier doesn’t mean it loses its intrinsic value. Rather, it means we can more readily advance to the next challenging creative endeavor – one that likewise requires blood, sweat, and tears – which will allow us to express and defend our humanness.
Stuff about Demographics, the Economy, and the Finance Industry
Demographically Shrinking Taxpayer Base
This recap of demographics from EPB Research has some great charts showing the key concepts everyone should be focused on for the economy over the next one to two decades. The combination of an aging population and a declining workforce sets up a wide-ranging series of potential tugs-of-war between deflationary consumption trends (declining consumption as the population ages) and inflationary labor trends (fewer workers), which will come into and out of importance sector by sector. When you look at the Stretch robot with which I opened this week’s newsletter, it’s fairly easy to see deflation winning out long term. One wild card: as the working-age population shrinks as a percentage of the total, it creates a big drag on the tax payments that fund governments and retirees. If robots and AI are also increasingly taking over white- and blue-collar jobs, it could necessitate a robot “payroll” tax to fund the government.
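As a back-of-the-envelope illustration of that tax drag (the inputs below are made up for the arithmetic and are not projections): if the working-age share of the population falls from roughly 62% to 55% while per-capita government spending edges up, the tax take required per worker rises by about a quarter.

```python
# Illustrative arithmetic only -- made-up inputs, not demographic projections.
def tax_per_worker(population, working_share, spending_per_capita):
    workers = population * working_share
    total_spending = population * spending_per_capita
    return total_spending / workers

today = tax_per_worker(population=100, working_share=0.62, spending_per_capita=10)
later = tax_per_worker(population=100, working_share=0.55, spending_per_capita=11)
print(f"required tax per worker rises ~{later / today - 1:.0%}")  # ~24%
```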
✌️-Brad
Disclaimers:
The content of this newsletter is my personal opinion as of the date published and is subject to change without notice and may not reflect the opinion of NZS Capital, LLC. This newsletter is an informal gathering of topics I’ve recently read and thought about. I will sometimes state things in the newsletter that contradict my own views in order to provoke debate. Often I try to make jokes, and they aren’t very funny – sorry.
I may include links to third-party websites as a convenience, and the inclusion of such links does not imply any endorsement, approval, investigation, verification or monitoring by NZS Capital, LLC. If you choose to visit the linked sites, you do so at your own risk, and you will be subject to such sites' terms of use and privacy policies, over which NZS Capital, LLC has no control. In no event will NZS Capital, LLC be responsible for any information or content within the linked sites or your use of the linked sites.
Nothing in this newsletter should be construed as investment advice. The information contained herein is only as current as of the date indicated and may be superseded by subsequent market events or for other reasons. There is no guarantee that the information supplied is accurate, complete, or timely. Past performance is not a guarantee of future results.
Investing involves risk, including the possible loss of principal and fluctuation of value. Nothing contained in this newsletter is an offer to sell or solicit any investment services or securities. Initial Public Offerings (IPOs) are highly speculative investments and may be subject to lower liquidity and greater volatility. Special risks associated with IPOs include limited operating history, unseasoned trading, high turnover and non-repeatable performance.