“On one hand, information wants to be expensive, because it’s so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two fighting against each other.”
– Stewart Brand (quoted in Steven Levy, Hackers, p. 360)
Debates over AI have dominated the first half of 2023. The US Congress held hearings, and regulation has been demanded from many quarters. This kind of furor is nothing new; we have been debating similar regulation of social media for years.
The technology community has often reacted to these debates with a mixture of fear and derision. If legislators who don’t understand the difference between Twitter and Wi-Fi were to write regulations for social media or AI algorithms, the results would likely be hilarious, and also harmful to innovation.
The speed of politics is also out of sync with the speed of technological change. Technology moves fast, while most political systems are slow by design. It is hard to imagine regulations that aren’t outdated before they are even discussed in a congressional committee, much less implemented.
Even enforcing existing regulations is a challenge for bureaucracies and law enforcement. Microsoft was punished for Internet Explorer long after the questions that prompted the original antitrust lawsuit had become irrelevant.
It’s important to step back from technology and ask more fundamental questions about what’s really going on here and where the roots of potential abuses lie. We live in a complex world with lots of moving pieces and rapidly shifting environments. It’s often difficult to see the core issues through the confusing noise. Ironically, it is here that an AI designed around information legitimacy could be transformative.
Transparency should be a cornerstone of all technology regulation. A recurring theme I hear in the AI debate is that we don’t know what’s going on. Some of that opacity is internal to the technology itself (which doesn’t make it undiscoverable), but much of it is hidden behind a veil of corporate secrecy.
In the AI world, almost no one understands how ChatGPT gets from a query to an answer, because OpenAI locks almost all of that process in a proprietary box. In the social media world, the same is true of the algorithms that bias newsfeeds on Facebook and Twitter.
My dog is smart enough to understand that if something interesting is going on underneath a blanket, the solution is to pull the blanket off. It’s high time we pulled the blanket off these processes.
Ignorance is a way to make money. If you don’t know that you can get the same product down the street for less, I can sell it to you at a higher profit to myself. The digital age has threatened this tried-and-true practice: when you can pull up all that information on your computer or phone, my profit potential shrinks.
Over the last 30 years, economic actors have had to adjust to this reality. The solutions have ranged from adding layers of complexity to a product, making it difficult to compare with other products, to creating proprietary black boxes that conceal some sort of “secret sauce.”
Much of today’s industry, and I include the tech industry in this, operates behind a hall of mirrors as a way of protecting profit. Complexity has replaced scarcity as a profit screen. The practice of deception hasn’t changed.
As those of you who have read my work know, I am a vigorous proponent of technology as a creativity enabler. When I spend half my time dodging unseen obstacles in various platforms as I try to create, I waste valuable creative time.
Imagine being a regulator trying to untangle the code that drives these processes. AI could help cut through that complexity; finding patterns and connections is exactly what it is good at.
As I discussed in a previous blog, AI is a powerful tool, but it needs to be supported by open systems in order to fulfill its potential. Openness, in turn, is key to keeping AI systems firmly in check.
Those who say that innovation will be crippled if we create open systems haven’t read the literature on what drives innovation. Generative AI is also showing that hiding things is a fool’s errand, as AI crawlers find their way into more and more systems.
There are plenty of ways to generate profit from open systems. As systems become more complex, even open ones, people will need help to make the most of the technology they use. That is a far deeper well of profit, because it is tied to productivity and growth rather than to a temporary state of ignorance.
Once you lock something up, it stagnates. You may profit at first from having a technology no one else has, but hiding it is not a sustainable business model.
Transparency is evergreen. You don’t have to make special regulations for this technology or that technology. You just insist that all technologies and businesses follow clear and open rules.
The regulator’s job becomes much simpler and more focused when it is targeted at opening doors to public scrutiny. Societies can enforce existing regulations far more easily when people can see what is going on. Transparency smashes the hall of mirrors.
We need to use the technologies that currently exist and are being developed to police the technologies of the future. For instance, we can design AIs to look for patterns of suspicious activity in technological systems.
I’m not talking about policing the users. I’m talking about policing the algorithms. AIs could investigate companies suspected of nefarious practices.
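To make that idea concrete, here is a minimal sketch of what automated algorithm auditing could look like. It is purely illustrative and built on assumptions: query_model stands in for whatever access a regulator (or its AI) has to a company’s black-box scoring system, and the matched-pair probing with a fixed threshold is the simplest possible pattern check, not a real auditing framework.

```python
import statistics

def audit_black_box(query_model, probe_pairs, threshold=0.10):
    """Probe a black-box scoring system with matched input pairs that differ
    only in the attribute under audit, and flag systematic output gaps.

    query_model -- hypothetical callable: takes an input record, returns a score
    probe_pairs -- list of (record_a, record_b) pairs, identical except for
                   the audited attribute
    threshold   -- assumed average gap that triggers a flag (not a standard value)
    """
    gaps = [query_model(a) - query_model(b) for a, b in probe_pairs]
    mean_gap = statistics.mean(gaps)
    return {"mean_gap": mean_gap, "flagged": abs(mean_gap) > threshold}

# A toy "algorithm" that quietly favors one group of inputs.
def toy_model(record):
    base = 0.5 + 0.01 * record["engagement"]
    return base + (0.2 if record["group"] == "A" else 0.0)

pairs = [
    ({"group": "A", "engagement": e}, {"group": "B", "engagement": e})
    for e in range(10)
]
print(audit_black_box(toy_model, pairs))  # flags the hidden bias
```

A real audit would need statistical rigor, far more probes, and the legal right of access that transparency rules would provide. The point of the sketch is simply that once inputs and outputs can be inspected, suspicious patterns become machine-detectable.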
These kinds of investigations could gradually reshape a destructive paradigm of exploitation if they are part of a culture of public watchfulness and are legally protected. This runs against current business culture.
Transparency represents a paradigm shift for American business. However, in the long run, it is a more profitable strategy for success.
This strategy of technology development gives open societies a competitive advantage over economies that refuse to follow suit. If we’re worried about AI competition from China, the best way to win that competition is to leverage advantages that closed societies find difficult to replicate. This is how we won the Cold War. This is how we can win going forward.
Transparency also represents a rallying cause for proponents of effective regulation. Right now, it’s easy, or at least it seems to be easy, for people to understand what they’re against, but very difficult to understand what they are for.
As a student of systems thinking, I understand how this challenges certain paradigms. Those paradigms are already being challenged by the march of technology.
Stewart Brand’s quote that leads off this blog is not inaccurate. Just because information wants to be free doesn’t mean it is.
We need to stop making a political prisoner out of information, or a revolution will occur in which that prisoner wreaks revenge on an unprepared society. It is humans, and the systems they create, that pose the real danger in a world of technological amplification.
Open is progress. Closed stagnates. Let’s choose the open path.
This is the first in a series of blogs on the power of transparency in technology.