The Cranky Admin
Artificial Intelligence-as-a-Service and the Bad Guys
The future of bulk data computational analysis tools and the eternal struggle of good vs. evil.
As Artificial Intelligence (AI), machine learning and Big Data analytics mature, these technologies will inevitably be used more widely. Increased adoption brings demands for oversight, restriction, regulation and bans. All of this moral outrage is not only completely pointless, it's shockingly dangerous.
Anyone today can get access to powerful AI tools as easily as they would access any other public cloud service. IBM's Watson is available as a service, Amazon is ramping up its machine learning offerings, and Microsoft has some powerful Big Data pushes of its own.
Even without bringing in Google -- which is already well on its way toward creating Skynet -- there are some pretty powerful tools available to the everyman today. And today is still early days when it comes to all of the above technologies. Next-generation security companies are collectively throwing uncountable hours at similar efforts. Governments and academics are devoting unimaginable resources to projects both benign and terrible.
Denial Is Not Just a River
Those who demand regulation and restriction of bulk data computational analysis technologies seem to be under the misapprehension that prohibition works. I can think of no instance in human history where prohibition has ever worked, but I could spend months listing examples where it has gone horribly wrong.
The bulk data computational analysis genie is out of the bottle; and like fire, the combustion engine or atomic energy before it, there is absolutely no way to stuff it back in. This simple reality -- that you can't defeat an idea -- is why attempts at manufacturing outrage about the capabilities of our evil AI-wielding overlords are dangerous.
Put simply: prohibition only ever serves to frustrate and punish law-abiding individuals. As an example, I present the failure of the entertainment industry regarding DRM. Or the farce that is the war on drugs. Those seeking to take advantage of or harm others won't be dissuaded by laws, regardless of how much hand-wringing went into their creation.
More to the point, cracking down on the use of bulk data computational analysis tools deprives the good guys of the very tools they'll need. You can't defeat an idea, and bulk data computational analysis is a very useful tool for the bad guys. They are using it. They will continue to use it, now and forever, and there is absolutely nothing any of us can do to stop them.
Shields Up
Once we've collectively made the requisite mental leap to accept that bulk data computational analysis isn't ever going away, we need to start developing defenses against it. Sadly, as with most things where humans are involved, social progress -- like science -- advances one funeral at a time.
We probably won't collectively be ready to admit that the war against bulk data computational analysis is unwinnable until most of the people alive today have died. Our children (or perhaps their children), who will have grown up never knowing a world without bulk data computational analysis, will be the first to collectively admit that prohibition is impossible. This places those developing the next 50 years of IT, social technologies and even the physical-world technologies that prevent you from being tracked or identified by automated systems in a very unfortunate position.
As the calls for prohibition get louder and louder, those working in the security sector will increasingly be operating in a grey area. Their livelihoods -- and in some cases, perhaps even their lives -- will be under threat. Already we see the beginnings of this as white and grey hat hackers are denied entry to countries hosting security conferences, or even arrested at the border.
The practical effects of the above have yet to manifest themselves. In the next decade we will place organizations of all sizes in the awkward position of being legally required to defend our data against all attacks, while making it illegal for them to obtain -- let alone use -- the tools or expertise required to actually do so.
Falling Behind
If history repeats -- as it usually does -- then research into prohibited technologies will quickly become next to impossible. Consider as an example the difficulties that have surrounded any attempt at scientific study of psychedelics for medical purposes.
Because of their use in the '60s and '70s by the counterculture movement as recreational drugs, getting permission to do even the most basic research into psychedelics has been stymied by bureaucratic nonsense for decades. Recent studies into both psilocybin and LSD show that there may well be clinical benefits to these substances at dosages below recreational levels. Unfortunately, even with cautiously optimistic preliminary data indicating that further research is justified, getting permission to do so isn't easy.
When trials are approved, the researchers and their institutions come under immediate, vicious and unrelenting attack. Because psychedelics are the devil, it's important to stamp out any hint that they could under any circumstances be used for anything good. This typically involves tirades in certain media outlets that rely on circular reasoning about the illegality of the substance.
In any generation, there is a large segment of humanity that simply isn't fond of evidence-based legislation.
The problem with stigmatizing bulk data computational analysis technologies to the point of prohibition and triggering a generation or two of armchair crusades is quite simple: while we're busy fighting a war against the perpetually outraged, the capabilities of the defenders will fall farther and farther behind. Unlike conventional warfare, there are no nuclear weapons in the digital theatre. There is no mutually assured destruction anywhere on the horizon.
We are locked in a perpetual cold war against an unknown number of aggressors with unknown capabilities, unknown agendas and unknown support infrastructure. We know that some of our opponents are factions of our own governments. We know that many of our opponents are sponsored by other governments, major corporations and organized crime. Trust is a rare and exceptionally valuable commodity, and the battlefield shifts almost daily.
Prepare for the Dark Ages
Unfortunately, it's highly unlikely that we'll be able to avoid a period of prohibition around bulk data computational analysis technologies. It's as silly to deny the lessons of history regarding human nature as it is for the denialists to try to put the genie back in the bottle.
In response, we must start to alter the design of our networks and our applications. The day is coming soon when we will be forced by law to treat data from one group of individuals differently from another. We may be banned from using bulk data computational analysis technologies to protect the data of individuals from one nation, with strict enforcement.
We may be banned from using those same technologies to protect data of individuals from another nation, but with carefully absent enforcement because another set of laws demands that we, in fact, employ them. Still other nations will be more progressive and require that companies use the latest security technologies available, regardless of whether their use has been banned elsewhere.
This is not some far-off future we're talking about; I suspect we'll see the first examples of this dichotomy appear as a practical problem facing global organizations before the end of 2020. Given the continuing integration and digitization of the economies around the world, we may all be in the thick of it by 2025.
That's not a long time. Changing application and infrastructure designs to allow for segregation of data processing and storage based on the origin of the data subject is difficult and time consuming. It's time to start now. By the time organizations start receiving multi-million dollar fines for not simultaneously obeying and disobeying prohibitions on bulk data computational analysis technologies, it will be too late.
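To make the shape of that problem concrete, here's a minimal sketch in Python of what per-jurisdiction segregation might look like at the application layer. Everything in it -- the policy table, the region names, the analysis flag -- is a hypothetical illustration, not a prescription; the actual rules would come from legal review, not from a developer's dictionary.

```python
# Illustrative sketch only: a per-jurisdiction policy deciding where a data
# subject's records may be stored and whether automated (bulk) analysis may
# be applied to them. All names and values here are hypothetical.
from dataclasses import dataclass

@dataclass
class JurisdictionPolicy:
    storage_region: str        # where the data must physically reside
    allow_bulk_analysis: bool  # may automated analysis tools touch it?

# Hypothetical policy table; real entries depend on whatever each
# jurisdiction ends up mandating or prohibiting.
POLICIES = {
    "EU": JurisdictionPolicy(storage_region="eu-west", allow_bulk_analysis=False),
    "US": JurisdictionPolicy(storage_region="us-east", allow_bulk_analysis=True),
}
DEFAULT_POLICY = JurisdictionPolicy(storage_region="neutral", allow_bulk_analysis=False)

@dataclass
class DataSubjectRecord:
    subject_id: str
    jurisdiction: str
    payload: dict

def route_record(record: DataSubjectRecord) -> str:
    """Pick a storage region based on the data subject's origin."""
    return POLICIES.get(record.jurisdiction, DEFAULT_POLICY).storage_region

def may_analyze(record: DataSubjectRecord) -> bool:
    """Check whether bulk analysis is permitted for this record."""
    return POLICIES.get(record.jurisdiction, DEFAULT_POLICY).allow_bulk_analysis

if __name__ == "__main__":
    rec = DataSubjectRecord("42", "EU", {"name": "example"})
    print(route_record(rec))   # -> "eu-west"
    print(may_analyze(rec))    # -> False
```

The point isn't the code; it's that origin-based routing and analysis gating have to be designed in from the start, because retrofitting them across an established data estate is exactly the kind of multi-year effort described above.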
About the Author
Trevor Pott is a full-time nerd from Edmonton, Alberta, Canada. He splits his time between systems administration, technology writing, and consulting. As a consultant he helps Silicon Valley startups better understand systems administrators and how to sell to them.