The Cranky Admin

Edge Computing and Big Brother

They're watching you.

According to some of our industry's prognosticators, the real innovation of the 2020s won't occur in the cloud; it will occur at the edge. The edge is, however, merely an extension of the cloud: the time-sensitive, proximate extrusion of the cloud into our daily lives.

The cloud isn't what's going to enable the super-creepy Minority Report future. The edge is. Let's consider the identification and tracking of an individual within a retail store as a great example of how edge computing and cloud computing will likely work together to handle decision-making in tiers.

The Store
When you first walk into the store, a high-resolution camera captures your face. This image is sent to the local facial recognition datacenter, which does nothing more than scan the face for a match against existing customers of that retail store. If the customer is known, a greeting is emitted ("Hello, Mr. Yakamoto!").
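
What might that site-local match look like in code? A minimal sketch, assuming a hypothetical model has already turned the camera frame into a face embedding; every name here is invented for illustration:

import numpy as np

# Hypothetical local registry of known customers: name -> face embedding.
# A real deployment would use a vector database, not a dict.
KNOWN_CUSTOMERS: dict[str, np.ndarray] = {}

MATCH_THRESHOLD = 0.92  # cosine-similarity cutoff; the tuning is guesswork


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def greet_if_known(face_embedding: np.ndarray) -> str | None:
    """Match against known customers only; return a greeting or nothing."""
    best_name, best_score = None, 0.0
    for name, known in KNOWN_CUSTOMERS.items():
        score = cosine_similarity(face_embedding, known)
        if score > best_score:
            best_name, best_score = name, score
    if best_name and best_score >= MATCH_THRESHOLD:
        return f"Hello, {best_name}!"
    return None  # unknown face: no greeting, and no grey-area lookup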

This all has to happen with relatively low latency, as you want to greet the customer before they're fully in the store. The match is only run against customers the store already knows, partly because of privacy regulations and partly because lookups against the semi-legal, grey-area databases are expensive. Those databases are huge, and they require resources that don't exist locally.

The image is also submitted via API to the cloud services of both the security and insurance companies, both of which will look up the face against all known databases and alert the authorities if the individual is deemed suspicious. They have special dispensation to access the larger face registers and a legal obligation to report anyone who looks out of place.
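
From the store system's side, that submission could be nothing fancier than a pair of best-effort HTTP POSTs. A sketch, with entirely invented endpoints:

import requests

# Invented endpoints; the real security and insurance APIs would differ.
WATCHLIST_APIS = [
    "https://api.security-example.com/v1/face-check",
    "https://api.insurer-example.com/v1/face-check",
]


def submit_for_screening(image_jpeg: bytes, store_id: str) -> None:
    """Fire and forget; alerting the authorities is the provider's job."""
    for url in WATCHLIST_APIS:
        try:
            requests.post(
                url,
                files={"image": ("face.jpg", image_jpeg, "image/jpeg")},
                data={"store_id": store_id},
                timeout=2,  # a slow cloud call must never block the door
            )
        except requests.RequestException:
            pass  # screening is best-effort from the store's point of view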

Meanwhile, as you take your next few steps into the store, a series of low-resolution cameras track your position. Infrared cameras and millimeter-wave scanners check you for signs of aggression or depression, for weapons, and so forth.

A local edge node installed within the store handles the simple tasks of tracking your movement throughout the store using the data from the low-resolution cameras. It streams metadata about your movements back to the cloud, where it’s stored in a data warehouse for later analysis.
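
The node's real job here is data reduction: turning a torrent of camera frames into a trickle of position events. A sketch of what that metadata might look like, with invented field names:

import json
import sys
import time
from dataclasses import dataclass, asdict


@dataclass
class PositionEvent:
    """One movement observation, boiled down from low-resolution frames."""
    visitor_id: str  # a session-scoped ID, not the customer's identity
    zone: str        # e.g. "aisle-3" or "endcap-b"
    x: float
    y: float
    timestamp: float


def stream_to_warehouse(events, sink) -> None:
    """Ship movement metadata upstream as JSON lines. `sink` stands in
    for whatever ingestion endpoint the vendor actually provides."""
    for event in events:
        sink.write(json.dumps(asdict(event)) + "\n")


# Example: buffer locally, flush upstream every few seconds.
buffer = [PositionEvent("visitor-42", "aisle-3", 12.4, 3.1, time.time())]
stream_to_warehouse(buffer, sys.stdout)  # stdout stands in for the cloud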

When the cost of CPU time dips below a pre-set limit, a batch job will run that calculates which displays drew your attention the quickest, which held it the longest, and which ultimately drove you to purchase. This data will be compared against that of other customers to see if there are any similarities based on age, income, gender, race, or other identifying characteristics that drive purchases of certain things or cause dismissal of others. The data will then be pseudonymized and sold back to the manufacturers of the products in question.
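
The trigger is the interesting bit: none of this analysis is urgent, so it can sit and wait for compute to get cheap. A sketch, assuming a hypothetical current_spot_price() feed from the cloud provider:

import time

PRICE_LIMIT = 0.03  # dollars per CPU-hour; an arbitrary pre-set limit


def current_spot_price() -> float:
    """Hypothetical feed of the provider's spot/preemptible pricing."""
    raise NotImplementedError


def run_attention_analytics() -> None:
    """Placeholder for the dwell-time and purchase-correlation job."""
    ...


def wait_for_cheap_compute(poll_seconds: int = 300) -> None:
    # Nothing here is latency-sensitive, so polling is good enough.
    while current_spot_price() > PRICE_LIMIT:
        time.sleep(poll_seconds)
    run_attention_analytics()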

Sensors will monitor you for electromagnetic signals, and these will be analyzed by the site-local edge node. Did you receive a text message? Did you answer it? Was there a data burst that looked like an instant message, or perhaps a phone call?

If you speak, audio sensors will monitor the words you say, listening for keywords. Because it's illegal to actually record your conversations, only occurrences of certain keywords will be included in the metadata. Eye-tracking sensors built into displays will track which individual items catch your eye. This too will be reduced to metadata by the site-local edge node, and all of it will be streamed in real time to the city-local edge datacenter for latency-sensitive processing.
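
Keyword spotting without recording is a design constraint, not magic: transcribe on the node, keep the hits, and throw everything else away. A sketch, with an invented transcribe function standing in for an on-device speech model and an invented watch list:

KEYWORDS = {"sale", "expensive", "allergy", "return"}  # invented watch list


def transcribe(audio_chunk: bytes) -> str:
    """Stand-in for an on-device speech-to-text model."""
    raise NotImplementedError


def keyword_metadata(audio_chunk: bytes, visitor_id: str) -> dict:
    """Reduce speech to keyword counts. The raw audio and the full
    transcript never leave this function; that is the compliance story."""
    words = transcribe(audio_chunk).lower().split()
    hits = {kw: words.count(kw) for kw in KEYWORDS if kw in words}
    return {"visitor_id": visitor_id, "keyword_hits": hits}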

Advertising displays within your field of vision will respond to what you look at, displaying sale prices or offering personalized incentives to purchase items that appeared to interest you. The success or failure of these offerings will be recorded. Emotion-tracking systems will identify your interest in, or disgust at, the offers presented, and the metadata from that will also be streamed to the cloud for later analysis.
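
On the display side, the loop might be no more complicated than: gaze event in, offer out, impression logged. Another sketch, invented names throughout:

import time


def pick_offer(visitor_id: str, item_sku: str) -> str:
    """Hypothetical pricing engine: choose an incentive for this visitor."""
    return f"10% off {item_sku} for the next 5 minutes"


def on_gaze_event(visitor_id: str, item_sku: str, display) -> dict:
    offer = pick_offer(visitor_id, item_sku)
    display.show(offer)  # `display` is whatever drives the ad panel
    # Success or failure (a purchase? the emotion tracker's verdict?) is
    # joined against this record later, in the cloud.
    return {
        "visitor_id": visitor_id,
        "sku": item_sku,
        "offer": offer,
        "shown_at": time.time(),
    }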

The Hybrid Edge
The retail store is absolutely bristling with purpose-built sensors, all of which do something very complicated to track you, or some aspect of you. Some of these sensors will have basic compute capabilities built into the device. Some will simply stream raw data back to an edge node installed onsite at the retail location.

Some rare sensors (such as the initial facial recognition) will require access to compute or storage capacity beyond what could be reasonably installed on site and will call out to proximate but still relatively small edge datacenters. Metadata from both tiers of edge device -- the site-local node and the city-local datacenter -- will be shipped up to the large public cloud datacenters for time-insensitive analytics.
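
The tiering rule the whole story rests on fits in a dozen lines. A sketch of the routing logic, with thresholds pulled out of thin air, not anyone's actual scheduler:

def pick_tier(latency_budget_ms: float, needs_big_data: bool) -> str:
    """Route a workload to the cheapest tier that can satisfy it."""
    if latency_budget_ms < 50 and not needs_big_data:
        return "site-local node"        # movement tracking, keyword spotting
    if latency_budget_ms < 250:
        return "city-local datacenter"  # facial recognition against big DBs
    return "public cloud"               # batch analytics, the data warehouse


assert pick_tier(20, False) == "site-local node"
assert pick_tier(100, True) == "city-local datacenter"
assert pick_tier(10_000, False) == "public cloud"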

From the retail store's point of view, none of this matters. They simply signed up for a service. A site-local node was shipped to them along with some sensors. All their latest product displays came with other sensors. They installed a wireless gateway near the till, plugged the edge node into the router in the back, and everything they need to know is delivered to the store's tablet app.

This is the edge in practice: tracking you through retail stores, hospitals, schools and prisons. Watching what you eat and what you buy, and making sure that there is compute power close enough to you to react to what you are doing in real time.

Welcome to the future.

About the Author

Trevor Pott is a full-time nerd from Edmonton, Alberta, Canada. He splits his time between systems administration, technology writing, and consulting. As a consultant he helps Silicon Valley startups better understand systems administrators and how to sell to them.
