
February 4, 2021

Hitching a Ride to the Edge with Akamai

(leolintang/Shutterstock)

The edge, according to marketers, is a magical place where nearly anything is possible. But for Akamai Technologies, the $18-billion company that put content delivery networks (CDNs) on the map during the dot-com boom, the rise of edge computing and 5G networks marks a significant shift in how data and business logic will be distributed and consumed in the future.

Datanami’s recent conversation with Ari Weil, Akamai’s global VP of product and industry marketing, started innocently enough with a question that seemed perfectly reasonable at the time: What is the edge? Based on Weil’s answer, the edge actually may be a can of worms.

“Well, I’ll say honestly, it depends on who you ask,” Weil admits. “There’s multiple edges, which confuses the hell out of people.”

A retail bank, for example, may consider its headquarters to be its core. But if it serves customers there, then it functions as an edge location. Cell phone operators say their 5G cell stations are part of their edge. HPE and Dell are building edge servers and mini-data centers, while cloud providers are calling any small data center that’s not part of their cloud the edge.

(From an etymological standpoint, we’re just grateful Weil didn’t mention men’s shaving products, Microsoft browsers, or English-born musicians in Irish rock bands.)

For the sake of discussion, Weil settles on this definition: “The edge is where you get as close as possible to wherever something is consuming information or data or applying a decision.”

Thankfully, that meshes quite well with our preconceived notions of edge-ness. And while this definition may sound like a rather pedestrian thing, it, in fact, has large implications for how companies will build information systems going forward.

An Early Edge

Akamai’s Santa Clara office is an edge location (Tada-Images/Shutterstock)

Akamai has been building edge solutions for over 20 years, even if it wasn’t broadly called “edge” back then. (It turns out, Akamai co-founders Dan Lewin and F. Thomson Leighton did call what they were doing “edge computing” in the late 1990s and early 2000s, Weil says, but nobody was paying attention.)

Akamai’s early edge was a global network consisting of tens of thousands of servers and storage repositories pre-cached with customers’ static content, thereby ensuring fast downloads for consumers around the globe. Today, Akamai’s edge includes nearly a third of a million servers and sits within one network hop of 85% of the world’s Internet users. It delivers 100 terabits per second of bandwidth and functions, in some ways, like a parallel Internet, offloading the repetitious movement of redundant bytes.

The modern edge is a new twist on that CDN of old, Weil says.

“When CDNs got started, everything we did was from inside out,” Weil says. “You were taking stuff from the data center or a cloud location and you were just making copies of it and then holding it closer for users to consume. We do that with static objects or images or video files or what have you.

“What’s changed is now we’re saying we want to have the ability for the edge to make decisions,” he continues. “So it’s not just holding data. Now we’re taking logic and shifting it out there.”

Edgy Use Cases

Akamai’s edge today leverages that massive infrastructure to put clients’ applications in very close proximity to their customers. “You have to be massively distributed, and, let’s say, within five miles of the consumer to claim any sort of true edge,” Weil says. “Otherwise, you’re a regional data center.”

Akamai helps automakers deliver firmware to cars on the go (SP-Photo/Shutterstock)

Not every application should be moved to the edge, obviously. Training a large machine learning model, for example, is probably best done in the cloud. But any workloads that benefit from lower latencies and near-instantaneous customer interaction–including machine learning inference and other forms of decision-making–can gain advantages by moving to the edge.

The connected car is the classic example of the power of the edge, Weil says. “We work with major automotive makers for things like firmware updates, sending out critical patches,” he says. “We can make sure the software downloads actually make it to the vehicle.”

Another use case involves reducing fraud in event ticketing. Crafty concert-goers figured out that, by making multiple copies of a single valid ticket and entering through different entrances, a whole group could get in on one ticket. By linking the ticket scanners to the Akamai edge, the concert host can detect those duplicate copies. “Last year was not the best for that use case,” Weil adds.
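To make the mechanics concrete, here is a minimal sketch of duplicate-scan detection, assuming a shared lookup that every gate scanner can reach. It is illustrative only, not Akamai's API; a plain Python dict stands in for an edge key-value store, and the ticket IDs and gate names are hypothetical.

# A minimal sketch of duplicate-ticket detection. A plain dict stands in for a
# shared edge store that all gate scanners can query; IDs and gates are made up.
import time

seen_tickets = {}  # ticket_id -> (gate, timestamp) of the first accepted scan

def scan_ticket(ticket_id: str, gate: str) -> bool:
    """Admit the ticket on first scan; reject any later copy of the same ID."""
    if ticket_id in seen_tickets:
        first_gate, first_time = seen_tickets[ticket_id]
        print(f"Rejected {ticket_id} at {gate}: already scanned at {first_gate} "
              f"{time.time() - first_time:.1f}s earlier")
        return False
    seen_tickets[ticket_id] = (gate, time.time())
    return True

print(scan_ticket("TICKET-001", "north-gate"))   # True: first scan is admitted
print(scan_ticket("TICKET-001", "south-gate"))   # False: the copy at another entrance is caught

The point of running the check at the edge rather than at a central server is that the lookup happens within a network hop of each scanner, so the gate does not stall waiting on a distant database.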

Similarly, airline passengers can be frustrated when flight information differs between mobile apps, airport kiosks, and the gates themselves. By pushing out updates over a standard protocol (MQTT), Akamai can make sure passengers are getting consistent information. “There are a couple of cool use cases like that that you can’t do with normal Web-based applications and protocols,” he says.
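A small sketch of what such a push might look like, assuming a hypothetical MQTT broker, topic name, and payload shape; this is illustrative rather than Akamai's actual service, and it uses the paho-mqtt Python package.

import json
import paho.mqtt.publish as publish  # pip install paho-mqtt

update = {
    "flight": "XY123",
    "gate": "B42",
    "status": "DELAYED",
    "departure": "2021-02-04T18:35:00Z",
}

# retain=True keeps the latest message on the broker, so a kiosk or phone app
# that connects later still receives the current status immediately.
publish.single(
    topic="flights/XY123/status",
    payload=json.dumps(update),
    qos=1,
    retain=True,
    hostname="broker.example.com",  # hypothetical broker address
)

Every subscriber to flights/XY123/status, whether a mobile app, an airport kiosk, or a gate display, receives the same payload, which is how a publish-subscribe protocol keeps the information consistent across all of those surfaces.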

Cloud Fifth?

Many companies say they’re taking a “cloud first” strategy to their IT systems. Especially during COVID-19, this seemed like a smart thing to say. But as the latency advantages of 5G connectivity expand the edge’s surface area, those companies may want to rethink that cloud-first approach.

Cloud-first is being replaced with edge-first (amgun/Shutterstock)

Analysts calculate that edge spending will grow at a compound annual growth rate of around 25% to 35% over the next few years, ending up as a $20 billion market. Weil thinks that number is too small, because it only factors in novel edge workloads, not how existing workloads will adapt to the new technological reality.

“Between now and 2024, most businesses are coming to grips with what they should have in the cloud and what they should have in the edge, and they’re establishing reference architectures,” Weil says. “Right now, there is no standard for edge computing, because people are figuring it out.”

Organizations that invest in microservices architectures and API-enable their applications will be able to push consumer-facing business logic out to the edge. As 5G comes online, it will become feasible for companies to pre-cache large amounts of data, enabling very rapid decision-making in the field.

“Instead of going back and forth to my centralized database in the cloud, I’ll take a big result set in JSON or XML and I’ll have it out at the edge and actually use edge computing logic to just query that file,” Weil says. “That way, I don’t have to go all the way back [to the cloud]. It’s a lot less expensive, I don’t incur the network transit, and my user ends up having a better experience because I was basically able to cut out the speed-of-light limitations that I had before.”
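A minimal sketch of that pattern, using a hypothetical inventory result set and field names; plain Python is used purely for illustration and is not Akamai's edge API.

import json

# A result set fetched once from the cloud and cached at the edge node.
cached_result_set = json.loads("""
[
  {"sku": "A-100", "store": "SJC", "in_stock": 12},
  {"sku": "A-100", "store": "LAX", "in_stock": 0},
  {"sku": "B-200", "store": "SJC", "in_stock": 3}
]
""")

def query_edge_cache(sku: str, store: str):
    """Answer a lookup from the cached file, with no round trip to the cloud."""
    for row in cached_result_set:
        if row["sku"] == sku and row["store"] == store:
            return row
    return None  # a real deployment might fall back to the cloud on a miss

print(query_edge_cache("A-100", "SJC"))  # {'sku': 'A-100', 'store': 'SJC', 'in_stock': 12}

The trade-off is freshness: the cached result set has to be refreshed when the source data changes, which is exactly the kind of invalidation problem CDNs have been handling for two decades.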

Coupled with big data and machine learning advancements, the edge will open up a whole new world of instant personalization in the real world. Not every use case needs the kind of latency reduction Akamai is seeing on its platform, from half a second down to 5 milliseconds. But that drop in latency (along with an increase in resiliency) is driving companies across nearly all industries to look for use cases that can leverage this emerging capability.

For Akamai, which has already made the investment in edge computing, the potential is vast. But it’s dependent to some extent on getting that 5G signal in the air.

“It’s always been that problem of the middle mile, where they just haven’t been able to get as much through,” Weil says. “We have huge capacity through our edge network–more than people can take advantage of for years. 5G is finally going to allow businesses to take advantage of that capacity.”

Related Items:

2021 Predictions from the Edge and IoT

Openness a Big Advantage as Edge Grows, IBM Says

Exploring Artificial Intelligence at the Edge

 

 
