
Edge Computing Covered and Diced
What is edge computing, and what does it mean for data workloads, latency, and our precious, precious bandwidth? Red Hat CTO Chris Wright speaks with Azhar Sayeed, Red Hat Senior Director, Global Telco Technical Development, about 5G and how edge is like an onion. And Red Hat Senior Director of Technology Strategy Nick Barcet calls in from the high seas to chit-chat about what edge means for the future. Join us as we peel into the topic and cover, dice (and smother) edge computing.
Transcript
00:00 - Azhar Sayeed
I mean, while 4G can handle up to 4,000 devices per kilometer, with 5G, it's supposed to be insanely large. Almost up to a million devices per kilometer. Up to 20 gig per second per user. (connection breaking up)
00:15 - Chris Wright
Azhar, Azhar, you're breaking up. Looks like we lost Azhar. Today this seems like the norm. It seems like our networks are never fast enough, but many of us actually take our high-speed data for granted. And it wasn't so long ago that we were still dialing up at a blazing 56 kbps, and our favorite streaming or ride-share apps just weren't possible. Our data workloads are exploding, and as we move our storage and compute out further to the edge, we face the next wave of innovation that will have a profound impact on our lives. But what is edge? And why is it so exciting?
00:53 - INTRO ANIMATION
01:04 - Chris Wright
Edge is a distributed computing architecture that brings compute closer to the things and the people that produce and use data. This proximity to data can allow for the real-time processing and decision-making that's going to fundamentally change how businesses and consumers use and interact with data. When you bring compute closer to data, you inherently cut down on latency. (typing) Latency is the time it takes for data to travel from node to node. It depends on the physical distance that data must travel, through cords and networks and the like, to reach its destination. Or, from a user-experience perspective, latency is perceived as the time delay between stimulus and response when compared to a real-world equivalent. (computer dinging) Oh, hey, Azhar, welcome back.
01:51 - Azhar Sayeed
Sorry about that, Chris. Don't know what happened there.
01:54 - Chris Wright
It's cool. It got me on the topic of latency and edge compute.
01:58 - Azhar Sayeed
Oh, do you want me to explain the onion?
02:01 - Chris Wright
You just have that at your desk?
02:04 - Azhar Sayeed
Well, actually, it's a good metaphor. It helps us visualize the edge architecture for telcos. As we peel each layer off this particular onion, we find the next layer, and then the next; each layer is another edge.
02:15 - Chris Wright
Totally. You know, we often describe the edge like an onion, but actually, I have an animation that might be a bit easier. (animation begins) In a traditional network, compute was centralized, with the data center at the core, but with faster network infrastructure and growing workloads, we need to move computing capacity out of data centers and closer to where the action is. Each tier of the network has layers of edge: the enterprise edge, the provider edge, the end-user premises edge. The network has an edge and a set of network-specific applications that power it. Edge means different things to different people. (door knocking) And ultimately...
02:49 - Azhar's Son
Dad, the Wi-Fi's not working.
02:51 - Azhar Sayeed
How many times have I told you, it's not our network? Sorry, Chris. He plays Fortnite, and every time there's an issue, he blames our network for it.
03:03 - Chris Wright
Of course.
03:04 - Azhar's Son
Dad.
03:06 - Azhar Sayeed
I've told you about the onion. Sorry, Chris, I need to run.
03:09 - Chris Wright
No worries, got it. Good luck.
03:11 - Azhar Sayeed
Thanks.
03:21 - Nick Barcet
Hi Chris, what's up?
03:22 - Chris Wright
One of the things I've noticed about edge computing is that it gets closer to the core of our customers' businesses, so the use cases, I think, are more industry-specific. I just wanted to check in with you. What kind of use cases are you seeing?
03:37 - Nick Barcet
We are seeing a wide variety of use cases. There are some industrial use cases where we need to do predictive maintenance on some equipment or where we need to enforce worker safety or we have health use cases where we help with diagnosis.
03:57 - Chris Wright
Or working on a common platform, that kind of view of the world, where data and probably even AI and machine learning become some of the common tools, but solving very different problems.
04:09 - Nick Barcet
What's really important here is the fact that we are generating more data than we can flow directly to a data center. You know how, when you're building a website, we say that if the page takes more than two seconds to load, people are going to get discouraged? Here we are talking about processes that may get discouraged after a few milliseconds. So if we need fast response times, we cannot afford to send the data to a central data center and wait for the answer to come back. We also have a problem of scale. What was possible for the past 10 years, when we were accumulating data centrally, is very, very quickly becoming impossible to maintain.
04:53 - Chris Wright
I think of edge computing as distributed systems on steroids. I mean, we're really talking about large-scale distributed computing.
05:01 - Nick Barcet
We cannot scale the number of humans linearly with the number of sites we are deploying onto, so we need a lot of automation to cover for that. Interoperability is linked to the fact that each layer of our edge may have different kinds of hardware: hardware that you own, hardware that you rent, hardware you have physical access to, hardware you will never, ever see. The administrator and the developer shouldn't have to worry about where an application is going to be deployed. It should work the same way. It should behave the same way. It's just the infrastructure layer that is going to make that invisible for them.
05:45 - Chris Wright
To me, that really shouts out extending the hybrid cloud from data centers on premises out to public clouds, and then including these edge deployments, or thinking of that as an edge cloud. Really extending what the hybrid cloud is capable of. That consistency, interoperability, and scalability are, to me, really core. I really enjoyed your article, and I got something out of it, so that's why I wanted to give you a call and hear more directly what your thoughts were. So I appreciate it, thank you.
06:19 - Nick Barcet
Thanks for reaching out, Chris. Always a pleasure.
06:24 - Chris Wright
Think about just how quickly things have advanced. 4G services only became widely available in the US in 2012, less than 10 years ago. 5G will provide higher bandwidth, ultra-low latency, and hundreds of new IoT devices. The influx of data from these devices drives the need for edge computing. The 5G and edge compute transformation is just getting started. In another 10 years, when we're onto the next wave of innovation, what will we be taking for granted?
06:55 - OUTRO ANIMATION
About the show
Technically Speaking
What's next for enterprise IT? No one has all the answers, but CTO Chris Wright knows the tech experts and industry leaders who are working on them.
