Dissecting the Future of Edge Investment and Use Cases for Mass Adoption
Understanding how to reduce management costs and answering the other big questions around the future of edge deployment and growth
On November 2, 2022, five experts in innovation sat down to share innovative use cases of edge computing and examine their trajectory across different sectors.
The discussion, led by Eric Conn, Managing Director of IoT For All, was part of The AI Summit & IoT World Austin event (now Applied Intelligence Live! Austin) and was intended to show why the edge is integral to AI and IoT.
Showcasing the challenges and considerations of implementing edge computing in manufacturing and retail systems, this 40-minute discussion highlights the importance of aggregating and integrating data from the plant floor, emphasizing the need for secure data transmission and network infrastructure. It also touches on managing the ongoing shift of IT responsibilities from on-site personnel to the cloud, and the resulting concerns about managing and maintaining edge devices in the absence of specialized skills.
Watch it on-demand now.
Thanks to our panelists:
- Dan Whitacre, Senior Director, Kroger Labs & Technology Transformation, The Kroger Co.
- Jeff Dymond, Global CTO, Edge & IoT Partner & Industries Office, Dell Technologies
- Jason McCampbell, Director of Architecture, Wallaroo
- Gloryvee Cordero, Senior Architect, CVS Health
I have a bunch of very esteemed panelists today. Let them introduce themselves. I'll start with Gloryvee, and I guess I'll just pass it down.
Hi, can you guys hear me okay? Hi, I'm Gloryvee Cordero, and I'm from CVS Health.
Hi, my name is Jason McCampbell. I'm director of Architecture at Wallaroo AI.
Hi, I'm Dan Whitacre. I'm director of Kroger Labs.
Hi, Jeff Dymond, CTO of manufacturing from Dell Technologies.
All right, we're very honored to have this great group of panelists. We're gonna get edgy today. So, the first thing we're going to do is let everyone get a stab at this first question, which is: why edge, and why now? So Gloryvee, wanna start that?
Well, I think that, you know, edge is at its infancy for where we're going and what we're doing. Why now? Because it's a perfect time. We really do need to elevate how we aggregate data and the maturity of the data being processed for IoT, and for a lot of what is being done within systems, internal operations, you name it. Edge has the power to push everything we're doing forward and get it ready for things like quantum and other things that we do believe are going to be very impactful in this industry.
So why edge, to me, is: well, the edge is the real world, right? And we want to be able to bring the insights that AI and ML models can offer to where we are and where our machines are. And there are actually a lot of examples of edge AI already out there. Look at how many voice assistants we have thrown around our homes. I think the why-now piece is that we have conquered this first set of deployments, and now we're looking to take on some new challenges. And by that, I mean if we look at how the virtual assistants and voice assistants are built, it's usually a small model running on device, and if it matches a key phrase, something like "hey, whatever," we stream the audio up to the cloud. That works well if it's my kids trying to take some questionable phrases and translate them into every language they can think of; if it takes a couple of seconds to get an answer back, who cares? But there are a lot of problems where we need latencies not in seconds, but in tens of milliseconds or single-digit milliseconds. So I think what we're starting to see is a lot more opportunity and deployment options: things like on-chip AI accelerators, so we can run more complex models on device, and micro data centers and cloudlets, so that instead of having to deploy a server locally, we can take advantage of some of the on-demand compute capabilities from the cloud, but in a much tighter latency radius. So, you know, I think with a lot more options we'll build a lot more things, and I'm looking forward to seeing all the sorts of applications that all of you, and we as an industry, can build. Why not?
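The two-stage voice-assistant pattern Jason describes, a tiny always-on matcher on device that gates what gets streamed to the cloud, can be sketched roughly as below. The wake phrase, function names, and the list standing in for a cloud stream are all hypothetical, not any vendor's actual API:

```python
# Sketch of a two-stage edge pipeline: a cheap on-device check runs on
# everything; only matching audio is sent upstream for heavy processing.

WAKE_PHRASE = "hey assistant"  # assumed wake phrase for illustration

def on_device_match(transcript: str) -> bool:
    """Cheap local check that runs on every utterance."""
    return transcript.lower().startswith(WAKE_PHRASE)

def handle_audio(transcript: str, cloud_queue: list) -> bool:
    """Only stream audio upstream when the local matcher fires."""
    if on_device_match(transcript):
        cloud_queue.append(transcript)  # stand-in for streaming to the cloud
        return True
    return False

queue = []
handle_audio("hey assistant what's the weather", queue)  # streamed
handle_audio("random living-room chatter", queue)        # stays local
```

The point of the split is exactly the latency/bandwidth trade-off discussed here: the small model keeps most data on device, and only the rare match pays the round-trip cost.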
Exactly. That's it, I think he had it all for everybody. Because I think the biggest takeaway is that latency has always been a concern when you're processing so much data, and this is allowing us to really push the envelope with technology and take it to that next level that we need.
So, more than just why not: the industry is highly competitive, almost every industry you are in. That's true right now, and it's all fast-paced, so you really have to think about how fast you're going to work. We have an old saying that the competitive advantage belongs to the person who can observe data and then act intelligently on that data as fast as they observe it. So processing at the edge, collecting data fast, making decisions fast, is key to everything you're doing, especially in retail.
Everything I'll say is related to manufacturing. So one of the reasons for edge, and this is my opinion, is that, as much as I love clouds and all that clouds do for us, I think we over-rotated a bit in manufacturing. I think we thought the cloud was going to solve world hunger, and it didn't. But more importantly, I think there's more data coming off the factory floor now, and the automation systems and databases that do different things. For instance, when I was a young engineer, when the books closed once a quarter, we could tell how well we did that quarter. Then it got to be once a month, then once a week. Now it's once a day, but really, we want to make it once a shift. So there's more data coming off the plant floor, but what do I do with it? We always talk about data and context per persona: not all data is created the same, and not everyone needs to look at or consume the same type of data. So the edge is a place where those workloads can just stay in the plant to run the plant. There are places for the cloud: take the answer to the cloud, especially if you're doing macro AI and ML, things like that, and execute the algorithm at the edge but maybe create the model in the cloud. So edge is very important. I think it's mainly to be more, I hate using marketing terms, but I'm gonna use one here, agile. We have to make our products faster and better than we did before. I tell the story that there aren't any fewer trees in the world, and you can argue the environmental stuff, and there aren't any fewer pulp and paper mills in the world, but we're standing in line at the grocery store for paper products. Why? Because supply chains broke. And part of that is getting data off the plant floor so we can make better decisions throughout our workday.
All right, great wisdom from this panel here. So another reason that edge, I think, is important is the monetary aspect, right? And I'll sort of hand it back to Jeff to talk about that, and why it doesn't even make sense economically to send all of your data to the cloud, because it'll cost you a fortune, and there's the latency on top of it, right? So maybe, Jeff, you can dig in a little more on the manufacturing side, what you've seen at Dell and through some of your customers' eyes.
Yeah, I think when you start talking about latency, one of the things you first have to think about is that at the edge, you're not controlling anything in the plant, right? You're not stopping and starting a pump, you're not opening and closing valves, you're not turning the gas on a fired heater. You're running analytics about the production, right, in support of the supply chain. And so when you start talking about latency, if you send all that pressure, temperature, level, flow, vibration, and acoustics data to the cloud, that's pretty inexpensive. But when you run the analytics in the cloud and try to get your own data back in a form you can make sense of, that's where they make money, right? That's what costs you a lot of money. So a lot of our customers are finding out latency is one thing; I can architect around a lot of that latency, unless you're doing, you know, the self-driving-car type of example. In manufacturing, yeah, it's a little bit of latency going to the cloud, but it's really about the monetary reasons: how much money it costs you to get your data back out of the cloud. And customers in the manufacturing world are finding out that that cloud bill could be profit; it goes to the bottom line. I still love clouds, there is a place for them. I'm just saying maybe we need to look at data, because it's not all created equal. Some data is only good for 10 minutes; I don't need it after that. Some data I need to keep forever, especially if it's for a regulatory reason. But why send it to the cloud if I only need it for 10 minutes to make a business decision about how to run my plant? A financial reason.
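Jeff's "not all data is created equal" point, some data is good for 10 minutes, some must be kept forever, amounts to a simple retention-based routing policy. A minimal sketch, with made-up retention classes and sensor names, might look like this:

```python
# Route each reading by retention class: short-lived operational data stays
# at the edge with a TTL; long-lived (e.g. regulatory) data goes to the cloud.
# Class names, TTL, and sensor IDs are illustrative assumptions.

EDGE_TTL_SECONDS = 10 * 60  # keep operational data 10 minutes, then drop it

def route(reading: dict) -> tuple:
    """Return (destination, ttl_seconds); None means keep indefinitely."""
    if reading["retention"] == "regulatory":
        return ("cloud", None)          # must be kept forever, ship it out
    return ("edge", EDGE_TTL_SECONDS)   # only needed briefly, keep it local

readings = [
    {"sensor": "pump-3-vibration", "retention": "operational"},
    {"sensor": "emissions-monitor", "retention": "regulatory"},
]
destinations = [route(r)[0] for r in readings]
```

In practice the routing table would be richer (per-persona views, compliance rules), but the economic argument above is exactly this: don't pay cloud egress for data you'll discard in 10 minutes.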
Yeah, and maybe Dan, you can follow up on how Kroger is thinking about edge and cloud. What kinds of workloads, what types of use cases make sense in the setting you work in for edge versus, you know, when you would stream all this data to the cloud?
Okay, well, to Jeff's point, the important thing you have to think about is latency and data and how close it is. It's really dependent on the use case, right? If I don't have to act on it till tomorrow, I don't need to keep it next to the edge; I can keep it in the cloud. If I don't need it after tomorrow, do I really want to keep it? Maybe, maybe not, and so forth. But if you look through a manufacturing facility, there are all kinds of things you want to have devices attached to. Capital equipment, and understanding how that equipment is being used and when maintenance is needed, is an obvious use case. But anything that's going on in a manufacturing facility, everything that's moving, everything that's changing posture, is something that's worth looking at. If I'm looking at emergency exits to see if they're blocked, I do need to act on that; that's a safety issue. If I'm watching how someone is doing a process and the process is introducing air into whatever I'm trying to create, or they're putting apples in the banana cake instead of bananas, that kind of stuff, we have to stop that from happening; you want to respond to that quickly. If I'm using robots to pick product through a distribution center, those things are moving really, really fast, so I have to be able to get that data fairly quickly. But to plan how these things are going to work, yes, you'll shove it out to the cloud, you'll look at it over time to understand trends, understand how to do optimization, and so forth. So, not sure if I answered your question completely, but every edge environment has a need for data close to the edge and data in the cloud.
Yeah, no, totally makes sense. We're gonna sort of bounce over to Jason to talk about some AI and ML. So the question is, really: what are some of the challenges that people are encountering at scale in running and maintaining all these different AI models?
Well, one of the areas that I'm interested in is that when you deploy an ML or AI model, you really need to be watching what it's doing, observing it, doing statistical analysis, and knowing if it's gone off the rails. Now, if you've deployed it out on the edge, well, you still need to be observing it, right? You still want the data coming back. But very much like what both of you are saying, that can be problematic. If I work for a company making manufacturing equipment, and I throw a nice ML pipeline on it and install it in Kroger, say, and now that pipeline is sending me back all sorts of observability data, that may not work out so well for the factory owner, who now sees me sending back images or sensor data about their proprietary IP. So very much along the discussion we've been having in terms of whether you send it to the cloud or not, I think we need to also push observability out towards the edge, where we can do something a little bit like federated learning and analyze the data there. We don't want to stream video up to the cloud if we don't have to; analyze it at the edge and instead send summary statistics back. So I think that's gonna be one of the really interesting challenges: how do we monitor our models and know that they're actually behaving as we expect?
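The "send summary statistics back instead of raw data" idea Jason raises can be made concrete with a small sketch. The field names and window here are hypothetical; the point is only that a few numbers go upstream instead of the raw readings or frames:

```python
# Edge-side observability sketch: reduce a window of raw sensor values to a
# compact statistical summary, and uplink only the summary to the cloud.
import math

def summarize(readings: list) -> dict:
    """Reduce a window of raw values to a few numbers worth uplinking."""
    n = len(readings)
    mean = sum(readings) / n
    var = sum((x - mean) ** 2 for x in readings) / n  # population variance
    return {"count": n, "mean": mean, "std": math.sqrt(var),
            "min": min(readings), "max": max(readings)}

window = [20.1, 20.4, 19.8, 21.0, 20.2]  # raw readings stay at the edge
report = summarize(window)               # only this dict goes upstream
```

For video the reduction would be model outputs or histograms rather than means, but the privacy and bandwidth argument is the same: the proprietary raw data never leaves the site.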
We're gonna pass it to Gloryvee, if you have any experience with ML and AI at the edge.
Yeah, the biggest takeaway from what Jason is saying is really that to automate, and to really be more efficient, you have to have monitoring systems that monitor the edge itself and the system processing. You know, obviously we are all at our infancy here, right? We're learning together, but the fact that it needs to be deployed, and we have to start to push it forward, really demonstrates what its capabilities are going to be, or what we can make them. I think that's one of the biggest takeaways that I take in looking at AI models and integrating them on the edge and saying, okay, let's take a look at how we can become more efficient. How can we self-heal internal systems? How can we make things more resilient and have a single source of truth? For us at CVS, we're looking at integrating digital twins and utilizing edge digital twin models so that we can actually start to automate, and then start to create those opportunities that exist that we just don't know yet. But we will know them, because we'll have the forecasting and the automation that will lead us to better and more efficient ways of doing things.
That's great. I know from some of the experiences that I've had, drift on AI models is something you really need to monitor. If you're thinking very naively about ML and AI at the edge, it's like: hey, we train it, we get the data set perfect, we send it out there, and it's just operating at 99.9 percent. But that's not really what happens. Over time the environment changes, the images used to continue refining the model may be in error, there could be things moved in front of the camera in computer vision. So this whole idea of MLOps, of testing and updating models, is really, really important, because if you're going to be making decisions at the edge with these models, you have to make sure they're actually making the correct decisions, and not just once but constantly. That's a really active area, I think, as AI becomes more prominent in every industry. So the next question we're going to talk about is a funny one we were discussing earlier: what challenges will edge and IoT exacerbate? The point of humor was that Jeff was wondering what exacerbate meant. He was only kidding, though, but this was Dan's question; he threw a big vocabulary word in there. So Dan, do you want to take that one?
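A minimal version of the drift monitoring described here is to compare live input statistics against a baseline captured at training time and flag the model for retraining when they diverge. The baseline values and the z-score threshold below are made-up illustrations, not a recommendation:

```python
# Naive drift check: flag drift when the live mean moves more than
# z_threshold baseline standard deviations away from the training mean.
# Real MLOps stacks use richer tests (e.g. population stability, KS tests).

BASELINE = {"mean": 0.50, "std": 0.10}  # assumed training-time statistics

def drifted(live_values: list, baseline: dict, z_threshold: float = 3.0) -> bool:
    live_mean = sum(live_values) / len(live_values)
    z = abs(live_mean - baseline["mean"]) / baseline["std"]
    return z > z_threshold

ok_batch = [0.48, 0.52, 0.55, 0.47]       # close to training distribution
shifted_batch = [0.92, 0.88, 0.95, 0.90]  # environment has changed
```

A check like this can run on the edge device itself, which ties back to the earlier point: only the drift flag and summary numbers need to travel to the cloud, not the raw inputs.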
I'm not going to try to pronounce it; well, I could put it in a document, because that's what spell check is for. So, AI and ML models, they're all great, they're really fantastic, but they are going to create additional challenges for us. Obviously, data is the most important part of that, right? Well, maybe not the most important, but an important part of it. Does your organization have a good data strategy now, a good DataOps process, and so forth? Because it's going to become more and more important. And also, the types of data that you're now going to collect are different. So we'll have cameras that are collecting video data, temperature sensors that have temperature data, humidity sensors that do those kinds of things. All this stuff is going to have to be brought together in a way that an AI model can use it, and in a repeatable process, so that when it goes out into production, it has a feedback loop and you know whether the ML model is working. But that's just the start. Security is going to be a problem now, because you're going to start moving data around to where you can act on it at the point of decision making. So how do you secure that data? And how do you combine that data if you're in an organization that has different kinds of data? For example, Kroger is a retail company, but we're also a healthcare company. Healthcare data is protected; retail data, maybe not. But if I merge that data together, then my retail data might become protected under healthcare laws. You just have to know those kinds of laws and those regulations as we do more and more. That's a second problem.
A third problem comes as you do more and more things. Think of a grocery store: it has a couple hundred thousand products in it, there are many people in it, there are all kinds of associates going through it, there are a thousand things going on at one time, there are a hundred different systems running at one time, and each one of those systems is selfishly optimizing for itself. That may not be the next best action for that store. So you really have to start thinking about system-of-systems engineering, and how you have an orchestration layer over your IoT systems to do the best thing for that store, the best thing for your customer, the best thing for your associate.
That's great. I love the hierarchy of IoT, right? It models the human world, where you need decision making at a strategic level and a tactical level, and you're going to need AI and ML to handle all these different things and to know who overrides whom at a given time. Jeff, now that you kind of know what exacerbate means, do you want to follow through on Dan's line of reasoning?
I guess I'll explain it as I understand it. The systems-of-systems thing really resonates, especially in manufacturing today. When you're looking at the edge and you're gathering data off the plant floor, it's really a multi-legged stool that you have to think about, right? You've got to be able to get that data into the system. That aggregation, that integration, has been hard in the past; I think most people have solved it today: 250 different protocols and any electrical medium you want to bring it in on, so we can get the data in. It's got to be secure; to your point, not only does the data have to be secure, but the network has to be secure also. And then you have to be able to manage those edge devices. So how do I do that? You know, we took a lot of the IT folks out of the plants, again, I'm coming from a manufacturing viewpoint or lens here, because we moved a lot of that to the cloud, or maybe moved it to a common data center. So there aren't a lot of IT folks at a factory anymore. So when I put an edge device out there, when I put some compute and storage out there in my factory, what happens when I need to maintain it and manage it? I need to do an upgrade to the OS, an upgrade to the security agent, an upgrade to the box itself, maybe to the application that's running on it, and I don't have that skill set at the plant. So we really need to know how to deploy and manage those boxes out on the edge. I think we've figured that out; the degree of it changes depending on what subset of manufacturing you're in. But self-healing is one thing, and self-deploying is another, where I turn on the power and the box figures out its personality and its software is downloaded to it out there. So there are a lot of things that edge complicates in some aspects. But again, when you think about it, the new norm will be production assurance. And the biggest use case you can think of now is order-to-cash: get an order, turn it into cash.
Underneath that are the supply chain algorithms that go from available-to-promise, capable-to-promise, profitable-to-make, etc., demand planning, finite capacity scheduling, those kinds of things. But production assurance means being able to take the output of several AI models and do risk analysis on what I can produce next month, not what I did just now. How am I going to do next month, and what about that what-if scenario? So we're working on this, because it's a higher level of system of systems. It's not just one AI, it's many AIs, but you have to think of it as: how do I run my business, not just today, but next month, or the quarter after that? I don't know how far out we can make that, but maybe at least out a quarter.
Yeah, that's really where the data at the edge, you know, the predictive nature of it, your ability to do forecasting and things like that, where if you have the data, you can actually somewhat reliably see what the future might hold. That's where every industry wants to get, right? Whether it's for customer experience, to tell you that you will only be here for 15 minutes if you're getting your car serviced, because I've had 10 million cars before you and I know exactly how long this service takes, and the people are here today, and the machinery has been PM-ed and everything's ready to go. But as you say, it's a system of systems. People were involved in the whole discussion of where AI gets to: are we going to lose jobs to AI, or are we going to create more jobs? The more I talk about it, I think we're creating more jobs, or just higher-order jobs, to manage and build all the AIs that we'll have working for us, right? So I think we as humans just keep leveling up, as long as we don't get the Skynet type of thing happening to us, right? So we're good. And the last question I want to pose, Gloryvee, is really: how are you using it at CVS Health? How are you using edge in retail, healthcare, and pharma? If you have specific use cases you could describe, that would be great.
Sure. So I went over some of the digital twin models that we are integrating into our distribution centers, so that we can actually utilize the power of forecasting and enable a lot of the events that we would like to automate, so that robots and, you know, these IoT devices that we're programming can actually allow us to be more efficient at distribution centers, but also in our pharmacies, right? We have pharmacy techs that we need to train, and we want to have glass devices so that they can actually have everything on an IoT device that enables them on the edge. This is real-time data that has to be aggregated to process a lot of this information, and it's a lot of complex information that needs to be aggregated. On the edge, we were able to do that, so that our pharmacies and our pharmacy techs can process a lot more prescriptions quicker and much more efficiently. And it allows us then to move that forward as well and give the customer a better experience. So what we've been looking at is all of our customer experiences and how to maximize them by going on edge with a lot of the initiatives that we have been taking, and putting together a good plan of action so that every single division at CVS gains something from going on edge. And of course, cost efficiency is one of those factors.
That's great. And, Jason, I'll hand it off to you before we do another round-robin question. I'm just curious to see, you know, within Wallaroo, what types of AI/ML applications you're seeing are sort of popular across different verticals. Are there certain things you're seeing customers coming to you for?
Yeah, so it's really interesting. We've been talking about retail here, and one of the trends I've seen lately is there's a lot of computer vision that looks like it's being applied towards retail, towards pharmacy, towards grocery. And I don't know the real reason behind it, but I suspect it's because when you talk about a retail environment like that, it's about as unstructured as it gets. So, you know, we can't put sensors on the people shopping and moving through, and we probably don't want to put sensors and expensive equipment on the shopping carts. So there seems to be a real trend towards using vision to do security, to do product identification, to do analytics on what's going on in the store. And, you know, you guys can talk more about that, but that is one of the real trends we've seen, in terms of both retail but also service providers wanting to provide more capabilities that you can install into those environments.
And I think retail is a great place for all the reasons you just mentioned. But also, retail has historically had security cameras, right? So customers are used to being videoed when they go into the store; they're sort of accustomed to that, and the stores are used to getting that data. So I think there's already infrastructure present, and there's already sort of an acceptance within the business and within the consumer that we're going to do that. So it's kind of a natural place, and computer vision is so powerful, there are so many things you can do with it, you know, analyzing complex behaviors now. So I think retail is a really strong focus for lots of computer vision types of applications. And I guess for the last question, we'll just sort of bounce this around, maybe popcorn it. I'll go to Dan, since he's looking at me kind of bored. So: what are the foundational capabilities an organization needs to be successful with edge and IoT? What kinds of skill sets are we looking at here?
A lot. There's quite a bit. I mean, you have to be very good at what I'll call the sciences of AI, ML, and so forth. You certainly have to be very good at video, you certainly have to be good at data fusion and understanding the different types of data that can be put together; you have to figure out how to fuse data together with great skill. So data engineering and data engineering capabilities, I can't emphasize that enough. Maybe that's because I've done data for so many years, but you need a lot of really good data people. They're great, I love them. So you really have to know how to do that. Then architecture: we talked about system-of-systems engineering, and how to build that kind of an architecture, how to build the orchestration model on top of it. It is not a typical architecture that, say, a software engineer would start with out of school, because you typically think of an architecture or a software engineering practice as solving sort of one problem, whatever you're trying to do. And right now you've got a hundred problems, and the architecture has to solve all hundred problems at one time. So data, architecture, the sciences like AI, and you certainly have to understand security very, very well.
Well, maybe Jeff can fill in the rest of the blanks. I'm waiting for someone to mention the actual installation and maintenance of all this physical stuff.
Yeah, there's another group for that. But one of the things I want to mention besides that is that when you do these edge solutions, use cases, workloads, whatever you want to call them, and again through a manufacturing lens, though I think it crosses over a little bit, technology is 50 percent. The other 50 percent is the people, change management, and culture. You know, Peter Drucker said that culture eats strategy for breakfast, I think is what he said. So when you start doing edge solutions, you're trying to make money, save money, or help your company work safer by implementing them. If it doesn't do one of those three things, don't do it. And 50 percent of being successful at that is the people. So not only do you have to have people who are smart on the technology side, but also people open enough to think a little bit out of the box, to work a little differently than they did before. And that's how you're going to be successful in the end.
That's a great point. The people side of this is often overlooked, but it's so key, because if you have people resisting these new solutions, actively trying to keep them from being effective, it's never going to work that way. Yeah. Gloryvee, you want to add some more thoughts?
I think that one of the things I've noticed is that there is a big distinction between a tech and a scientist. You know, the mindset of a scientist, being a research analyst and understanding how to explore and discover, right, how to really understand what is happening and why it's happening, that's necessary. And it's limited with someone who is just technical, who just looks at how X plus Y equals Z, how there's a process in place, or how to tech-stack the next application. So what we need is groups, putting groups together that enable both of those pillars to exist within a group, and starting to manage those teams as part of expanding your edge computing or your edge solutions within your different ventures.
Jason, you'll get the final word on this. What other skill sets do we need to be successful in edge?
Well, it's a great list of skill sets so far, but I guess the one I would add to that is finance. Because we can draw up all the different software architectures we want, we can deploy servers into our retail outlets and our semi-autonomous warehouses, and deploy a bunch of AI models. But if you go stick a model on a very expensive GPU-based machine, you may find out that it's costing you a dollar per inference, a dollar per image recognition, and it's only saving you 50 cents. So I think that's one of the aspects some organizations overlook: how much is it going to cost me to do all the data wrangling, to capture the data, to put the models in production, and then actually utilize it and try to get something out of it?
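Jason's dollar-per-inference example is simple arithmetic, and it's worth making explicit because it's the go/no-go check he says organizations skip. All the numbers below are hypothetical:

```python
# Back-of-the-envelope unit economics for an edge model: a deployment only
# pays off if the value created per inference exceeds its cost per inference.

def cost_per_inference(monthly_machine_cost: float,
                       inferences_per_month: int) -> float:
    """Amortize the machine's monthly cost over its inference volume."""
    return monthly_machine_cost / inferences_per_month

# A pricey GPU box serving light traffic: $3,000/month over 3,000 inferences.
cost = cost_per_inference(3000.0, 3000)  # one dollar per inference
value_per_inference = 0.50               # assumed savings per prediction
profitable = value_per_inference > cost  # Jason's scenario: it isn't
```

The same formula also shows the two levers: drive volume up (amortize the hardware over more inferences) or drive machine cost down (cheaper accelerators, shared micro data centers, as discussed earlier in the panel).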
So the last part I forgot to mention is, when you look at the people aspect, look at something called human-centered design. There are courses on it that you can take, but it's really about how different groups work together and orchestrate a workload or use case to help your company. So if we're working together, how often do we speak? All the time? Well, how do we speak? On the phone? Okay, we need to sit next to each other, right? How often do we speak, once a week? And how do we speak, email? Okay, then we don't care where each other sits. So human-centered design is something to look into; you can take courses on it, get certified in it. It's a really good practice when you start talking about edge solutions.
Great, thank you. We have about five minutes left, so I want to make sure anyone in the audience who has questions gets a chance to ask the panelists. I guess just raise your hands; I don't know what the protocol is here. Or just shout over your neighbor, whatever works. Any questions from anybody?
Oh, we have a question right here, row three. Oh, it needs a mic. Okay, that would be tough without a mic.
Might be easier like this? Yes. You mentioned that on the edge, you're getting all that information and you want to run analytics on it. But then some of the data doesn't come through, sensor data gets lost, and then the models don't work anymore; they expect certain data inputs. How do you solve the problem that you want to run analytics on that edge data, but the data received is always different, because there are gaps, there's data missing? What do you do then?
So at least on our end, the intent is to have a digital twin, so that we can actually start to monitor the edge and monitor what is happening, from the physical space to this virtual simulation. That automation process is then able to be regulated, and it will try to self-heal as we see those events occurring. It enables us to go back and calibrate, right, and adjust, so that it doesn't always deploy, and it understands that something has happened and there needs to be a fix. You know, especially with IoT devices and 5G coming, 5G and edge together, that combination, those are things that need to be looked at, and they need to be monitored from a system perspective.
One of the things we look at is: do we have enough data to make a decision? Do I really need all the data? An example from 25 years ago, when we would be collecting data from our stores: at that time networks were pretty fragile, so if I have 2,000 stores, I may only be able to collect data from 1,500 of them. Is that enough to make a good decision? For some decisions it is, for some it's not. So it's really the risk model around the decision: how much data do I really need to do this? If there's a gap, can I still go ahead and make a decision?
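The "is 1,500 of 2,000 stores enough?" question can be made concrete with a standard sampling calculation. As a rough sketch (this is a textbook margin-of-error estimate, not the panelist's actual risk model), for a store-level proportion at 95% confidence, with a finite-population correction since 1,500 of only 2,000 stores responded:

```python
import math

# Hypothetical sketch: quantify whether partial store data is "enough".
# Worst-case margin of error for an estimated proportion (e.g. share of
# stores with an item out of stock) at 95% confidence, corrected for
# sampling n stores out of a finite population.

def margin_of_error(n, population, p=0.5, z=1.96):
    se = math.sqrt(p * (1 - p) / n)                       # worst case at p = 0.5
    fpc = math.sqrt((population - n) / (population - 1))  # finite-population correction
    return z * se * fpc

moe = margin_of_error(1500, 2000)  # roughly a 1.3-point margin of error
```

Whether a ~1.3-point margin is acceptable is exactly the risk-model question: fine for a merchandising trend, probably not for a safety recall.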
Any other questions? I think, did you have a question?
Hi, yes, thanks. It's something I was curious about. We talked about the difference between the bandwidth issues of sending everything to the cloud versus trying to keep latency ultra low. Bandwidth is a little more of a current issue right now, as opposed to ultra-low latency. However, is this a case where you think that over the next several years, as edge deployments and 5G become more widespread, applications will begin to catch up and start to say, "Oh, we have this edge infrastructure, so now we do need ultra-low latency"? Will applications evolve alongside the infrastructure, to the point where they say, okay, now we do need sub-five-millisecond latency, et cetera?
Just to say something about 5G: one gigabit to the edge? That'll probably be close.
So I have this belief about networks: no matter how big the network is, it's going to get filled up, because that just happens, right? So what's really important to me is understanding my data strategy. Data has gravity; I've got to get just enough data to the point where I need to make a decision. So whether it's in the cloud, on the device, or on a storage device in the store, understanding exactly what data is needed, and only what data is needed, will help me beat the bandwidth problem, because somebody is going to fill that pipe up, I know it.
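The "just enough data" strategy described above usually means aggregating at the edge and shipping only summaries plus exceptions. As a hedged sketch (the function, field names, and threshold are illustrative, not from the panel), an edge node might reduce a batch of raw temperature readings to an aggregate record before sending anything upstream:

```python
# Hypothetical sketch of "just enough data": instead of streaming every raw
# reading to the cloud, an edge node ships a per-batch summary plus only the
# readings that cross an alert threshold.

def summarize(readings, alert_above=80.0):
    """Reduce a batch of raw sensor readings to what the central system needs."""
    alerts = [r for r in readings if r > alert_above]
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
        "alerts": alerts,  # only the exceptional raw values travel upstream
    }
```

For a batch of thousands of readings, the payload shrinks to a handful of numbers unless something is actually wrong, which is one way to avoid being the one who fills the pipe up.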
So I'll take a little bit different crack at it, because I only heard half the question: 6G, not 5G. And I'm serious about that. One thing, a small commercial for us: when you look at edge, you need to look at that infrastructure being elastic, repeatable, and secure. As your data load grows, as the need for more data grows, as there are additions or changes to the plant, you need an infrastructure at the edge that is elastic, right? Even if you go to the data center: how do I grow my compute, how do I grow my storage, and not break the bank? Maybe I need to scale up for a month and come back down for a month. What goes along with that is the infrastructure of the networks and the bandwidth. Through the manufacturing lens again, if you're looking at a brownfield plant, there's a lot of metal around; you may need to look at whether 5G is even the right economic choice for you. In a brand-new plant where you could put most of the steel on the outside, 5G may be great. But when you start looking at 6G and 7G, it's coming, and if you look at the specs around them, you're going to figure out that the connection will go to whatever has the bandwidth available at the throughput each device needs, and you won't have to think about it.
All right, I think we're at time, unless there are any more questions. Nope. Okay, we are at time; we got the hook. Thank you all for attending. Please give a hand to our wonderful panelists. I'll be hanging around if you want to ask us anything. Thanks again.