With the dawn of the cloud era, many companies have done away with their data centers. But there are reasons why some things should, or even must, be computed and stored on-site again.

Yesterday, in my parallel universe: I'm under attack from the Grumps, an unpleasant, parasitic alien life form that feeds primarily on frankfurters. Although I have all the defensive tools at the ready (lion mustard, horseradish, chili sauce, and so on), they overrun me and eat my supplies. Oh my god, how does something like this happen?

It's called latency. The delay the data experiences on the way to the data center and back again prevents me from defending myself against the attackers in my aggressive gourmet game. Computer gamers, and professionals in particular, suffer from this physical reality. Edge computing, then, could be the Edge of Glory for gamers, because their reaction times would no longer be delayed by latency.

In road traffic, delays create hazardous situations. To avoid them, the existing IT infrastructure must, in the not-too-distant future, be able to process large amounts of data in real time. The aim is to keep traffic flowing, calculate new routes if necessary, or even prevent accidents. That makes it all the more important to perform certain calculations right at the scene of the event. Cyclists in particular can tell you a thing or two about this.
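
How much latency sheer distance costs can be estimated with simple back-of-the-envelope arithmetic. The figures below (light traveling roughly 200,000 km/s in fiber, an 800 km path to a distant data center, 10 km to a nearby edge node) are illustrative assumptions, not measured values:

```python
# Propagation delay only, ignoring routing, queuing and processing time.
FIBER_KM_PER_MS = 200.0  # light covers roughly 200 km per millisecond in fiber

def round_trip_ms(distance_km):
    """Round-trip time contributed purely by the distance to the compute node."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(800))  # distant data center: ~8 ms before any processing starts
print(round_trip_ms(10))   # nearby edge node:    ~0.1 ms
```

Even in this idealized picture, the long path burns milliseconds before a single packet has been processed, and that is exactly the margin gamers and real-time traffic systems cannot spare.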

With the Internet of Things, the volume of data will explode. But not all of that data has to be transferred to the cloud. When it comes to locating objects in a warehouse, for example, edge computing can significantly reduce the amount of data that needs to be transmitted while at the same time improving accuracy. Whereas conventional GPS receivers and cell phones achieve an accuracy of two to thirteen meters, ten to thirty centimeters is possible here, depending on the network. And in real time, at that.
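
A minimal sketch of what such on-site processing could look like, assuming a hypothetical high-precision indoor positioning system: the edge node evaluates every raw position reading locally and only forwards significant movements (plus an occasional heartbeat) to the cloud. All names and thresholds below are illustrative assumptions, not part of any specific product:

```python
import random
import time

MOVE_THRESHOLD_M = 0.3   # only report movements larger than ~30 cm
HEARTBEAT_S = 60.0       # periodic report even if nothing has moved

_pos = {"x": 0.0, "y": 0.0}

def read_local_position(asset_id):
    """Stand-in for a local high-precision positioning system (simulated drift)."""
    _pos["x"] += random.uniform(-0.05, 0.05)
    _pos["y"] += random.uniform(-0.05, 0.05)
    return _pos["x"], _pos["y"]

def send_to_cloud(event):
    """Stand-in for the uplink to the central cloud platform."""
    print("uplink:", event)

def track(asset_id):
    last_pos, last_report = None, 0.0
    while True:
        x, y = read_local_position(asset_id)   # evaluated entirely on-site
        moved = (last_pos is None
                 or abs(x - last_pos[0]) > MOVE_THRESHOLD_M
                 or abs(y - last_pos[1]) > MOVE_THRESHOLD_M)
        # Only meaningful changes (or a heartbeat) leave the site, so the raw
        # sensor stream never has to cross the wide-area network.
        if moved or time.time() - last_report > HEARTBEAT_S:
            send_to_cloud({"asset": asset_id, "x": round(x, 2), "y": round(y, 2)})
            last_pos, last_report = (x, y), time.time()
        time.sleep(0.1)

# track("forklift-07")  # would run indefinitely, reporting only relevant events
```

Instead of streaming ten position fixes per second per asset into the cloud, only the occasional relevant event crosses the network, while the fine-grained accuracy stays available locally.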

Last but not least, data security should of course not be underestimated. Not all providers work to the high security standards of an Open Telekom Cloud, and encrypting data can be highly complex in some cases. That is why some data must not leave the company environment at all. Local processing keeps that data within the company, which significantly increases data security and enables full data sovereignty. A certain degree of autonomy also means that applications keep running even if the connection to the data center fails or becomes very slow. This can be vitally important for automakers' production lines, for example, or for remote locations such as oil platforms.
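
The autonomy aspect can be pictured as a simple store-and-forward pattern: the edge application keeps working against a local buffer and only synchronizes with the data center when the link is available. Again a hedged sketch with made-up function names, not a description of a particular product:

```python
import queue

outbox = queue.Queue()   # local buffer; in practice this would be persisted to disk

def cloud_available():
    """Stand-in for a connectivity check against the central data center."""
    return False  # assume the link is currently down

def upload(record):
    """Stand-in for the actual transfer to the cloud."""
    print("synced:", record)

def process_locally(measurement):
    """The production-critical logic keeps running entirely on-site."""
    result = {"measurement": measurement, "ok": measurement < 100}
    outbox.put(result)   # queued for later synchronization
    return result        # the line or platform keeps operating regardless

def sync():
    """Drain the local buffer whenever the connection to the data center is back."""
    while cloud_available() and not outbox.empty():
        upload(outbox.get())

process_locally(42)   # works even while the WAN link is down
sync()                # uploads nothing now, everything once the link returns
```

The production line never blocks on the wide-area link; the cloud simply receives the backlog once connectivity returns.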

I fended off the last Grumps attack, by the way.

Interested in more information?

On December 7, 2021, at 10:00 CET, IDC, Intel, AWS, and T-Systems will discuss whether edge and cloud can bring about a paradigm shift in data-driven transformation. In preparation, a pan-European survey of over 300 senior leaders was conducted, covering IT infrastructure, IT strategy, data utilization, IoT adoption, and edge-cloud solutions. The results will be presented at the event, which will be held in English. You are welcome to attend as a guest.
Event details and a whitepaper on edge computing can be found here.
