IT Infrastructure is Increasingly Distributed and Complex

Infrastructure administrators are dealing with unparalleled complexity. In recent years, organizations of all sizes and across industry verticals have become more digital, putting pressure on IT to deliver new digital goods and services to customers, constituents, patients, and more. IT infrastructure has moved beyond the core data center to regional data centers and remote offices to match the more distributed nature of work and IT consumption. In addition, infrastructure choices continue to proliferate: established on-premises three-tier architecture faces growing competition both within the data center from newer technologies such as hyperconverged infrastructure and outside of it from public cloud vendors such as Amazon Web Services. Finally, the applications running on that infrastructure are changing as well, moving from monolithic designs to distributed, microservices-based applications.

IT must weigh the value of each of these infrastructure choices against its costs. Many IT organizations are finding public cloud costs unsustainable and struggle with compliance and governance. They also want to avoid creating infrastructure silos so they can maximize operational efficiency, even as application needs grow more diverse. These considerations must be balanced against the business's need to shorten time to market for new digital products and services.

To help IT administrators assess which capabilities need to be in place for cloud native applications, 451 Research recently published a paper on the Top Infrastructure Considerations to Support Modern Applications. You can access the full report here.

Containers Are Moving into the Data Center

Enterprise application development and operations are changing. Monolithic applications designed around traditional architectures are giving way to microservices-based applications.

These applications are highly scalable and can span geographies. Increasingly, the value a company delivers is found in its software, so it needs to create, update, and maintain that software rapidly. To speed time to market, companies are moving toward a microservices-based architecture, in which an application is composed of many small services. Each service is developed, iterated, and lifecycle-managed independently. This gives enterprises far more agility: they can deliver their intellectual property faster, stay competitive, and respond to their users' demands more quickly.
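As an illustration of what "a set of many services" means in practice, here is a minimal sketch of a single, hypothetical "inventory" microservice written against the Python standard library. Each such service is developed, versioned, deployed, and scaled on its own, and other services call it over the network.

    # A minimal sketch of one microservice (a hypothetical "inventory" service).
    # It exposes a single HTTP endpoint; other services (orders, payments, ...)
    # would run as separate processes or containers and call it over the network.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Illustrative in-memory data; a real service would own its own datastore.
    INVENTORY = {"sku-123": 42, "sku-456": 7}

    class InventoryHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Answer GET /inventory with the current stock levels as JSON.
            if self.path == "/inventory":
                body = json.dumps(INVENTORY).encode("utf-8")
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        # Listen on all interfaces so other containers can reach the service.
        HTTPServer(("0.0.0.0", 8080), InventoryHandler).serve_forever()

Because the service is this self-contained, a team can change how inventory is tracked and redeploy it without touching any other part of the application.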

IT organizations are increasingly looking to develop these new applications in-house. A recent study by IDC found that 54% of container instances will be deployed primarily in the data center by 2023*. A separate IDC survey found that 97% of IT organizations use a Kubernetes-based platform for container infrastructure software**. IT wants to get more out of its existing investments and to avoid public cloud cost overruns, infrastructure silos, and multiple operating models.

Unique Challenges Posed by Cloud Native Applications

However, the adoption of cloud native applications in the data center isn't plug and play: Kubernetes-orchestrated workloads bring their own challenges. For instance, due to their architecture:

  • Microservices pull complexity out of the application and place it on the surrounding systems. Load balancing is a critical component of a distributed, microservices-based application, and individual services may need to scale up or down depending on which parts of the application are in demand at any given time (see the autoscaling sketch after this list).
  • Microservices need robust management and orchestration to deal with the complexity of hundreds or thousands of services replacing a single application. As one example of the complexity to be managed, each microservice can have its own data: one microservice may use a traditional relational database while others use NoSQL databases.
  • Microservices are most commonly run in containers, and containers were originally designed for stateless applications. Yet developers and administrators need persistent storage (volumes) for most applications.
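To make the scaling point above concrete, here is a minimal sketch that uses the official Kubernetes Python client to attach a HorizontalPodAutoscaler to a hypothetical Deployment named "inventory"; the names and thresholds are illustrative assumptions, not a prescribed configuration.

    # A minimal sketch, assuming the official "kubernetes" Python client
    # (pip install kubernetes) and an existing Deployment named "inventory".
    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() inside a pod

    hpa = client.V1HorizontalPodAutoscaler(
        metadata=client.V1ObjectMeta(name="inventory-hpa"),
        spec=client.V1HorizontalPodAutoscalerSpec(
            scale_target_ref=client.V1CrossVersionObjectReference(
                api_version="apps/v1", kind="Deployment", name="inventory"
            ),
            min_replicas=2,   # keep at least two replicas running
            max_replicas=10,  # cap growth during demand spikes
            target_cpu_utilization_percentage=70,  # scale out above 70% average CPU
        ),
    )

    # Create the autoscaler in the "default" namespace (illustrative).
    client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
        namespace="default", body=hpa
    )

With this in place, Kubernetes adds or removes replicas of that one service as demand shifts, which is exactly the per-component elasticity a monolithic application cannot offer.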

451 Research has found that over 70% of containerized workloads require stateful services.
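Translating that persistent storage requirement into practice, a stateful workload typically requests a volume through a PersistentVolumeClaim. Below is a minimal sketch using the Kubernetes Python client; the claim name, size, and storage class are assumptions and would depend on what the underlying infrastructure exposes.

    # A minimal sketch, assuming the official "kubernetes" Python client. The
    # claim name, size, and storage class below are illustrative assumptions.
    from kubernetes import client, config

    config.load_kube_config()

    pvc = client.V1PersistentVolumeClaim(
        metadata=client.V1ObjectMeta(name="inventory-data"),
        spec=client.V1PersistentVolumeClaimSpec(
            access_modes=["ReadWriteOnce"],  # mounted read-write by a single node
            storage_class_name="standard",   # assumption; depends on the cluster
            resources=client.V1ResourceRequirements(
                requests={"storage": "10Gi"}  # ask for a 10 GiB volume
            ),
        ),
    )

    # The cluster's storage provisioner (for example, a CSI driver backed by the
    # underlying infrastructure) fulfills the claim with a persistent volume.
    client.CoreV1Api().create_namespaced_persistent_volume_claim(
        namespace="default", body=pvc
    )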

Hyperconverged Infrastructure (HCI) as the Foundation for Cloud Native Applications

Core tenets of application modernization, including automation and support for Kubernetes and containers, are a must in hybrid cloud infrastructure to ensure a smooth journey. Both automation and Kubernetes are now commonplace on HCI, which is increasingly used by DevOps teams for application development. Further, many HCI adopters now deploy HCI nodes and clusters on public cloud platforms and report that HCI is easing their hybrid cloud experience.

Next Steps

To read the full report on Top Infrastructure Considerations to Support Modern Applications, please visit this website.

To learn more about VMware's approach to Cloud Native Storage, applications and infrastructure, visit our website.

Take a technical deep dive at TechZone.

Sources:

* IDC Container Infrastructure Software Market Assessment: x86 Containers Forecast, 2018-2023, Doc# US46185620, April 2020

** IDC Container Infrastructure Software, Doc# US46185520, April 2020

