How CDNs can adapt to the cloud computing era
“Build like the cloud; deliver like the cloud” is an internal mantra we have been using in our Development and Product teams at Verizon Digital Media Services for the last two years. Cloud computing, in which remote servers, virtualization, Infrastructure as a Service (IaaS), and orchestration tools make the deployment and scaling of applications easier, is quickly becoming the standard way that software is built, deployed and delivered.
This puts content delivery networks (CDNs) in something of a predicament. While customers crave the speed and ease of the cloud, they’re unwilling to compromise on the performance, reliability, security and scale advantages that purpose-built CDNs still provide. That means CDNs have to continue to meet and exceed customer expectations, while at the same time finding new ways to integrate with cloud computing platforms and tools.
Here are some ways in which we are adapting to keep pace with the cloud paradigm.
Build with cloud tools in mind.
Customers have always used application programming interfaces (APIs) to easily integrate their applications with third-party services. Our Edgecast Content Delivery Network differentiated itself early on by enabling customers and partners to easily self-service their CDN accounts and configurations via portal-based tools and APIs. Increasingly though, customers are adopting cloud APIs and DevOps toolsets to integrate applications and other workloads into the cloud. As the application development life cycle progresses from development to testing to staging to production deployments, popular configuration management and orchestration tools such as Puppet, Chef, Salt, Terraform and others are used to automate and regulate these activities and stages.
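To make the idea concrete, here is a minimal sketch of how a pipeline might carry a CDN configuration through those same stages alongside the application. The stage names, field names and TTL values are illustrative assumptions, not the actual Edgecast API; in practice a tool like Terraform or Chef would apply the resulting configuration through the provider's API.

```python
# Sketch: promoting a CDN configuration alongside an application release.
# All field names and values here are hypothetical illustrations.

STAGES = ["development", "testing", "staging", "production"]


def build_cdn_config(app_version: str, stage: str) -> dict:
    """Build the CDN configuration that should accompany an app release."""
    return {
        "origin": f"{stage}-origin.example.com",  # hypothetical origin host
        # Short TTLs in pre-production so changes are visible quickly.
        "cache_ttl_seconds": 60 if stage != "production" else 3600,
        "app_version": app_version,
    }


def promote(app_version: str) -> list:
    """Walk the release through each stage, pairing app and CDN changes."""
    deployed = []
    for stage in STAGES:
        config = build_cdn_config(app_version, stage)
        # In a real pipeline, a configuration management tool would
        # apply `config` via the CDN provider's API at this point.
        deployed.append((stage, config))
    return deployed
```

The point of the sketch is that the CDN change is a first-class artifact of the same pipeline, not a manual step bolted on after the application ships.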
One of the ways we are adapting the CDN to support these tools is by using them ourselves. While CDNs have always pushed (and likely always will push) the envelope of performance through bare-metal servers and full-stack tuning, we have been using cloud environments for development, testing, staging, prototyping, data/analytics and other uses. This has made our own processes more agile and robust, while letting us learn the art and science of DevOps-centric, testable-by-design, CI/CD-friendly development practices. In other words, it has taught us to build the way our customers build and to adapt our own configuration tools and interfaces to what our customers expect.
As we have adopted these practices, we have started building a set of interfaces and tools we are calling EdgeControl, specifically to provide this functionality to the Edgecast CDN. It was as natural as looking at what tools are popular and useful in open source and DevOps communities, looking at how they're leveraged against compute, storage and network infrastructure, and seeing how we could fit our CDN into those tools and workflows more organically.
The cloud works in real time. CDNs should, too.
It’s easy to see the cloud’s appeal to customers; it’s a faster way to deploy software than ever before. According to a Vanson Bourne report on the business impact of the cloud, software hosted in the cloud reaches the market more than 20 percent faster. When an application is deployed into the cloud, it’s dynamic, quick and adaptable; a developer can just as easily spin up one instance or many at the same time, often using automated orchestration and deployment tools.
A CDN needs to work just as quickly and efficiently. If a customer is launching an application in the cloud, it’s the CDN’s job to ensure it is incorporated into that custom automation and app deployment process at the same time. Testing the application and then testing the CDN shouldn’t be a two-step process for customers to consciously think about, but rather two things that can happen in tandem. The CDN can play an appropriate role in the same testing process, and when it happens concurrently, developers can get data back out of the CDN during testing for enhanced metrics and analytics monitoring. And having a responsive CDN during the initial testing step ensures that the testing environment replicates the eventual production environment as closely as possible: from how the software performs in different regions, to how it mitigates potential security issues, to how it handles load or offloads certain business logic to edge servers.
At Verizon, our increasing internal use of cloud infrastructure, tools and automation has shown us that testing the CDN as an afterthought is not only tedious but also inefficient. EdgeControl was born out of discussing needs with our customers and partners, building to complement our own internal processes, and monitoring our competitors. It’s not enough to provide API hooks, configuration and propagation; these things also need to happen in real time, so we’re working on making more of our components real-time as well, including configuration APIs, ingest and propagation, and feedback.
Automate, automate, automate.
Finally, it’s vital for a cloud-integrated, real-time, responsive CDN to have extensive automation capabilities. The more the CDN remains a secondary, manual step in the application deployment process, the less likely it is to stay updated and in sync with application changes, the fewer automated-testing processes will incorporate the CDN component (including any edge-deployed application logic), and the less reliable and predictable the end-to-end deployment process becomes.
From code testing to software deployment, the end goal should be to have customers do as little manually as possible. Instead, they ought to be able to make long-term configurations in the CDN to reflect a new content profile, a new application or API, or even just changes in the application environment, such as new regions being deployed, elasticity changes in deployed instances, or other characteristics of the origin/cloud environment that the CDN could or should be responsive to. Even better would be if that configuration could be tied to the customer’s existing software configuration tool so the same automation processes and testing tools work together. The ability to automate in advance doesn’t just make software deployment easier, but also more reliable, since it can be configured well before anything goes live on the CDN. This has the added benefit of fitting into the more dynamic, continuous integration deployment model that is becoming more common.
For an application that lives in the cloud, the CDN is the infrastructure that sits in front of it. That means when a customer updates the application’s code, the CDN should automatically be aware of the change, and trigger an update in its configuration if need be. Our EdgeControl toolset is being developed with an eye toward how that process can be scripted in advance through an automated deployment process to avoid having our customers do a secondary configuration. The results will be a quicker, more efficient and more reliable deployment process.
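One way to sketch that awareness is a post-deploy hook that fingerprints the deployed artifact and triggers CDN actions only when something actually changed. The fingerprint scheme and the action names below are hypothetical illustrations, not a documented EdgeControl API; a real hook would call the CDN's configuration and purge endpoints where the comment indicates.

```python
# Sketch: a post-deploy hook that notices an application change and
# triggers CDN updates automatically. The action names are hypothetical.

import hashlib


def app_fingerprint(source: bytes) -> str:
    """Hash the deployed artifact so changes are detectable."""
    return hashlib.sha256(source).hexdigest()


def post_deploy_hook(previous_fp: str, new_source: bytes) -> list:
    """Return the CDN actions a deployment should trigger, if any."""
    new_fp = app_fingerprint(new_source)
    if new_fp == previous_fp:
        return []  # nothing changed; no CDN work needed
    # A real hook would call the CDN's configuration and purge APIs here,
    # instead of returning symbolic action names.
    return ["update_edge_config", "purge_stale_objects"]
```

Wiring a hook like this into the existing deployment script is exactly the kind of "scripted in advance" integration described above: the secondary configuration step disappears because the deployment itself carries it out.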
Many of these improvements are as natural as asking ourselves, “How can we build this to be more like the cloud?” The answer will emerge in new capabilities and tools we are rolling out now and in the coming months. These include improving our configuration propagation times from roughly 40–60 minutes to less than five (a more than 90 percent improvement); exposing more native configuration syntax so developers can tap the full power of our edge performance engine; expanding our APIs and command-line (CLI) tools to increase scripting and automation integration opportunities with the most common DevOps toolsets and frameworks; and taking the covers off of extensive metrics, analytics and other data that can provide real-time feedback on performance and utilization. We’re excited about this coming transformation and the promise of enabling cloud applications and developers to interact with our platform more efficiently and natively.
To learn more about how our CDN acts like and integrates with the cloud, get in touch with us today.