Friday, July 19, 2024

The Modernization Imperative (TMI): The beauty in boring

You have something in your pocket right now that possesses more computing power than spacecraft fifteen billion miles away from Earth. No, not your mobile phone: the key fob for your car! Voyager 1 and Voyager 2 launched in 1977, and forty-five years later, these little rascals still work, sending data back to Earth while moving through cold, empty space at over thirty thousand mph. Think your key fob will still be going strong in 2068? It's also amazing that the Romans created concrete so long-lasting that two thousand years later, structures built with it are still in use. Meanwhile, the roads outside my house develop potholes if a rabbit sneezes on them.

I'm a fan of durability because it allows you to think outside the box. If I know that my foundations are safe, I'll be more inclined to push the boundaries and innovate outside my comfort zone, because the environment can handle it. Durability needs to be a cornerstone of the application modernization conversation, as each application and environment we create for our customers is the first step of their next-generation technology and business strategy. An entire organization's goals and dreams rest on the shoulders of the technology architectures we are building, and we need to ensure that the foundation has the strength to withstand everything the future can throw at it, and then some.

When it comes to selecting the appropriate foundational technologies for modern applications, it’s crucial to consider the longevity and reliability of the technology. Will an exciting new open-source large language model be around in a year? Or that interesting new database from PayPal? How do I choose something from the ever-growing CNCF landscape diagram? This is where the strength of open-source communities and vendor backing comes into play. For instance, Kubernetes, PostgreSQL, and Java have stood the test of time, thanks to their robust feature sets, dedicated communities, and strong vendor support.

Kubernetes provides a scalable solution for managing and deploying applications, with Google's strong backing: as of July, we've made over 1,000,000 contributions to the Kubernetes project, 2.3 times more than any other contributor. PostgreSQL, meanwhile, is one of the world's most advanced open-source databases and offers a comprehensive set of features that cater to a wide range of data processing needs, with a vibrant community constantly enhancing its capabilities. And Java, a general-purpose programming language, has been a staple in the development community for decades, providing a reliable platform for building robust applications.

Choosing an established technology not only brings the benefit of a mature feature set but also the assurance of continuity. A new database or language model may be promising, but they lack the track record of these tested technologies. The risk of adopting such new technologies is their potential discontinuation or lack of support, which could jeopardize your application’s stability and longevity. 

The Voyager team’s use of aluminum foil is a great illustration of this principle. They chose a simple, reliable, and available solution to protect sensitive instruments during their mission. The choice of aluminum foil might not have been the most cutting-edge or exciting, but it was practical, reliable, and ultimately successful. Similarly, when choosing foundational technologies for your modern applications, sometimes the “boring” choice is the best one. It’s not about chasing the latest trends; it’s about choosing what works and stands the test of time.

Vendor backing is another critical consideration when choosing foundational technologies. A reliable platform provider that runs these technologies ensures a high-uptime Service Level Agreement (SLA). For example, Google Kubernetes Engine (GKE) offers a 99.95% uptime SLA, while Bigtable “just works,” and Cloud Storage doesn’t lose data thanks to a design that supports 99.999999999% annual durability.
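It's worth translating those strings of nines into something concrete. As a rough back-of-the-envelope sketch (real SLA definitions are more nuanced, with specific measurement windows and exclusions), here's what the numbers above work out to:

```python
# Back-of-the-envelope: what uptime and durability "nines" mean in practice.
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

def allowed_downtime_min(sla: float) -> float:
    """Minutes of downtime per 30-day month permitted by an uptime SLA."""
    return MINUTES_PER_MONTH * (1 - sla)

# A 99.95% uptime SLA permits roughly 21.6 minutes of downtime a month.
print(f"99.95% uptime -> {allowed_downtime_min(0.9995):.1f} min/month")

# Eleven nines of annual durability means that, statistically, you would
# expect to lose about 0.01 objects per year out of a billion stored.
expected_losses = 1e9 * (1 - 0.99999999999)
print(f"99.999999999% durability -> ~{expected_losses:.2f} objects lost "
      f"per billion per year")
```

The gap between "three and a half nines" and "eleven nines" is the difference between minutes of interruption and effectively never losing a byte, which is why durability guarantees belong in the foundation conversation.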

Boring doesn’t mean blah

That’s not to say we shouldn’t experiment with new technologies and encourage our customers to do the same. Everyone needs an innovation strategy. The concept of an ‘innovation spectrum’ is helpful here. This spectrum represents different degrees of technological innovation that companies can employ based on their specific needs and capabilities. On one end of the spectrum, there’s incremental innovation, which involves making small improvements or extensions to existing products, services, or processes. On the other end, there’s radical or disruptive innovation, which involves creating entirely new products or services that can potentially disrupt entire industries.

A classic example of balancing cutting-edge technology with “boring” or legacy technology is seen in many financial institutions. They might use AI and ML for fraud detection or predictive analytics while still relying on tried and true technologies for their core banking systems. This blend of newer and older technologies allows them to benefit from the latest advancements without jeopardizing the stability and reliability of their critical operations. However, as the banking industry is finding out, that can also come at a risk of stifling innovation and can cause customers to look elsewhere.

For developers, Google Cloud's Kubernetes-based platforms strike a similar balance between innovation and stability. For example, researchers can leverage cutting-edge GPU sharing in GKE to explore the origins of the universe, while the BBC uses Cloud Run serverless containers to keep up with the demands of a very busy news day.

Adopting best practices like platform engineering can provide a robust foundation for rolling out new technologies. Platform engineering focuses on creating a stable, scalable, and secure platform that allows for rapid deployment of applications. GitOps is another important practice that involves using Git as a single source of truth for declarative infrastructure and applications. With Git at the center of the delivery pipelines, developers can use familiar tools to make pull requests. Changes can be rolled out or rolled back easily, making the process of adopting new technologies smoother.
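The GitOps loop described above can be sketched as a tiny reconciler: treat the manifests committed to Git as the desired state, and continually converge the running system toward it. This is a minimal illustration, not a real controller's API; the dict-based "cluster" and the `reconcile` function are stand-ins for what tools like Config Sync or Argo CD do against an actual Kubernetes cluster.

```python
# Minimal sketch of a GitOps-style reconcile loop. The "cluster" here is
# just a dict; a real GitOps controller applies manifests to Kubernetes.

def reconcile(desired: dict, live: dict) -> dict:
    """Converge live state toward the Git-declared desired state."""
    changes = {}
    for name, spec in desired.items():
        if live.get(name) != spec:
            live[name] = spec          # the "kubectl apply" step
            changes[name] = spec
    for name in list(live):
        if name not in desired:        # removed from Git => delete it
            del live[name]
            changes[name] = None
    return changes

# Desired state as committed to Git (say, via a merged pull request)
desired = {"frontend": {"replicas": 3}, "api": {"replicas": 2}}
live = {"frontend": {"replicas": 1}}   # a drifted cluster

changes = reconcile(desired, live)
```

Rolling back is what makes this model so forgiving: `git revert` the offending commit, and the same loop converges the cluster back to the previous state, using the pull-request workflow developers already know.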

When it comes to modern application development, developers need to be able to trust that the foundational technologies they choose will be reliable and durable. Without this assurance, developers may be hesitant to take risks or explore creative solutions. To give them the confidence they need, an effective platform engineering strategy can provide a strong foundation for rolling out new technologies while ensuring stability and security.

Boring can be beautiful, especially if you're building for the long haul. Whatever you're developing, from roads and rocket ships to microservices and network architectures, the fundamental structure needs to withstand everything the conceivable future can throw at it. A solid, durable foundation offers developers the capabilities they need to push the boundaries, and the reliability they need so that their brainchild is still humming along, 15 billion miles away.
