IAAS, CLOUD, AND MANAGED SERVICES BLOG
A troubling report was published earlier this month by IT consulting firm Anthesis Group. Through a partnership with Stanford University and TSO Logic, the firm examined the state of data centers all across the world. Among its findings was the revelation that at least 30% of data center servers have been idle in excess of six months. They are, as the study puts it, ‘comatose.’ And there are at least ten million of them.
The Domain Name System (DNS) is an essential part of the complex system that lets users connect to your website or application. It allows users to type a human-readable domain name — like cartika.com — into their browser and be connected to the server behind that name. Essentially, DNS translates domain names into IP addresses; it can be thought of as the Internet's address book. Without DNS, there would be no way for users to connect to your site unless they already knew its IP address. Learn More about Cartika AnyCast DNS
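The translation described above happens in a single call in most languages. Here's a minimal sketch in Python; "localhost" is used so the example works offline, but swapping in any real domain (cartika.com, say) performs an actual lookup through your configured resolvers.

```python
import socket

# DNS in one call: turn a human-readable name into an IP address.
# "localhost" keeps the example offline; replace it with any domain
# to perform a real lookup through the DNS resolver chain.
def resolve(hostname):
    return socket.gethostbyname(hostname)

print(resolve("localhost"))  # typically "127.0.0.1"
```

Every browser does the equivalent of this before it can send a single byte of your request.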
In the never-ending battle against spam, the resulting ecosystem has generated some really interesting dynamics. Oftentimes, organizations working in synergy to address and resolve spamming issues, and most importantly to keep legitimate email flowing to users, get caught in tug-of-war battles. Often this is nothing more than newer players coming into a space they know little about and attempting to make their mark by flexing some muscle, when all that's actually required is a little common sense and a willingness to work with each other. This is what is happening right now with SpamRats.
Cartika is pleased to announce that QuickSilk - an enterprise-level, subscription-based CMS and hosting platform designed for simplicity - has chosen us as the official host for its SaaS offering. By tapping into our years of experience as a host, we’ve set QuickSilk up with a powerful, reliable, and affordable cloud platform that it can use to deliver QuickSilk to its clients. Full redundancy ensures downtime is negligible, and automatic backup through Bacula4 allows it to recover from even the most catastrophic bugs. But enough about us - let’s talk a little bit about them.
The web is a very different place from what it used to be. Trends such as mobile usage, semantic search, and social media have brought about a fundamental change in how users both seek out and consume content. Search engine optimization has had to change out of necessity - and that evolution has caused it to intersect very closely with web design.
Let’s say you have a very popular website. It’s been chugging along fine for the last couple of months; your server has handled the load perfectly well. But as your site becomes ever more popular, the server starts to show the strain — too many connections swamp the available memory, pages load slowly, and sometimes they don’t load at all. It’s time for an upgrade.
There’s probably no one with access to the Internet who isn’t aware that the security of Apple’s iCloud platform was called into question recently. I’m not going to discuss the appalling theft of private data that ensued, but I do want to look at a related issue: rate limiting. While we’re not entirely sure of the cause of the leak of celebrities’ private photos—the likely strategy was simple social engineering, research of publicly available information, and the exploitation of poor password choices—we do know that around the same time a vulnerability was discovered in iCloud that made life much easier for any potential hackers.
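The control at issue is simple in principle: cap how many failed login attempts an account can accumulate in a given window, so brute-forcing passwords becomes impractical. Here's a minimal sketch of that idea, assuming an in-memory store; the class and method names are illustrative, not any real iCloud or vendor API.

```python
import time

# Minimal sketch of per-account login rate limiting: after
# `max_attempts` failures within `window` seconds, further attempts
# are rejected until the window expires. Illustrative only.
class RateLimiter:
    def __init__(self, max_attempts=5, window=300.0):
        self.max_attempts = max_attempts
        self.window = window
        self.failures = {}  # account -> list of failure timestamps

    def allow(self, account, now=None):
        now = time.time() if now is None else now
        # Keep only failures that are still inside the window.
        recent = [t for t in self.failures.get(account, []) if now - t < self.window]
        self.failures[account] = recent
        return len(recent) < self.max_attempts

    def record_failure(self, account, now=None):
        now = time.time() if now is None else now
        self.failures.setdefault(account, []).append(now)

limiter = RateLimiter(max_attempts=3, window=60.0)
for _ in range(3):
    limiter.record_failure("alice", 0.0)
print(limiter.allow("alice", 1.0))    # False: too many recent failures
print(limiter.allow("alice", 100.0))  # True: the window has expired
```

A production version would also persist state across servers and add back-off or lockout notifications, but even this much is enough to turn a cheap brute-force attack into an impractically slow one.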
The first time a user visits your site, it’s likely that they won’t have a DNS record for your domain stored in their browser cache, and it’s possible their ISP doesn’t have a result cached either. For many of your visitors, the Domain Name System will have to retrieve and return the DNS record from the authoritative server for your domain. That takes time, and since DNS is such a fundamental part of how the Internet works, we want to keep that time to a minimum. There’s no point having a well-optimized site on great hosting if it takes several seconds for a browser to find out where it should be sending requests.
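You can see the cost for yourself by timing a lookup. A rough sketch, assuming nothing beyond the standard library: "localhost" keeps it runnable offline (and resolves instantly), but pointing it at a real domain shows the cold-lookup delay, and a second call to the same name typically returns much faster thanks to caching.

```python
import socket
import time

# Time a name resolution in milliseconds. A first ("cold") lookup for a
# real domain may take tens to hundreds of ms; cached answers are near
# instant. "localhost" is used here so the example works offline.
def timed_lookup(hostname):
    start = time.perf_counter()
    socket.getaddrinfo(hostname, 80)
    return (time.perf_counter() - start) * 1000

print(f"lookup took {timed_lookup('localhost'):.2f} ms")
```

Those milliseconds are paid before your server sees the request at all, which is why fast, well-distributed DNS matters.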
When a popular site switches content management systems, particularly a site like CMS Critic, whose writers we can expect to be well-informed of content management issues, it’s useful to have a look at the reasons behind the change. At the very least, they serve as input for future site deployment decisions. Early in July, CMS Critic, which is owned by Mike Johnston, made the jump from WordPress to ProcessWire, an open source content management system that offers many of WordPress’s benefits. I wasn’t very familiar with ProcessWire, but I am familiar with WordPress, so I’d like to take a look at CMS Critic’s reasoning, consider whether their complaints about WordPress are entirely fair, and whether ProcessWire does, in fact, make a good WordPress alternative for the average WordPress user.
Site security is a complex issue. The online economy is huge, and hackers stand to reap considerable benefits from attacks against sites that store sensitive data or give them access to large numbers of visitors. Hackers are a motivated and intelligent group of people, albeit one with a consistent lack of concern for their fellow Internet users. In spite of the potential complexity of securing a site, attacks tend to fall into a number of clearly defined categories, and the significant majority of them can be mitigated by following a small set of best practices. That’s not to say that implementing the strategies we’re going to discuss here will render a site impervious – that’s all but impossible – but most hackers focus on low-hanging fruit, and by ensuring that a site is difficult to exploit, webmasters will discourage all but the most persistent online criminals.
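To make "clearly defined categories" concrete: SQL injection is one of the most common, and the matching best practice is equally well defined, namely parameterized queries. A minimal sketch with Python's built-in sqlite3 module; the table and data are illustrative.

```python
import sqlite3

# SQL injection is a classic attack category; parameterized queries are
# the standard defense. Table and rows below are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("alice", "alice@example.com"))

malicious = "alice' OR '1'='1"  # a typical injection attempt

# Unsafe pattern (left as a comment): string interpolation lets the
# input rewrite the query itself and match every row.
#   conn.execute(f"SELECT * FROM users WHERE name = '{malicious}'")

# Safe: the driver binds the input strictly as data, never as SQL.
rows = conn.execute("SELECT * FROM users WHERE name = ?", (malicious,)).fetchall()
print(rows)  # [] -- the injection string matches no real user
```

The same placeholder discipline applies to every database driver, and it closes an entire attack category with a one-line change per query.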
Occasionally, I wonder what might happen if the Internet just stopped working one day. It’s not a terribly pleasant thought, is it? These days, we’re so reliant on our connectivity that if some outside force were to strip it away from us, it’d likely lead to a complete societal collapse. There are upsides to this reliance, of course – particularly if you’re in the field of web development. If you’re capable of stomaching the learning cliff and the long hours you’ll likely end up working, there’s never been a better time to be a web developer. So long as you’ve got the right knowledge and skills under your belt, you’ll never want for new clients. After all, as long as the Internet exists, someone’s going to want a website built.
Later this month, the HTTPbis working group will make their last call for input into HTTP 2.0, the first major revision in a decade and a half to the protocol on which the web runs. This November, assuming all goes according to schedule, HTTP 2.0 will be submitted to the Internet Engineering Steering Group for consideration as a proposed standard, after which it’ll travel through the process for adoption as a standard. The aim of HTTP 2.0 is to make the web’s technology more suitable to the way that modern web services and sites work, with particular focus on reducing latency and improving performance. In the late 90s, when the current version of HTTP was developed, the web was a very different place. Most sites were static and served from one server. Today’s websites are dynamic, interactive, and made up of components that reside on many different servers.