IAAS, CLOUD, AND MANAGED SERVICES BLOG
When it comes to managing IT, one of the most important goals historically has been staying "in control." In years gone by, any change to a user's internal or hosted IT environment was a "closed" proposition, with most IT managers or Managed Service Providers (MSPs) fighting hard to keep constraints on every aspect of the systems they supported. There were some valid reasons for this at the beginning, but over time users demanded more flexibility while those running IT became overwhelmed with increasing change requests.
Our recently restructured IaaS cloud server platform is the result of our fifteen-year legacy of pioneering new managed application and web hosting services. During this time, we have constantly worked with clients to determine how best to meet their most demanding IT challenges. We have also remained keenly aware of how competitors have approached the market as it has evolved.
Some of our customers have been impacted by the recent round of "Forex" site injections. The typical symptom is injected code that redirects users to a "forex" landing page. The malicious content can either be injected into existing pages, or the injecting bots can delete a page's content entirely and replace it with code that accomplishes the same thing. The majority of reported infections exploited vulnerabilities in out-of-date Joomla and WordPress cores, plugins, modules, and templates. Infections leverage publicly known vulnerabilities in servers running WordPress, WHMCS, and Joomla, as well as other customized dynamic PHP/ASP/SQL web applications. Database injection via these exploits is also possible and can act as a back door to re-inject websites after they have been cleaned. I wanted to take a few minutes to discuss what Cartika is doing to help our customers, and what customers should be doing to deal with this situation if you have been impacted by it.
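One quick first step when you suspect this kind of infection is to sweep your document root for the obfuscation patterns injected code typically relies on. The sketch below is a minimal, illustrative scanner; the signature list is an assumption based on commonly seen PHP injection idioms, not a definitive indicator of this particular campaign, and a clean scan does not prove a site is uninfected.

```python
import re
from pathlib import Path

# Illustrative signatures often found in injected PHP; real infections vary,
# so treat hits as leads to inspect, not proof of compromise.
SUSPICIOUS = [
    re.compile(rb"eval\s*\(\s*base64_decode"),
    re.compile(rb"gzinflate\s*\(\s*base64_decode"),
    re.compile(rb"preg_replace\s*\(.*/e"),  # abuse of the deprecated /e modifier
]

def scan(root="."):
    """Return the PHP files under root that contain a suspicious pattern."""
    hits = []
    for path in Path(root).rglob("*.php"):
        data = path.read_bytes()
        if any(pattern.search(data) for pattern in SUSPICIOUS):
            hits.append(str(path))
    return hits

if __name__ == "__main__":
    for hit in scan():
        print(hit)
```

Remember that cleaning flagged files is not enough on its own: as noted above, database injections can re-infect a cleaned site, so the underlying CMS, plugin, and template vulnerabilities must be patched as well.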
The Domain Name System (DNS) is an essential part of the complex machinery that lets users connect to your website or application. It allows users to put a human-readable URL — like cartika.com — into their browser and be connected to the server behind that address. Essentially, DNS translates domain names into IP addresses; it can be thought of as the Internet's address book. Without DNS, there would be no way for users to connect to your site unless they already knew its IP address. Learn more about Cartika AnyCast DNS.
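You can see this "address book" lookup directly from code. The sketch below uses Python's standard library resolver to map a hostname to its IPv4 addresses, the same translation a browser performs before opening a connection; it queries `localhost` so it works without network access, but any hostname can be passed in.

```python
import socket

def resolve(hostname):
    """Return the sorted list of IPv4 addresses a hostname maps to."""
    infos = socket.getaddrinfo(hostname, 80, socket.AF_INET)
    # Each entry is (family, type, proto, canonname, sockaddr);
    # sockaddr[0] is the IP address string.
    return sorted({info[4][0] for info in infos})

print(resolve("localhost"))  # typically ['127.0.0.1']
```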
Cartika is pleased to announce that QuickSilk - an enterprise-level, subscription-based CMS and hosting platform designed for simplicity - has chosen us as the official host for its SaaS offering. By tapping into our years of experience as a host, we've set QuickSilk up with a powerful, reliable, and affordable cloud platform it can use to deliver QuickSilk to its clients. Full redundancy ensures downtime is negligible, and automatic backups through Bacula4 allow them to recover from even the most catastrophic bugs. But enough about us - let's talk a little bit about them.
The web is a very different place from what it used to be. Trends such as mobile usage, semantic search, and social media have brought about a fundamental change in how users both seek out and consume content. Search engine optimization has had to change out of necessity - and that evolution has caused it to intersect very closely with web design.
Let's say you have a very popular website. It's been chugging along fine for the last couple of months; your server has handled the load perfectly well. But as your site becomes ever more popular, the server starts to show the strain — too many connections swamp the available memory, pages load slowly, and sometimes not at all. It's time for an upgrade.
When we think about artificial intelligence, what comes to mind is usually none-too-friendly fictional examples like HAL and the machines from The Matrix, although there are some more friendly AIs, like Data from Star Trek. Friendly or not, we’re a long way from having functional AIs of this type, but over the next few years, most of us are going to be relying on a narrow sort of AI: AIs capable of organizing a limited subset of data in useful ways.
If there’s one thing that’s obvious to anyone who’s spent even a little bit of time online, it’s that security is one of the biggest hot-button issues on the modern web. As we store more and more information online, cyber-attacks are becoming increasingly lucrative - and the stakes involved in securing our data are rising ever higher. Not surprisingly, that means cyber-criminals are getting smarter and craftier. Whereas before a business might have to deal with the odd DDoS or man-in-the-middle attack, now there’s a constant risk that someone might jump in to exploit even the smallest security hole. It’s a culture of not-completely-unjustified paranoia - particularly since it seems as though many organizations aren’t pulling their weight as far as protecting their data is concerned.
When a popular site switches content management systems, particularly a site like CMS Critic, whose writers we can expect to be well informed about content management issues, it’s useful to have a look at the reasons behind the change. At the very least, those reasons serve as input for future site deployment decisions. Early in July, CMS Critic, which is owned by Mike Johnston, made the jump from WordPress to ProcessWire, an open source content management system that offers many of WordPress’s benefits. I wasn’t very familiar with ProcessWire, but I am familiar with WordPress, so I’d like to take a look at CMS Critic’s reasoning, consider whether their complaints about WordPress are entirely fair, and ask whether ProcessWire does, in fact, make a good WordPress alternative for the average WordPress user.
Site security is a complex issue. The online economy is huge, and hackers stand to reap considerable benefits from attacks against sites that store sensitive data or give them access to large numbers of visitors. Hackers are a motivated and intelligent group of people, albeit a group with a consistent lack of concern for their fellow Internet users. In spite of the potential complexity of securing a site, attacks tend to fall into a number of clearly defined categories, and the significant majority of them can be mitigated by following a small set of best practices. That’s not to say that implementing the strategies we’re going to discuss here will render a site impervious – that’s all but impossible – but most hackers focus on low-hanging fruit, and by ensuring that a site is difficult to exploit, webmasters will discourage all but the most persistent online criminals.
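To make one of those clearly defined categories concrete, consider SQL injection, a perennial piece of low-hanging fruit. The minimal sketch below (using Python's built-in sqlite3 module; the `users` table and the injection payload are hypothetical examples) shows the corresponding best practice: parameterized queries, which force attacker-supplied input to be treated strictly as data rather than as part of the query.

```python
import sqlite3

# Set up a throwaway in-memory database with one example user.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("alice", "alice@example.com"))

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Unsafe pattern: building the query with string interpolation would let the
# payload above rewrite the WHERE clause and match every row.
# Safe pattern: the ? placeholder binds the input as a literal value.
rows = conn.execute(
    "SELECT email FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the injection attempt matches no user
```

The same placeholder discipline applies in any language or database driver; it is one of the cheapest mitigations with one of the largest payoffs.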
Occasionally, I wonder what might happen if the Internet just stopped working one day. It’s not a terribly pleasant thought, is it? These days, we’re so reliant on our connectivity that if some outside force were to strip it away from us, it’d likely lead to a complete societal collapse. There are upsides to this reliance, of course – particularly if you’re in the field of web development. If you’re capable of stomaching the learning cliff and the long hours you’ll likely end up working, there’s never been a better time to be a web developer. So long as you’ve got the right knowledge and skills under your belt, you’ll never be wanting for new clients. After all, as long as the Internet exists, someone’s going to want a website built.
DNS amplification attacks are one of the most pernicious vulnerabilities in the Internet’s infrastructure and a favored tool of online criminals with an axe to grind or a need to create a distraction. They’re also a useful example of how infrastructure that grows organically over many years can cause problems because of features created in a different time. Even more striking is the fact that if the companies and others running DNS servers put their minds to it, DNS amplification attacks could be rendered impossible.
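The arithmetic behind the attack helps explain why it is so favored: a small query with a spoofed source address draws a much larger response aimed at the victim, multiplying the attacker's bandwidth. The byte sizes in the sketch below are illustrative assumptions, not measurements of any particular server.

```python
def amplification_factor(query_bytes, response_bytes):
    """Ratio of traffic reflected at the victim to traffic the attacker sends."""
    return response_bytes / query_bytes

# A roughly 60-byte DNS query can elicit a multi-kilobyte response from an
# open resolver, so each attacker byte becomes dozens of bytes at the target.
print(amplification_factor(60, 3000))  # 50.0
```

This is also why the problem is fixable in principle: open resolvers that refuse queries from outside their own networks, and networks that drop packets with spoofed source addresses, each break a link in the chain.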
GitHub is a developer’s dream: not just for managing their own code, but for discovering new and exciting scripts, frameworks, and tools to use in their work. GitHub’s popularity means that there are plenty of awesome projects among its tens of thousands of repositories, but sorting the wheat from the chaff can be difficult. In this article, I’d like to highlight six open source projects that have recently caught my interest. The functionality they provide varies, but each deserves consideration for a prominent place in a web developer’s toolbox.
Later this month, the HTTPbis working group will make their last call for input into HTTP 2.0, the first major revision in a decade and a half to the protocol on which the web runs. This November, assuming all goes according to schedule, HTTP 2.0 will be submitted to the Internet Engineering Steering Group for consideration as a proposed standard, after which it’ll travel through the process for adoption as a standard. The aim of HTTP 2.0 is to make the web’s technology more suitable to the way that modern web services and sites work, with particular focus on reducing latency and improving performance. In the late 90s, when the current version of HTTP was developed, the web was a very different place. Most sites were static and served from one server. Today’s websites are dynamic, interactive, and made up of components that reside on many different servers.