IAAS, CLOUD, AND MANAGED SERVICES BLOG
It’s something of a disconcerting statistic - fewer than one third of American financial organizations have a cloud strategy. Other regions don’t fare much better: EMEA shows 35% of firms preparing for cloud computing, while APAC is slightly higher at 41%. Given the immense popularity of cloud computing in both the enterprise and consumer spaces, these statistics are troubling - even if they do show that more institutions are starting to realize the importance of going digital. What exactly is the root cause here? Why are firms lagging so far behind? And more importantly, what can be done about it?
Some of our customers have been impacted by the recent round of "Forex" site injections. The typical symptom is injected code that redirects users to a "forex" landing page. The code can either be injected into existing pages, or the attacking bots can delete a page's content entirely and replace it with code that accomplishes the same redirect. The majority of reported infections exploited vulnerabilities in out-of-date Joomla and WordPress cores, plugins, modules, and templates. Infections leverage publicly known vulnerabilities in WordPress, WHMCS, and Joomla servers, as well as other customized dynamic PHP/ASP/SQL web applications. Database injection via these exploits is also possible, and can act as a back door to re-inject websites after they have been cleaned. I wanted to take a few minutes to discuss what Cartika is doing to help our customers, and what you should be doing if you have been impacted.
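For customers who want a rough first pass at spotting injected code themselves, a sketch along the following lines can help. To be clear, the signatures and the web-root path are illustrative assumptions only - real infections vary widely, and neither a match nor a clean scan is a substitute for a proper audit and for patching the vulnerable software itself.

```python
# Illustrative scan for common injection signatures in a web root.
# The patterns and the path below are examples only -- real infections
# vary, and a clean scan does not prove a site is clean.
import re
from pathlib import Path

SUSPICIOUS = [
    re.compile(rb"eval\s*\(\s*base64_decode"),           # obfuscated PHP payloads
    re.compile(rb"eval\s*\(\s*gzinflate"),
    re.compile(rb"document\.location\s*=\s*['\"]http"),  # hard-coded redirects
]

def scan(webroot: str) -> None:
    for path in Path(webroot).rglob("*.php"):
        try:
            data = path.read_bytes()
        except OSError:
            continue
        for pattern in SUSPICIOUS:
            if pattern.search(data):
                print(f"suspicious pattern {pattern.pattern!r} in {path}")
                break

if __name__ == "__main__":
    scan("/var/www/html")  # adjust to your site's document root
```

Because database injections can re-infect a site after its files have been cleaned, any cleanup should also include a review of the database and a rotation of all credentials.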
Cartika is pleased to introduce New Relic analytics integration, delivered in a phased roll-out (outlined below). We are very excited about what this data will mean both for our customers and, internally, for our support staff. The benefits to our customers are simply enormous. Developers, sysadmins, and DBAs will gain invaluable insight into the health of their environments, identify problems more quickly, streamline their resource usage, and make educated decisions about capacity planning. Internally, we are equally excited to put these tools and analytics in the hands of our support staff. Their ability to help customers pinpoint issues, flag bad plugins or bad code, and provide advice on upgrade strategies, code optimization, and various other day-to-day issues increases dramatically when the data is presented in real time, in an easy-to-understand, clearly defined interface. We are giving our support staff the tools they need to quickly and efficiently provide the superior level of service and support we demand from our team.
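To give a flavour of what this looks like from the application side, here is a minimal sketch of instrumenting a Python WSGI application with the New Relic agent. The config-file path and the toy app are placeholder assumptions - consult New Relic’s documentation for your particular stack, and the roll-out notes below for the Cartika-specific details.

```python
# A minimal sketch: instrumenting a Python WSGI app with the New Relic
# agent (pip install newrelic). "newrelic.ini" is a placeholder path to
# the config file holding your license key and application name.
import newrelic.agent

newrelic.agent.initialize("newrelic.ini")

@newrelic.agent.wsgi_application()
def application(environ, start_response):
    # Once wrapped, each request through this app is timed and
    # reported to the New Relic dashboard automatically.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from an instrumented app\n"]
```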
A troubling report was published earlier this month by IT consulting firm Antithesis Group. Through a partnership with Stanford University and TSO Logic, the firm examined the state of data centers across the world. Among its findings was the revelation that at least 30% of data center servers have been idle for more than six months. They are, as the study puts it, ‘comatose.’ And there are at least ten million of them.
Earlier this year, some of the world’s leading experts on artificial intelligence met in Puerto Rico for a private conference. The purpose? To determine whether intelligent machines would be good or bad for human society. Not surprisingly, IBM’s Watson supercomputer was a central topic of discussion. First developed in 2005 by IBM Research, Watson enjoyed its first real moment in the spotlight when it defeated Jeopardy! champions Brad Rutter and Ken Jennings. From there, it experienced a meteoric rise to fame, finding its footing in a host of different fields - healthcare among them.
The Domain Name System is an essential part of the complex machinery that lets users connect to your website or application. It allows users to type a human-readable name - like cartika.com - into their browser and be connected to the server that name represents. Essentially, DNS translates domain names into IP addresses; it can be thought of as the Internet's address book. Without DNS, there would be no way for users to connect to your site unless they already knew its IP address. Learn More about Cartika AnyCast DNS
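For the programmatically inclined, that translation is easy to see in action. Here is a tiny Python illustration using only the standard library (the address printed will of course depend on whichever resolver answers):

```python
# DNS in miniature: ask the system resolver for the address behind a
# hostname. This is the lookup your browser performs before it can
# send a single request.
import socket

hostname = "cartika.com"
ip_address = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip_address}")
```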
In the never-ending battle against spam, the resulting ecosystem has generated some really interesting dynamics. Oftentimes, organizations working in synergy to address and resolve spamming issues - and, most importantly, keep legitimate email flowing to users - get caught in tug-of-war battles. Often, this is nothing more than newer players coming into a space they know little about and attempting to make their mark by flexing some muscle, when all that is actually required is a little common sense and an ability to work with each other. This is what is happening right now with SpamRats.
Cartika is pleased to announce that QuickSilk - an enterprise-level, subscription-based CMS and hosting platform designed for simplicity - has chosen us as the official host for its SaaS offering. By tapping into our years of experience as a host, we’ve set QuickSilk up with a powerful, reliable, and affordable cloud platform it can use to deliver its product to its clients. Full redundancy ensures downtime is negligible, and automatic backup through Bacula4 allows QuickSilk to recover from even the most catastrophic bugs. But enough about us - let’s talk a little bit about them.
The web is a very different place from what it used to be. Trends such as mobile usage, semantic search, and social media have brought about a fundamental change in how users both seek out and consume content. Search engine optimization has had to change out of necessity - and that evolution has caused it to intersect very closely with web design.
Last month, Anthem Inc. - one of the world’s leading health insurance companies - made a very grim announcement to shareholders and clients. It was, a representative explained, the target of a “very sophisticated external cyberattack” that allowed hackers to gain unauthorized access to its IT systems. The personal information of eighty million clients - data ranging from names and birthdays to medical IDs, Social Security numbers, street addresses, email addresses, and employment history - was compromised.
Let’s say you have a very popular website. It’s been chugging along fine for the last couple of months; your server has handled the load perfectly well. But as your site becomes ever more popular, the server starts to show the strain - too many connections swamp the available memory, and pages load slowly, or sometimes not at all. It’s time for an upgrade.
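What does “showing the strain” look like in numbers? Here is a rough sketch using the third-party psutil library - the thresholds are arbitrary examples rather than recommendations, and a real capacity decision would look at trends over time, not a single snapshot:

```python
# A rough health snapshot of the kind that tells you an upgrade is due:
# memory pressure and open TCP connections, via psutil (pip install psutil).
# Note: listing all connections may need elevated privileges on some systems.
import psutil

mem = psutil.virtual_memory()
tcp_connections = len(psutil.net_connections(kind="tcp"))

print(f"memory used: {mem.percent}%  tcp connections: {tcp_connections}")
if mem.percent > 90 or tcp_connections > 1000:  # example thresholds only
    print("the server is showing the strain - time to plan an upgrade")
```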
When we think about artificial intelligence, what usually comes to mind are none-too-friendly fictional examples like HAL and the machines from The Matrix, although there are friendlier AIs, like Data from Star Trek. Friendly or not, we’re a long way from having functional AIs of this type - but over the next few years, most of us are going to be relying on a narrower sort of AI: systems capable of organizing a limited subset of data in useful ways.
If you've been in the IT industry for a while, you'll have an almost instinctive familiarity with what the cloud is, its various modalities, deployment models, and types. Intuitively, one would think that a deep understanding would make the cloud easy to explain to less technical people, but in fact the opposite is true. It's very difficult to put yourself in the mindset of someone who lacks the conceptual framework that those of us who have been around enterprise IT for a long time have developed.
There has always been a lot of confusion around the exact meanings of the various cloud service models and their intersection with deployment strategies. That's hardly surprising, given that IaaS, PaaS, SaaS, public cloud, private cloud, hybrid cloud, and a dozen other as-a-service modalities are a complex combination of marketing speak and technical jargon. In this article, I'd like to tease out one confused strand: the relationship between Infrastructure-as-a-Service and public or private cloud deployments. I've chosen this topic because there's considerable confusion around what a private cloud is: I've heard people say that a private cloud can't involve virtualization, that it's just another name for traditional in-house deployments, that it's a form of colocation, that Google Apps is a private cloud, and so on - none of which is accurate, or at least not completely so.
Even the smallest of modern companies use networks that are both heterogeneous and dispersed. Business networks are composed of multiple services spread over many servers in diverse locations. I'm a writer, so you'd think I could make do without much of a network, but when I add up all the services I use to run my small business, I find that I rely on an extensive network of personal computers, mobile devices, backup servers, file servers, cloud storage servers, virtual private servers, SaaS applications, web hosting servers, and email services; hosted in the cloud, in my home, and on traditional hosting; and distributed all over Europe and the US.
There’s a dream of the cloud in which data flows freely around the globe, available anywhere, stored wherever is convenient, and detached from the normal concerns of information management. Technologically, companies don’t have to care about where their data is stored: it’s in the cloud and the cloud encourages users to be agnostic about which server, which data center, and even which country their data is housed in. But, legally and politically, the location of data matters a lot.
If there’s one thing that’s obvious to anyone who’s spent even a little time online, it’s that security is one of the biggest hot-button issues on the modern web. As we store more and more information online, cyber-attacks are becoming increasingly lucrative - and the stakes involved in securing our data are rising ever higher. Not surprisingly, that means cyber-criminals are getting smarter and craftier. Where before a business might have to deal with the odd DDoS or man-in-the-middle attack, now there’s a constant risk that someone will exploit even the smallest security hole. It’s a culture of not-completely-unjustified paranoia - particularly since it seems many organizations aren’t pulling their weight when it comes to protecting their data.
There’s probably no one with access to the Internet who isn’t aware that the security of Apple’s iCloud platform was called into question recently. I’m not going to discuss the appalling theft of private data that ensued, but I do want to look at a related issue: rate limiting. While we’re not entirely sure what caused the leak of celebrities’ private photos - the likely strategy was simple social engineering, research of publicly available information, and the exploitation of poor password choices - we do know that around the same time a vulnerability was discovered in iCloud that made life much easier for any potential attacker.
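To make the concept concrete, here is a minimal sketch of the kind of countermeasure at issue: limiting how often a single account can attempt to log in. It is in-memory and single-process, so treat it as an illustration of the idea rather than production code - a real service would back this with shared storage such as Redis.

```python
# Sliding-window rate limiter for login attempts: at most MAX_ATTEMPTS
# per account per WINDOW_SECONDS. The values are illustrative.
import time
from collections import defaultdict

MAX_ATTEMPTS = 5
WINDOW_SECONDS = 300

_attempts: dict[str, list[float]] = defaultdict(list)

def allow_attempt(username: str) -> bool:
    """Return True if this login attempt may proceed."""
    now = time.monotonic()
    # Keep only attempts inside the current window.
    recent = [t for t in _attempts[username] if now - t < WINDOW_SECONDS]
    if len(recent) >= MAX_ATTEMPTS:
        _attempts[username] = recent
        return False  # locked out until the window slides past
    recent.append(now)
    _attempts[username] = recent
    return True
```

A missing check of this kind on even one authentication endpoint is exactly what makes brute-force guessing of weak passwords cheap.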
The first time a user visits your site, it’s likely that their browser and operating system won’t have a cached DNS record for your domain, and it’s possible their ISP doesn’t have a result cached either. For many of your visitors, the Domain Name System will have to retrieve the record from the authoritative server for your domain. That takes time, and since DNS is such a fundamental part of how the Internet works, we want to keep that time to a minimum. There’s no point having a well-optimized site on great hosting if it takes several seconds for a browser to find out where it should be sending requests.
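You can get a feel for that cost with a few lines of standard-library Python. This only times the resolver call, and the numbers will vary with whether your operating system or ISP already has the record cached - but the difference between a cold and a warm lookup is usually easy to see:

```python
# Time the name-resolution step a first-time visitor pays before any
# HTTP request can be sent. Stdlib only; results depend on upstream caches.
import socket
import time

def time_lookup(hostname: str) -> float:
    start = time.perf_counter()
    socket.getaddrinfo(hostname, 443)
    return (time.perf_counter() - start) * 1000  # milliseconds

print(f"first lookup:  {time_lookup('cartika.com'):.1f} ms")
print(f"repeat lookup: {time_lookup('cartika.com'):.1f} ms")  # often faster once cached
```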