Saturday, December 7, 2013

Why anti-virus software doesn't make you safer (and may even make things worse)

Even before I stopped using Windows on a regular basis I'd stopped running antivirus software. I'd still run a scan now and then, but not all the accompanying processes that weave their way into every part of your system and siphon away huge amounts of memory and processing power.

Maybe you think I'm crazy, or at least careless. But the fact is, as a software engineer, I realize how easy it would be to write malicious software that can't be detected by an Internet security suite. It would be difficult for even an experienced programmer to find the malicious element in well-crafted code. Software algorithms can be incredibly complex and, contrary to popular belief, computers aren't smarter than humans. In fact, they're incredibly stupid, and they only do exactly what you tell them to do. All antivirus software can really do is scan for known threats; beyond those, it can't detect anything but the most basic attempts at malicious code.
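To see why signature scanning is so easy to evade, consider what it boils down to: searching a file's bytes for patterns from a database of known threats. Here's a toy sketch (the "signatures" are made-up placeholders, not real malware patterns):

```python
# Toy illustration of signature-based scanning: search data for byte
# patterns from a database of known threats. The signatures below are
# invented placeholders, not real malware patterns.
KNOWN_SIGNATURES = {
    "ExampleThreat.A": b"\xde\xad\xbe\xef\x13\x37",
    "ExampleThreat.B": b"EVIL_PAYLOAD_MARKER",
}

def scan_bytes(data: bytes) -> list:
    """Return the names of any known signatures found in the data."""
    return [name for name, sig in KNOWN_SIGNATURES.items() if sig in data]

# A file containing a known pattern gets flagged...
infected = b"some harmless bytes" + b"EVIL_PAYLOAD_MARKER" + b"more bytes"
print(scan_bytes(infected))

# ...but a trivially mutated copy slips right past the scanner.
mutated = infected.replace(b"EVIL", b"EV1L")
print(scan_bytes(mutated))
```

The point: changing a single byte defeats a naive signature, which is why scanners reliably catch only threats that have already been identified and catalogued.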

Admittedly, there is some utility in software that watches for attempts by programs to access certain things on your system, or to upload data to the Internet. But everything comes at a cost, and keeping these processes always running in the background slows your computer down and drains battery life. The fact is that most operating systems come with built-in security mechanisms, such as mandatory access controls, that are much more effective and don't require running an additional process. Even Windows is quite secure for a well-informed user, without additional security software.

On that last point, the real problem is that users aren't sufficiently aware of what kinds of threats are out there, and how to guard against them. As the saying goes, "There is no patch for human ignorance." And therein lies the bane of every IS professional's existence. People think that they can install an Internet security suite, and it will save them, in spite of their ignorance.

Nowadays, operating systems have a lot of these safeguards already built into them, and much of what third-party programs do is redundant. And yet, with all these redundant protections, successful exploitations abound. This is because you can't write software that sufficiently compensates for the carelessness of users without making their systems virtually unusable.

The idea that Linux is more secure because fewer people use it is a myth. Besides its technical superiority, a major reason it's harder to exploit is that its users tend to be better informed. Even with strict file permissions, SELinux in enforcing mode, and a well-configured firewall, I could easily write a script to steal vital information from a Linux system, if the user is foolish enough to download and run it without knowing what's in it.

Don't think you're safe simply because you use a Mac or Linux!

Here's a list of simple things you can do to protect yourself from most Internet attacks, without installing additional software.

1. Don't run as the administrative user. This mostly applies to Windows users, where I believe this is still the default. If you're running as an administrative user, any process you run can basically do whatever it wants to your computer, and access all your personal information. Microsoft has implemented a labyrinth of complicated access controls to try to compensate for this, a problem that could be more easily solved by running as a less privileged user.
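On Linux and Mac, a program can check in one line whether it's running with administrative rights. A minimal sketch (this uses the POSIX os.geteuid call, so it won't work on Windows, where you'd check for an elevated token instead):

```python
import os

def running_as_admin() -> bool:
    """On POSIX systems, an effective UID of 0 means root/administrator."""
    return os.geteuid() == 0

# A well-behaved tool might refuse to run with more privilege than it needs.
if running_as_admin():
    print("Warning: running as root; any bug here can damage the whole system.")
else:
    print("Running as an ordinary user; damage is limited to this account.")
```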

2. Don't download and run things from the Internet that you're not sure you can trust. Even if you run them without admin privileges, this only protects your system. It doesn't protect your personal files.
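One concrete habit that helps here: when a download's publisher lists a SHA-256 checksum, verify it before you run the file. A small sketch using Python's standard hashlib (the demo hashes a throwaway file; in practice you'd hash the installer you downloaded and compare against the published digest):

```python
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks
    so even large installers don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Demonstration with a throwaway file standing in for a downloaded installer.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
    demo_path = f.name

digest = sha256_of(demo_path)
print(digest)
os.unlink(demo_path)  # clean up the demo file
```

If the digest you compute doesn't match the one the publisher lists, the file was corrupted or tampered with: don't run it.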

3. Be very careful about clicking on links or opening attachments in emails. Or even responding to them. This is a whole topic by itself, and probably the biggest source of successful exploitations. Spend some time getting informed on the tricks attackers use. Even if you recognize the sender's email address, that doesn't prove who it's from.

4. Use good passwords for online accounts. What is a good password? Well, first of all, your name is not a good password! It amazes me how many people think it's crazy not to use anti-virus software, and yet use passwords that are ridiculously easy to crack. Email accounts are a major target, and attackers use automated programs that rummage about the web trying passwords until one works, all while they kick back playing computer games and eating popcorn.

Most people think there's nothing all that valuable in their emails anyway. However, you'd be surprised how much information you can get from emails. Personal information is one of an attacker's most useful tools. And because most of the people you know probably use passwords as obvious as yours, your emails give an attacker the names, and probable passwords, of your family and friends. Once inside your account, they can send emails to everyone in your contact list in an attempt to get more information, or to deliver malicious attachments.

The biggest reason people don't use good passwords is that they're difficult to remember. I don't even try to remember most of my passwords. There's no need to. You can store them in a password manager, like KeePass, which will auto-generate complex passwords for you. Most browsers can also store your passwords in encrypted form and sync them between your computers and smartphone.
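Generating a strong password doesn't even require a dedicated program; Python's standard secrets module (designed for cryptographic randomness, unlike the plain random module) will do it in a few lines:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password from letters, digits, and punctuation,
    drawn from a cryptographically secure random source."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

pw = generate_password()
print(pw)  # different every run
```

With roughly 94 possible characters per position, even a 12-character password from this alphabet has far more entropy than any name or dictionary word.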

5. Lock down your smartphone. This is also a whole topic by itself, so I won't go into detail. This is the computer people tend to be the least careful with, and yet it probably poses the greatest vulnerability, and it's the most likely to be lost or stolen. Most people don't even use a password or PIN code to lock their phone, and yet always use one on their desktop. Besides following the tips already mentioned, you should also encrypt your phone's storage.

6. And that leads to the next point: Use encryption. This is one of the least used security measures, and yet one of the most important. At the very least, encrypt all your mobile computers: phones, tablets, laptops. Passwords do nothing once an attacker has the physical system in their possession; they can read straight from the hard drive without even booting the operating system.
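Disk encryption schemes work roughly like this: your passphrase isn't the key itself; a key is derived from it by a deliberately slow function, and without that key the raw bytes on the drive are gibberish. A sketch of the key-derivation step using Python's standard hashlib (the salt handling and iteration count here are illustrative, not a recommendation for real systems):

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive a 32-byte key from a passphrase with PBKDF2-HMAC-SHA256.
    The high iteration count deliberately makes each guess expensive."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)  # stored alongside the ciphertext; it need not be secret
key = derive_key("correct horse battery staple", salt)
print(key.hex())

# The same passphrase and salt always yield the same key...
assert derive_key("correct horse battery staple", salt) == key
# ...while a different passphrase yields a completely different one.
assert derive_key("password123", salt) != key
```

This is why a stolen encrypted laptop is a nuisance rather than a catastrophe: reading the drive directly yields only ciphertext, and brute-forcing the passphrase is made intentionally slow.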

7. Be careful about what information you post on social networking sites.  The more an attacker knows about you, the easier it is to steal your information, and even your identity. Not to mention threats from other kinds of predators.

And remember, "there is no patch for human ignorance." Stay informed about the latest tactics attackers are using. Maybe subscribe to an Internet security newsletter. And read those emails you get from your company's IT department.

Tuesday, May 21, 2013

Ubuntu Switching to Qt

So you may have already heard the news that Ubuntu will be switching to Qt as the primary graphical toolkit for Unity. This will be for the next edition of Unity, which is called, appropriately, "Unity Next." The intent is to have a unified OS across platforms, and match the desktop to Ubuntu's mobile version, which is already built on Qt. This change is planned to be in place by the next LTS version, 14.04.

Not much detail has been given about this change, and what it means for the numerous GTK applications that currently form the core of the Unity desktop. Will the default applications all be Qt-based, perhaps requiring the installation of a number of KDE libraries? Will GTK applications still be supported?

Overall, I think this is an exciting development, though the timeline seems pretty ambitious. I've always thought Qt was superior to GTK, with a far more mature collection of development tools available for it. Browsing the preliminary developer's documentation for Unity Next, it's evident that the availability of a powerful, ready-made IDE has also had a significant influence on Canonical's choice of the Qt framework.

The official IDE for Unity Next, it appears, will be Qt Creator. What's interesting about this is that Qt Creator is designed primarily for coding in C++, which suggests that C++ will become the primary app development language for Ubuntu. Previously, and quite recently, Canonical had indicated that Python would be the language of choice for app development in Ubuntu. However, while there are two very good Python Qt binding APIs available (PyQt and PySide), Qt Creator only directly supports C++. So does this signal another major switch?

Qt Creator also supports application development with QML, which was the basis for the now defunct Unity 2D desktop. My guess is that Canonical's previous experience with QML has had a lot to do with their decision to make the switch. According to the Unity Next specification page, the new UI "was developed largely" using QML.

Of course, there are many other tools available for Qt development. Qt Designer provides a drag-and-drop interface for creating the UI, but doesn't include support for code generation or editing, and therefore isn't a true IDE. Nokia also provides a Qt plugin for the Eclipse IDE that functions much like Qt Creator, though I believe it too only supports coding in C++. However, Qt Designer can be used to create the UI, and the actual coding can then be done separately in Python, or any other language with Qt bindings.

Still the question remains: Has Canonical abandoned its relatively recent plan to use Python as its primary app development language? Also, how well will Python, or other languages, be supported on Ubuntu's various desktop and mobile platforms?

It hasn't been unusual, in recent years, for Canonical to announce significant decisions regarding the future of Ubuntu, only to change course soon after. The official sanction of Python, and the plan to adopt the Wayland display server are two well-known examples. The latter has now been discarded, in favor of Canonical's in-house display server, Mir.

This, of course, can be unsettling for potential users and developers alike. Understandably, there may be quite a bit of hesitation to commit to something, only to have it pulled out from under you soon after. The question remains: What will become of the applications that have already been created using the current development defaults, Python and GTK? Obviously, they will still work fine in other Linux desktop environments, but what about Ubuntu? How will Canonical discriminate between apps for mobile and apps for desktop?

All that's been said, so far, is that Qt/QML will replace the Nux toolkit, currently used for Unity's 3D desktop components. Built-in support for OpenGL in Qt5 has made using Nux unnecessary. This means that Ubuntu can use a single unified toolkit for both the Unity interface and user applications. Supposedly, then, we will begin to see Qt5 implemented in Ubuntu sometime during the next year, preceding the release of 14.04 LTS.

This seems to signal the demise of Compiz Fusion on Ubuntu, which I'm not sad to see go. While it's added a lot of fun factor to the GTK-based Linux desktop, I've always found it to be a resource-hungry, unstable piece of software. Unfortunately, however, a lot of Unity's configurability will probably go with it, a pattern that has become prevalent on the Linux desktop over the last couple of years.

I stated in a previous post, from about two years ago, that I thought Ubuntu ought to move to Qt, and more than anything needed better development tools. So, as you might imagine, I'm glad to see things finally taking shape in that direction.

However, at the same time I'm hesitant to commit to something that's still so much in flux. Minus the time invested in learning and configuring the system, 12.04 LTS is a pretty safe choice, but I have little interest in developing Ubuntu apps until things settle down, and the long-term roadmap becomes clearer.

Currently, I'm running Fedora 18 on my work laptop, and have been quite happy with it. Since I develop software for the RHEL 6 platform, Fedora seemed a practical choice. I have Ubuntu 12.04 installed on my personal laptop, though I don't use it much these days.

I'm excited to see things take shape, though it will be interesting to see how long it takes to really stabilize. It will be particularly interesting, as 14.04 will be both a revolutionary change and an LTS (long-term support) release for Ubuntu. The desktop has also gained significant ground with computer vendors, with high-end laptops currently being offered by Dell and System76.

Canonical has announced its intention to make available its own SDK, and has already released a preview. The goal of making apps usable on all Ubuntu platforms raises the question: what will happen to the apps that aren't compliant with this goal? How will current popular Linux applications, not developed with Ubuntu SDK, function in Ubuntu Next?

While the future of Ubuntu promises some exciting changes, there's still a lot of mystery in the details, and those details could lead to some major headaches for users. Regardless, I'm glad to see Linux personal computing moving forward with such innovative developments. A truly open mobile platform that can double as a desktop OS sounds pretty awesome. Standard support for native language coding is also sure to attract serious coders.