Sec Tools

AV is dead … again …

Antivirus software only catches 45% of malware attacks and is “dead”, according to a senior manager at Symantec.

85.4% of statistics can be interpreted in the opposite way, and AV has been declared dead regularly since 1987.

Symantec “invented commercial antivirus software in the 1980s”?  That must come as news to the many companies, like Sophos, that I was reviewing long before Symantec bought out their first AV company.

“Dye told the Wall Street Journal that hackers increasingly use novel methods and bugs in the software of computers to perform attacks.”

There were “novel attacks” in 1986, and they got caught.  There have been novel attacks every year or so since, and they’ve been caught.  At the same time, lots of people get attacked and fail to detect it.  There’s never a horse that couldn’t be rode, and there’s never a rider that couldn’t be throwed.

“Malware has become increasingly complex in a post-Stuxnet world.”

So have computers.  Even before Stuxnet.  I think it was Grace Hopper who said that the reason it is difficult to secure complex systems is that they are complex systems.  (And she died a while back.)

CyberSec Tips: Malware – advice for the sysadmin

This is possibly a little out of line with what I’m trying to do with the series.  This advice is aimed a little higher than the home user, or small business operator with little computer experience.  Today I got these questions from someone with an advanced computer background, and solid security background, but no malware or antivirus experience.  I figured that this might apply to a number of people out there, so here was my advice:

> Question 1: What is the best way to obtain some good virus samples to
> experiment with in a clean-room environment?

Just look for anything large in your spam filters  :-)

> What I see doing is setting up a VM that is connected to an isolated
> network (with no connection to any other computer or the internet except
> for a computer running wireshark to monitor any traffic generated by the
> virus/malware).

VMs are handy when you are running a wholesale sample gathering and analysis operation, but for a small operation I tend not to trust them.  You might try running Windows under a Mac or Linux box, etc.  Even then, some of the stuff is getting pretty sneaky, and some specifically target VMs.  (I wonder how hard it would be to run Windows in a VM under iOS on ARM?)

> Also, any other particular recommendations as to how to set up the
> clean-room environment?

I’m particularly paranoid, especially if you haven’t had a lot of background in malware, so I’d tend to recommend a complete airgap, with floppies.  (You can still get USB 3 1/2″ floppy drives.)  CDs might be OK, but USB drives are just getting too complex to be sure.

> Question 2: What products are recommended for removing viruses and malware
> (i.e. is there a generic disinfector program that you recommend)?

I wouldn’t recommend a generic for disinfection.  For Windows, after the disaster of MSAV, MSE is surprisingly good, and careful–unlikely to create more problems than it solves.  I like Avast these days: even the free version gives you a lot of control, although it seems to be drifting into the “we know what’s best for you” camp.  And Sophos, of course, is solid stuff, and has been close to the top of the AV heap for over two decades.  F-Secure is good, although they may be distracted by the expansion they are doing of late.  Kaspersky is fine, though opinionated.  Eset has long had an advantage in scanning speed, but it does chew up machine cycles when operating.

Symantec/Norton, McAfee, and Trend have always had a far larger share of the market than was justified by their actual products.

As always, I recommend using multiple products for detection.

> I assume the preferred approach is to boot the suspect computer from USB
> and to run the analysis/disinfection software from the USB key (i.e. not to boot
> the infected computer until it has been disinfected).

A good plan.  Again, I might recommend CD/DVD over USB keys, but, as long as you are careful that the USB drive is clean …

> Question 3: How/when does one make the decision to wipe the hard drive and
> restore from backup rather than attempt to remove the malware?

If you have an up-to-date backup, that is always preferred when absolute security is the issue.  However, the most common malware is going to be cleanable fairly easily.  (Unless you run into some of the more nasty ransomware.)

Pushing backup, and multiple forms of backup, on all users and systems is a great idea for all kinds of problems.  I’ve got a “set and forget” backup running to a USB drive that automatically updates any changes about every fifteen minutes.  Every couple of days I also make a separate backup of all data files, to one of several USB drives, which I then copy onto one of the laptops.  For that I just use an old batch file I created, which replaces any files with newer versions.  (Since it doesn’t delete anything I haven’t changed, it also gives me recovery possibilities if I delete something by mistake, and, because I rotate the multiple drives for offsite storage, I even have possibilities of recovering old versions.)
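
For anyone who wants to build the same thing without digging up old DOS batch syntax, here is a rough Python sketch of the idea (copy a file only when the source copy is newer, and never delete anything); the source and destination paths are placeholders, not my real ones:

    # Sketch of a "copy only newer files, never delete" backup pass.
    import os
    import shutil

    SRC = r"C:\Users\me\Documents"  # data files to back up (placeholder path)
    DST = r"E:\backup\Documents"    # USB backup drive (placeholder path)

    for root, dirs, files in os.walk(SRC):
        target = os.path.join(DST, os.path.relpath(root, SRC))
        os.makedirs(target, exist_ok=True)
        for name in files:
            src_file = os.path.join(root, name)
            dst_file = os.path.join(target, name)
            # Copy only when the backup copy is missing or older; nothing
            # is ever deleted, so files removed from the source by mistake
            # remain recoverable from the backup drive.
            if (not os.path.exists(dst_file)
                    or os.path.getmtime(src_file) > os.path.getmtime(dst_file)):
                shutil.copy2(src_file, dst_file)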

> Question 4: Any recommended books or other guides to this subject matter?

Haven’t seen anything terrifically useful recently, unfortunately.  David Harley and I released “Viruses Revealed” as public domain a few years back, but it’s over ten years old.  (We released it about the time a vxer decided to upload it to http://vxheavens.com/lib/ars08.html.  He probably thought he was hurting our sales, but we figured he was doing us a favour  :-)

CyberSec Tips: Email – Spam – check your filters

Spam filters are getting pretty good these days.  If they weren’t, we’d be inundated.

But they aren’t perfect.

It’s a good idea to check what is being filtered out, every once in a while, to make sure that you are not missing messages you should be getting.  Lots of things can falsely trigger spam filters these days.

Where and how you check will depend on what you use to read your email.  And how you report that something is or isn’t spam will depend on that, too.

If you use a Web-based email system, like Gmail, Yahoo, or Outlook/Hotmail, through its Web interface, the spam folder is usually listed with your other folders, generally at the left side of the browser window.  When you are looking at that folder and select one of the messages, somewhere on the screen, probably near the top, there is a button to report that the message isn’t spam.

It’s been a couple of weeks since I did this myself, so I checked two of my Webmail accounts this morning.  Both of them had at least one message caught in the spam trap that should have been sent through.  Spam filtering is good, but it isn’t perfect.  You have to take responsibility for your own safety.  And that means checking the things you use to keep you safe.

Review of “cloud drives” – Younited – pt 3

Yesterday I received an update for the Younited client–on the Win7 machine.  The XP machine didn’t update, nor was there any option to do so.

This morning Younited won’t accept the password on the Win7 machine: it won’t log on.  Actually, it seems to be randomly forgetting parts of the password.  As with most programs, it doesn’t show the password (nor is there any option to show it); the password is represented by dots for the characters.  But I’ll have seven characters entered (with seven dots showing) and, all of a sudden, only three dots will be showing.  Or I’ll have entered ten, and suddenly there are only two.

REVIEW: “Debug It: Find, Repair, and Prevent Bugs in Your Code”, Paul Butcher

BKDEBGIT.RVW   20130122

“Debug It: Find, Repair, and Prevent Bugs in Your Code”, Paul Butcher, 2009, U$34.95/C$43.95, 978-1-93435-628-9
%A   Paul Butcher paul@paulbutcher.com
%C   2831 El Dorado Pkwy, #103-381 Frisco, TX 75033
%D   2009
%G   978-1-93435-628-9 1-93435-628-X
%I   Pragmatic Bookshelf
%O   U$34.95/C$43.95 sales@pragmaticprogrammer.com 800-699-7764
%O   http://www.amazon.com/exec/obidos/ASIN/193435628X/robsladesinterne
%O   http://www.amazon.co.uk/exec/obidos/ASIN/193435628X/robsladesinte-21
%O   http://www.amazon.ca/exec/obidos/ASIN/193435628X/robsladesin03-20
%O   Audience n- Tech 2 Writing 1 (see revfaq.htm for explanation)
%P   214 p.
%T   “Debug It: Find, Repair, and Prevent Bugs in Your Code”

The preface states that there are lots of books in the market that teach development and few that teach debugging.  (In my experience, a great many development books include debugging advice, so I’m not sure where the author’s perception comes from.)  The work is structured around a core method for debugging: reproduce, diagnose, fix, and reflect.

Part one presents the basic technique.  Chapter one repeats the description of this core method.  Chapter two encourages the reproduction of the bug.  (This can be more complex than the author lets on.  I have a netbook with some bug in the hibernation function.  Despite constant observation over a period of three and a half years, I’ve yet to find a combination of conditions that reproduces the failure, nor one that prevents it.)  Some of the suggestions given are useful, if pedestrian, while others are pretty pointless.  (Butcher does not address the rather thorny issue of using “real” data for testing.)  In terms of diagnosis, in chapter three, there is limited description of process, but lots of good tips.  The same is true of fixing, in chapter four.  (I most definitely agree with the recommendation to fix underlying causes, rather than effects.)  Reflection, the topic of chapter five, is limited to advice that the problem be considered even after you’ve fixed it.

Part two explores the larger picture.  Chapter six examines bug tracking systems, and eliciting feedback from users and support staff.  Chapter seven advises on trying to address the bugs, but concentrates on “fix early,” with little discussion of priorities or ranking systems.

Part three, entitled “Debug Fu,” turns to related and side issues.  The “Special Cases” in chapter eight seem to be fairly common: software already released, compatibility issues, and “heisenbugs” that disappear when you try to track them.  Chapter nine, on the ideal debugging environment, is about as practical as most such exercises.  “Teach Your Software to Debug Itself” in chapter ten seems confined to a few specific cases.  Chapter eleven notes some common problems in development teams and structures.

The advice in the book is good, and solid, but not surprising to anyone with experience.  Novices who have not considered debugging much will find it useful.

copyright, Robert M. Slade   2013   BKDEBGIT.RVW   20130122

The Common Vulnerability Scoring System

Introduction

This article presents the Common Vulnerability Scoring System (CVSS) Version 2.0, an open framework for scoring IT vulnerabilities. It introduces the metric groups, then describes the base metrics, the base vector, and scoring. Finally, an example is provided to show how it all works in practice. For a more in-depth look at scoring vulnerabilities, check out the ethical hacking course offered by the InfoSec Institute.

Metric groups

There are three metric groups:

I. Base (used to describe the fundamental information about the vulnerability: its exploitability and impact).
II. Temporal (takes time into account when the severity of the vulnerability is assessed; for example, the severity decreases when an official patch becomes available).
III. Environmental (takes environmental issues into account when the severity of the vulnerability is assessed; for example, the more systems affected by the vulnerability, the higher the severity).

This article is focused on base metrics. Please read A Complete Guide to the Common Vulnerability Scoring System Version 2.0 if you are interested in temporal and environmental metrics.

Base metrics

There are exploitability and impact metrics:

I. Exploitability

a) Access Vector (AV) describes how the vulnerability is exploited:
– Local (L)—exploited only locally
– Adjacent Network (A)—adjacent network access is required to exploit the vulnerability
– Network (N)—remotely exploitable

The more remote the attack, the more severe the vulnerability.

b) Access Complexity (AC) describes how complex the attack is:
– High (H)—a series of steps needed to exploit the vulnerability
– Medium (M)—neither complicated nor easily exploitable
– Low (L)—easily exploitable

The lower the access complexity, the more severe the vulnerability.

c) Authentication (Au) describes the authentication needed to exploit the vulnerability:
– Multiple (M)—the attacker needs to authenticate at least two times
– Single (S)—one-time authentication
– None (N)—no authentication

The lower the number of authentication instances, the more severe the vulnerability.

II. Impact

a) Confidentiality (C) describes the impact of the vulnerability on the confidentiality of the system:
– None (N)—no impact
– Partial (P)—data can be partially read
– Complete (C)—all data can be read

The more affected the confidentiality of the system is, the more severe the vulnerability.

b) Integrity (I) describes the impact of the vulnerability on the integrity of the system:
– None (N)—no impact
– Partial (P)—data can be partially modified
– Complete (C)—all data can be modified

The more affected the integrity of the system is, the more severe the vulnerability.

c) Availability (A) describes the impact of the vulnerability on the availability of the system:
– None (N)—no impact
– Partial (P)—interruptions in the system’s availability or reduced performance
– Complete (C)—the system is completely unavailable

The more affected the availability of the system is, the more severe the vulnerability.

Please note the abbreviated metric names and values in parentheses. They are used in the base vector description of the vulnerability (explained in the next section).

Base vector

Let’s discuss the base vector. It is presented in the following form:

AV:[L,A,N]/AC:[H,M,L]/Au:[M,S,N]/C:[N,P,C]/I:[N,P,C]/A:[N,P,C]

This is an abbreviated description of the vulnerability that brings together its base metrics and their values. The brackets show the possible values for each base metric; the evaluator chooses one value for every metric.
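
As an illustration, here is a minimal Python sketch (my own, not part of the CVSS specification) that unpacks such a vector string into its six base metrics:

    # Sketch: unpack a CVSS v2 base vector string into its six base metrics.
    def parse_vector(vector):
        vector = vector.strip("()")  # vectors are often written in parentheses
        metrics = dict(item.split(":") for item in vector.split("/"))
        if set(metrics) != {"AV", "AC", "Au", "C", "I", "A"}:
            raise ValueError("not a complete CVSS v2 base vector")
        return metrics

    print(parse_vector("AV:L/AC:H/Au:N/C:N/I:N/A:C"))
    # {'AV': 'L', 'AC': 'H', 'Au': 'N', 'C': 'N', 'I': 'N', 'A': 'C'}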

Scoring

The formulas for the base score and for the exploitability and impact subscores are given in A Complete Guide to the Common Vulnerability Scoring System Version 2.0 [1]. However, there is no need to do the calculations manually. There is a Common Vulnerability Scoring System Version 2 Calculator available. The only thing the evaluator has to do is assign metric values to metric names.
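
That said, the base equations are short enough to sketch in a few lines of Python. The weights and constants below are the ones published in the v2 specification; the function expects the dictionary produced by the parse_vector() sketch above:

    # CVSS v2 base score equations, with the weights from the specification.
    WEIGHTS = {
        "AV": {"L": 0.395, "A": 0.646, "N": 1.0},    # Access Vector
        "AC": {"H": 0.35,  "M": 0.61,  "L": 0.71},   # Access Complexity
        "Au": {"M": 0.45,  "S": 0.56,  "N": 0.704},  # Authentication
        "C":  {"N": 0.0,   "P": 0.275, "C": 0.660},  # Confidentiality impact
        "I":  {"N": 0.0,   "P": 0.275, "C": 0.660},  # Integrity impact
        "A":  {"N": 0.0,   "P": 0.275, "C": 0.660},  # Availability impact
    }

    def base_score(m):
        """Return (base, exploitability, impact), each rounded to one decimal."""
        exploitability = (20 * WEIGHTS["AV"][m["AV"]]
                             * WEIGHTS["AC"][m["AC"]]
                             * WEIGHTS["Au"][m["Au"]])
        impact = 10.41 * (1 - (1 - WEIGHTS["C"][m["C"]])
                            * (1 - WEIGHTS["I"][m["I"]])
                            * (1 - WEIGHTS["A"][m["A"]]))
        f = 0 if impact == 0 else 1.176
        base = (0.6 * impact + 0.4 * exploitability - 1.5) * f
        return round(base, 1), round(exploitability, 1), round(impact, 1)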

Severity level

The base score depends on the exploitability and impact subscores; it ranges from 0 to 10, where 10 means the highest severity. However, CVSS v2 doesn’t transform the score into a severity level. One can use, for example, the FortiGuard severity levels to obtain this information:

FortiGuard severity level    CVSS v2 score
Critical                     9 – 10
High                         7 – 8.9
Medium                       4 – 6.9
Low                          0.1 – 3.9
Info                         0
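
Encoded as a sketch (simply transcribing the bands from the table above):

    # Map a CVSS v2 base score to a FortiGuard-style severity level.
    def severity(score):
        if score >= 9.0:
            return "Critical"
        if score >= 7.0:
            return "High"
        if score >= 4.0:
            return "Medium"
        if score > 0.0:
            return "Low"
        return "Info"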

Putting the pieces together

An example vulnerability in a web application is provided to better understand how the Common Vulnerability Scoring System Version 2.0 works in practice. Please keep in mind that this framework is not limited to web application vulnerabilities.

Cross-site request forgery in the admin panel allows adding a new user and deleting an existing user, or all users.

Let’s first analyze the base metrics, together with the resulting base vector:

Access Vector (AV): Network (N)
Access Complexity (AC): Medium (M)
Authentication (Au): None (N)

Confidentiality (C): None (N)
Integrity (I): Partial (P)
Availability (A): Complete (C)

Base vector: (AV:N/AC:M/Au:N/C:N/I:P/A:C)

Explanation: The admin has to visit the attacker’s website for the vulnerability to be exploited; that’s why the access complexity is medium. The attacker’s website is somewhere on the Internet, so the access vector is network. No authentication is required to exploit the vulnerability (the admin only has to visit the attacker’s website). The attacker can delete all users, making the system unavailable to them; that’s why the impact on the system’s availability is complete. Deleting all users doesn’t delete all the data in the system, so the impact on integrity is partial. Finally, there is no impact on the confidentiality of the system, provided that the added user doesn’t have read permissions by default.

Let’s use the Common Vulnerability Scoring System Version 2 Calculator to obtain the subscores (exploitability and impact) and base score:

Exploitability subscore: 8.6
Impact subscore: 7.8
Base score: 7.8

Let’s transform the score into a severity level according to FortiGuard severity levels:

FortiGuard severity level: High
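
As a sanity check, plugging the example’s base vector into the sketches from the earlier sections reproduces the same numbers:

    metrics = parse_vector("(AV:N/AC:M/Au:N/C:N/I:P/A:C)")
    base, exploitability, impact = base_score(metrics)
    print(exploitability, impact, base)  # 8.6 7.8 7.8
    print(severity(base))                # High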

Summary

This article described the Common Vulnerability Scoring System (CVSS) Version 2.0, an open framework for scoring IT vulnerabilities. Base metrics, the base vector, and scoring were presented. One way of transforming CVSS v2 scores into severity levels (the FortiGuard severity levels) was described. Finally, a worked example showed how all these pieces fit together in practice.

Dawid Czagan is a security researcher for the InfoSec Institute and the Head of Security Consulting at Future Processing.

Fuzzing Samsung Kies

Android fuzzing is always fun: it seems that whenever we fuzz an Android app, it crashes within seconds.

Samsung Kies was no different. With the help of the talented Juan Yacubian (who built the Kies module in no time), we launched beSTORM against Kies… and saw it crash in a record 23 seconds (just over 1,100 attack combinations).

Next on the agenda: install gdb for Android and build the proper payload.

[Screenshot: Samsung Kies crash]

REVIEW: “Security and Privacy for Microsoft Office 2010 Users”, Mitch Tulloch

BKSCPRO2.RVW   20121122

“Security and Privacy for Microsoft Office 2010 Users”, Mitch Tulloch,
2012, 0735668833, U$9.99
%A   Mitch Tulloch info@mtit.com www.mtit.com
%C   1 Microsoft Way, Redmond, WA   98052-6399
%D   2012
%G   0735668833
%I   Microsoft Press
%O   U$9.99 800-MSPRESS fax: 206-936-7329 mspinput@microsoft.com
%O   http://www.amazon.com/exec/obidos/ASIN/0735668833/robsladesinterne
%O   http://www.amazon.co.uk/exec/obidos/ASIN/0735668833/robsladesinte-21
%O   http://www.amazon.ca/exec/obidos/ASIN/0735668833/robsladesin03-20
%O   Audience n- Tech 1 Writing 1 (see revfaq.htm for explanation)
%P   100 p.
%T   “Security and Privacy for Microsoft Office 2010 Users”

Reducing the complex jargon in the introduction to its simplest terms, this book is intended to allow anyone who uses the Microsoft Office 2010 suite, or the online Office 365, to effectively employ the security functions built into the software.  Chapter one purports to present the “why” of security, but does a very poor job of it.  Company policy is presented as a kind of threat to the employee, and this does nothing to ameliorate the all-too-common perception that security is there simply to make life easier for the IT department, while it makes work harder for everyone else.

Chapter two examines the first security function, called “Protected View.”  The text addresses issues of whether or not you can trust a document created by someone else, and mentions trusted locations.  (Trusted locations seem simply to be defined as a specified directory on your hard drive, and the text does not discuss whether merely moving an unknown document into this directory will magically render it trustworthy.  Also, the reader is told how to set a trusted location, but not an area for designating untrusted files.)  Supposedly “Protected View” will automatically restrict access to, and danger from, documents you receive from unknown sources.  Unfortunately, having used Microsoft Office 2010 for a couple of years, and having received, in that time, hundreds of documents via email and from Web sources, I’ve never yet seen “Protected View,” so I’m not sure how far I can trust what the author is telling me.  (In addition, Tulloch’s discussion of viruses had numerous errors: Concept came along about four years before Melissa, and some of the functions he attributes to Melissa are, in fact, from the CHRISTMA exec over a decade earlier.)

Preparation of policy is promised in chapter three, but this isn’t what most managers or security professionals would think of as policy: it is just the provision of a function for change detection or digital signatures.  It also becomes obvious, at this point, that Microsoft Office 2010 and Office 365 can have significantly different operations.  The material is quite confusing, with references to a great many programs which are not part of the two (2010 and 365) MS Office suites.

Chapter four notes the possibility of encryption with a password, but the discussion of rights is unclear, and a number of steps are missing.

An appendix lists pointers to a number of references at Microsoft’s Website.

The utility of this work is compromised by the fact that it provides instructions for functions, but doesn’t really explain how, and in what situations, those functions can assist and protect the user.  Any employee using Microsoft Office will be able to access the operations, but without understanding the concepts they won’t be able to take advantage of the protection those functions offer.

copyright, Robert M. Slade   2012     BKSCPRO2.RVW   20121122