Disappearing Acts

Human history is marked by many eras in which people feared the unknown, simply because it was unknown…
You would think we would have learned by now that we must understand things in order to use and trust them…

Well, I read a small advisory about NTFS data streams.

For those of you who do not know, data streams allow users to attach named properties to a file that can store any amount of data, and can be accessed only by someone who knows the name of that stream.

When using an NTFS data stream, the original file's size and content are not affected, so in effect I can hide information from other users who do not know the names of the file's custom properties.
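As a minimal sketch (the file and stream names here are arbitrary, and the comments are for reading, not for cmd.exe), on an NTFS volume the Windows command shell can attach and read a named stream like this:

```shell
# create an ordinary file, then attach a named stream to it
echo public data > visible.txt
echo secret data > visible.txt:hidden

# the size and content of visible.txt are unchanged; the stream
# is readable only by whoever knows its name:
more < visible.txt:hidden
```

The same redirection syntax works in cmd.exe on NTFS; tools that only look at the default stream (including most scanners of the day) never see the hidden data.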

Yes, this issue is very, very old; we at SecuriTeam reported it back in 1998. So why is it that most antivirus products out there still do not scan these streams?

Why can I still bypass quota settings and hide data from other users?
While Microsoft has come a long way, from not caring about security issues to actually fixing them, it still does not touch “by design” security risks, just as the WMF affair showed. Now a very old issue is surfacing again.

So now it’s time for us to see whether Microsoft will wait for a new highly contagious worm, or whether we will see Redmond take a nice marketing step and fix this by-design issue before that happens…

It’s a Mac, It’s KDE, NO!! it’s Microsoft(r) Windows Vista(tm)

I was given the following link to see the new Windows Vista by Microsoft.

Well, I don’t know. It looks like they just did:

# cp -r /usr/src/KDE /usr/src/Windows/Vista

And that’s after the KDE people did the same to Apple’s Mac.

Now don’t get me wrong, I do not hate Microsoft; it’s just that I do not agree with their EULA, their behavior, and other issues… That’s why I stopped being their customer a few years ago.

Now, I have a question that bugs me a lot, and I would like to ask the people at Redmond: “Why are you always the last to use an already-old technology, and yet you call it new?”

Sendmail Silently-Patched Memory Leak [Deprecated]

Regarding my blog post on the memory leak in Sendmail: I was wrong.
The patch fixes a minor resource-depletion issue and does not appear to have any security consequences.
I apologize for the mistake, and would like to thank Eric Allman from the sendmail team for the correction.

Ido Kanner,

Sendmail silently fixed a memory leak in the recent multiple vulnerabilities patch.

The problem occurs when a buffer pointer is set to NULL instead of its memory being freed, leaving the allocation marked as in use even though no variable still holds its address.

This happens when the original buffer (buf0) and buf point to different addresses.

The fix was as follows:
In the file: contrib/sendmail/src/conf.c

- if (buf == NULL)
- {
- buf = buf0;
- bufsize = sizeof buf0;
- }
+ buf = buf0;
+ bufsize = sizeof buf0;

for (;;)
@@ -5281,8 +5278,8 @@
(void) sm_io_fprintf(smioerr, SM_TIME_DEFAULT,
"%s: %s\n", id, newstring);
#endif /* LOG */
- if (buf == buf0)
- buf = NULL;
+ if (buf != buf0)
+ sm_free(buf);
errno = save_errno;

This advisory can be found here: http://www.securiteam.com/unixfocus/5SP0M0UI0G.html

Thinking Different IV

What’s the connection between Microsoft, Intel and AMD?
The answer is that they are all trying to control code execution, such as the kind achieved by exploiting a buffer overflow or a format-string vulnerability.

While I do not think that this should be implemented in the OS, implementing it at the CPU level might have been a good idea.

But there is another way to prevent most buffer overflows from happening, without involving any hardware or operating system in the middle.

The most common cause of buffer-overflow-related problems is the use of a specific programming language with a specific syntax.
That is, most problems in the security world today still happen because someone was “smart” enough to use the C programming language to do something that resulted in a security risk, or just a simple bug.

Sure, this is the “standard” today, but that does not mean it is a good standard.
I have been saying for many years now that the use of C is problematic, and in return I hear many nice explanations of why it is not a good idea to stop using the language.

Sure, it is the most widely used language out there, and it has become a standard, but the language and its structure (syntax) are so bad that we see, on a daily basis, new languages that try to fix it without any real success.

Let’s look at a few problems with the C language (and its syntax):

What do you think about the following code?

if (1 == number)
  printf ("And the winner is: %s", winner);

Here we write 1 == number because if we used number == 1 and forgot one “=”, we would place a value into the variable number, and therefore we would have a bug, and maybe a security risk (off-by-x, broken limit check, etc.).
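A minimal demonstration of why the order matters (the function name is mine; the extra parentheses are there only to silence the compiler’s warning about the deliberate typo): with one “=” missing, the condition assigns instead of comparing and is always true.

```c
/* With the typo "number = 1" the condition assigns 1 to number and
 * the whole expression evaluates to 1, so the branch is ALWAYS taken,
 * no matter what was passed in. Writing "1 == number" makes the same
 * typo ("1 = number") a compile-time error instead of a silent bug. */
int buggy_check(int number)
{
    if ((number = 1))   /* typo: assignment, not comparison */
        return 1;
    return 0;
}
```

The constant-first style costs nothing and turns a whole class of silent logic bugs into compile errors.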

Here is another common code in C:

  char dest [10];
  char src [12] = "Hello world";  /* 11 characters + NUL */
  strcpy (dest, src);             /* writes 12 bytes into a 10-byte buffer */

And we have a buffer overflow on our hands!

But these two problems are very easy to solve (for expert developers).
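For completeness, one common fix is a sketch like the following (`bounded_copy` is my name for it, not a standard function): give every copy an explicit destination size, for example via `snprintf`, which always NUL-terminates and never writes past the buffer.

```c
#include <stdio.h>

/* Copy src into dest, never writing more than destsize bytes and
 * always NUL-terminating. Truncates silently if src is too long. */
void bounded_copy(char *dest, size_t destsize, const char *src)
{
    snprintf(dest, destsize, "%s", src);
}
```

Silent truncation is its own trade-off, of course, but it is a bug, not a remote-code-execution vector.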

So how about some really problematic code, which even expert developers may not notice, and which most of you never thought was possible:

memcpy ((void *) someFunction, letsExecuteOurBufferContent, size);

Do you know what this code does? Other than using memcpy in a wrong manner, it just opened a back door on the machine that ran it: the contents of an attacker-controlled buffer are copied straight over a function’s code. Yup, all I need to do in C to create a security risk is to use two variables and one function!
Yes, I know it is possible to do this in other languages as well, but in C this type of code is so common that many experts will look at it and still not see the problem in front of their eyes, while in other languages it might light a big red bulb even for the average developer, even if the vulnerability itself is not noticed.

The problems with C are so bad that even when it is used to build an interpreter for another language (and most of the interpreters out there are written in C/C++), it may create bugs in the byte code or compiled result of what the user has created.

Just take a look at Perl as one of many examples.

Or what about issues with the Java Virtual Machine? We can even write Java code that will cause our VM to execute arbitrary code, just because the VM was written in C.

And still we haven’t even scratched the surface of the problem.

Many times there is code you need to write in C that looks so bad that even AT&T- or Intel-syntax assembler suddenly looks clearer and easier to use.

Many times you find yourself writing so much code just because you used C/C++, and when you start writing too much code, you start having bugs (the urban legend claims that in every line of code there is at least one bug waiting to surface!).

And many other times “ANSI C” is not portable at all between compilers, so we can experience anything from data swapped between parameters (that’s a security risk, BTW!), to code that fails to compile (the best outcome we can expect from such a problem), to a DoS condition or other misbehavior of the program.

And if the above isn’t bad enough, many C/C++ programs out there ship with debug information inside: there are bugs the programmer could not locate without a debugger, and to use a debugger you need debug information, but then you find out that things act a bit differently in the version without the debug information, so you ship the version with it.

So with all of the above problems, and with almost all of the programs and OSes out there written in C, how can you sleep well at night?!

So let’s stay away from C and find a better language. TY.