Saturday, June 26, 2004

OT: SCANDAL: Victoria's Secret ...REVEALED?

CollegeHumor.com : New Funny Pictures, Funny Movies, and Funny Hotlinks Daily!

GOOGLE: Google Holiday Logos

Google Holiday Logos

GOOGLE: Google Technology

Google Technology

GOOGLE: Google and Dilbert Doodle

Google and Dilbert Doodle

GOOGLE: Google starts a new net craze

Google starts a new net craze - [Sunday Herald]

OT: Atom and Link Typing

ONLamp.com: Atom and Link Typing [Jun. 24, 2004]

GOOGLE: Google has changed the way we research, but what are we missing?


LINUX: de Tocqueville in DoS attack


M$: X-Box 2 Specs Leaked?

News: X-Box 2 Specs Leaked?

VIRUS: Net Infection Turns Web Sites Into Virus Carriers

Mac News: News: Net Infection Turns Web Sites Into Virus Carriers

MAC: iPod Success Creates Market for Trendy Add-Ons

Mac News: Digital Hub: iPod Success Creates Market for Trendy Add-Ons

GOOGLE: More Than 90 Companies Have Had IPOs So Far in 2004


MAC: Apple's 64-bit PowerBook G5 coming, but when?

MacDailyNews - News - Welcome Home

MAC: 3GHz PowerPC G5 Eludes IBM

3GHz PowerPC G5 Eludes IBM

LINUX: Sun fuels open source Java debate

vnunet.com - Sun fuels open source Java debate

SPAM: No end in sight to anti-spam war

Taipei Times - archives

SPAM: Common anti-spam standard ready soon

vnunet.com - Common anti-spam standard ready soon

SPAM: A Spec to Spike Spam?

A Spec to Spike Spam?

SPAM: Apache Spam Fight Hits New Level

Apache Spam Fight Hits New Level

SEC: Anti-Spyware Bill Moves To House Floor

Techweb > News > Internet privacy laws > Anti-Spyware Bill Moves To House Floor > June 25, 2004

SEC: US Congress against spyware

US Congress against spyware

SEC: Spyware: The Next Spam?

Technology News: Security: Spyware: The Next Spam?

MAC: Apple India chief quits

Apple India chief quits - The Times of India


The Citizen - news, entertainment, jobs, homes and cars

SEC: FBI antiterror computer system delayed

CNN.com - Report: FBI antiterror computer system delayed - Jun 26, 2004

VIRUS: Evolution of Computer Viruses V

Kansas City infoZine - Evolution of Computer Viruses V - USA

SEC: Ethical Hacking Is No Oxymoron

Boston.com / Business / Technology / Ethical Hacking Is No Oxymoron

PROG: Briefly on Software

monkeyblah.com--Briefly on Software

Friday, June 25 2004 at 21:25 Hackerdom, Philosophy

Today's world is filled with computers, and computers need software. A computer without software is rather like an engine without a car to go with it; yes, you can run the engine at full speed all day, but it's not going to bring you any closer to wherever you want to go. Simply put, without software, a computer is ultimately useless. With computer usage having gone mainstream, we have a vast number of people and corporations out there who stake their future on making software; and it appears that (as I suppose it does in other fields as well) these can be divided into two groups with distinct philosophical outlooks on software and its purpose.

I will illustrate these two, as I usually do, by using two groups with diametrically opposed philosophies. The first example of the day I'm sure you've all heard of: the Microsoft Corporation. Microsoft occupies a large portion of today's software and hardware market, and the overwhelming majority of home computers run its software — typically, at the very least, one of the many Windows operating systems it has under its belt.

Now, as anyone with some awareness of computer society might be able to tell you, Microsoft is a much-maligned company. Throughout the years it has been accused, often accurately, of dealing in shady and unfair practices, many of which are difficult to understand for someone who lacks an engineering education. This was briefly brought to public attention a few years ago with the famous antitrust case, which is still being argued — it will be a long while before we see the end of that, I'm sure — but the majority of the issues people have with Microsoft are too arcane for the average Joe to fully grasp.

It might suffice to describe Microsoft's practices as highly self-preservative, as there is one common theme that runs through most of them: attempts to make it very hard for any potential opponent to gain a foothold in their market. An oft-employed tactic is using — some would say abusing, and I may be bound to agree — their position as a de facto monopoly to break compatibility with other systems. It may elude the common user that Windows is far from the only way to work with a computer — after all, it is all most of these people are ever exposed to — but there exists a plethora of operating systems, hardware setups and software suites out there, many of which were around long before Microsoft.

In order to preserve some sanity among these many different systems, standards were established, a common practice not exclusive to the computer industry. These standards were mainly intended to allow for interoperability between systems — for instance, it would be disastrous if text files written on a Macintosh computer were unreadable on an x86 PC — and generally, to preserve compatibility between the myriad ways of manipulating data on a computer. This is generally regarded as a good thing, for obvious reasons. However, for a corporation with little to gain from interoperability (already dominating its market) and everything to lose, standards are a disaster. The reason for this is related to market-share flow.

Suppose for a moment that the computer market consists only of Corporation A and Corporation B, each holding a 50% slice of the market cake. Some arbitrary body, independent of both, sets standard C for, say, rich text documents (that is, documents with more than just plain text: fonts of different sizes and styles, embedded images, et cetera). Now, A and B, knowing that C is better than what they currently offer, both rush to implement it. After a couple of months, both release their own products for dealing with these C files. It just so happens that Corporation A's product is slightly more feature-rich than Corporation B's — so naturally, A gets more customers, and a larger slice of the market cake. B, in response, improves its product again, and a technological arms race ensues, with the average level of software sophistication steadily rising; quite a good thing.

Now suppose that Corporation B, despite its best efforts, lags considerably behind A and fails to keep up with the pace. Soon, Corporation A will dominate the market — not surprising, given that its product is superior — suppose a 90% slice of the cake. The market flow was in its favor, thanks to a simply better product. But at this point, something happens. Corporation A now has market dominance, and its products are the most widely used. Suppose now that A, fearing that Corporation B is on the verge of a breakthrough that would tip the scales back in B's favor, improves on standard C — let's call the improved format C-XP — but does not release information on exactly how it was improved. Within months, the new C-XP format is widespread. However, as you might have noticed, it is not interoperable; since A did not divulge the specifics of the format, no one else can implement support for it. Corporation B releases its breakthrough soon afterwards, but notices, much to its chagrin, that no one switches back to B, despite its product being superior. What has happened?

After A gained its foothold, it started implementing its own ways of doing things, which were by definition non-standard, simply because it did not divulge any details. By making sure this new format was the widespread one — either by bundling it, making it an unchangeable part of its software, or simply encouraging it, each an easy task considering its near-monopoly — A ensured that anyone considering switching back to B (because of B's attractive new features) would be put off by the cost of switching to a system where none of their old files could be used (since, of course, C-XP is incompatible with C). This is called artificial demand: demand for A's products caused not by those products being superior, but by A locking customers in if they want to retain compatibility with A's earlier products (whose position was established by 'legitimate' means).

So as we can see, by sticking to proprietary ways of doing things, A has pretty much ensured that it stays in its monopolistic position. One interesting thing worth pointing out is that if A truly had a superior product, it would be better off releasing its specifications publicly. That way, other vendors would be encouraged to implement the now-open standard, and the flow FROM those vendors to A's product would increase (given that A's product is better). Indeed, by not following open standards and implementing its own instead, A can relax its product quality, because no one can move anywhere anyway — an effective customer lock has been established. If the company were not in a monopoly position, this would be pretty much impossible; the method rarely attracts new customers, unless they are also dazzled by the quality of A's products.
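The market-share story above can be sketched as a toy simulation. All the numbers here are invented for illustration; this is a cartoon of the dynamics, not an economic model:

```python
# Toy model of the A-vs-B story: each round, customers drift toward the
# better product, but the incumbent's switching cost (lost compatibility
# with C-XP files) dampens any flow away from A.

def simulate(rounds, quality_a, quality_b, switching_cost):
    share_a = 0.5  # A and B start with equal slices of the cake
    for _ in range(rounds):
        # How attractive A looks to B's customers, and vice versa.
        pull_to_a = quality_a - quality_b
        # Leaving A also means abandoning one's C-XP files.
        pull_to_b = quality_b - quality_a - switching_cost
        flow = (0.1 * max(pull_to_a, 0.0) * (1 - share_a)
                - 0.1 * max(pull_to_b, 0.0) * share_a)
        share_a = min(1.0, max(0.0, share_a + flow))
    return share_a

# Open standard: B's better product steadily wins customers back from A.
open_market = simulate(50, quality_a=0.6, quality_b=0.8, switching_cost=0.0)
# Proprietary C-XP: the same quality gap moves nobody; A keeps its share.
locked_market = simulate(50, quality_a=0.6, quality_b=0.8, switching_cost=0.5)
```

With an open standard, A's share decays toward zero as customers defect to the better product; once the switching cost exceeds the quality gap, A's share never moves at all, which is exactly the artificial demand described above.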

Time to snap back to reality. Corporation A, as you might have figured, represents the Microsoft Corporation, who are notorious for exactly this kind of business practice; violating existing standards and enforcing their own instead. It's indeed easy to see how this is very favorable to Microsoft themselves; product lock-in ensures plenty of profit, as long as they keep their software at a fairly acceptable level. Whether this results in a situation profitable to the general public, and the state of computer science as a whole, however, is something else.

Historically it's been easy for Microsoft to employ tactics like this, as the alternatives have rarely been viable; Unix-like systems have been unable to match Microsoft outside academic use. Recently however, with Linux becoming more and more friendly to the average user and more and more powerful, this has become increasingly difficult. Combined with an outcry from the engineering community, it's put Microsoft in a bit of a pinch.

Onto our second example, then: the Open Source community. Unlike Microsoft, the OS crowd (that's Open Source, not Operating System) is, unsurprisingly, not a corporation but a loosely defined group of people. There are several companies out there that fully support the Open Source concept and employ it regularly as a business practice, making a good profit. The very concept of Open Source might be confusing to the uninitiated, so here's a brief description. Typically, when a company writes a piece of software, it wants to keep the source code secret, much as an engine designer would want to keep the blueprints to his engine secret and just sell the engines. This is purely financial, of course, and probably a sane business move. However, there are people who do not agree that this method — closed source — is the only way to go. Some have no profit interest in the product; others find their profit elsewhere in the producer-consumer chain. Others yet have other motives, but they all share one idea: source code should be open, available to anyone and everyone.

The idea of Open Source has some interesting implications: anyone can take an existing design and improve on it to create a much better product than before, increasing the technological sophistication in the world, much as one could with the open standards mentioned in our hypothetical market situation above. Now, you might expect that this would lead to more monopoly-type situations as seen above, and in some unfortunate cases it does. As a whole, however, the Open Source community encourages source to be open and available, and progress is being made in many fields independent of corporations. In fact, there's one particular thing in place explicitly to combat such situations: the GNU General Public License, or GPL for short. It is a license for source code, drawn up by the Free Software Foundation, that explicitly disallows this sort of behaviour by requiring people who derive new works from GPL source code to license their code under the GPL as well. Put simply, once you go GPL, you don't go back.

Now, as previously mentioned, corporations do make money off Open Source. This might seem baffling from a traditional point of view: surely a company that reveals its innermost secrets to the world cannot be successful? As a matter of fact, it can. The idea is not to make money off the programs themselves, but off the surrounding business: hardware setups, upgrades, technical support deals, and education/certification. By releasing the program and its code freely, the company has in fact found an invaluable resource: peer review and improvement. While the company itself may pay people to work on the project, it has also acquired a vast corps of people who work on the program simply because they want a better program, and by extension this benefits the company that employs the program as part of its business strategy. Some companies even build systems partly from proprietary code, but on a vast, common ground of Open Source programs, with much success.

I think that in this last paragraph, we have caught a glimpse of the fundamental philosophical difference I mentioned in the opening. People work on these projects in their spare time, perhaps for fame and recognition, but more likely because they want to make the program better. If a particular program doesn't offer everything you need, you can just improve it so that it does — if you can write good code, that is. Your changes, should you choose to give them back to the community (perhaps as a way of thanking it for providing the program to you in the first place), can then benefit others in the same situation. Through this cycle of improvements, the quality and sophistication of freely available programs is rising at a quick pace, and nothing is kept secret. Standards are followed, and interoperability is ever-present. So what's the difference here?

The main difference is that no one is trying to make money off of this. At least, not as a primary goal. The main goal of all this, it seems, is simply to make better programs. Looking back at corporations such as Microsoft, we see another view: their main objective is not to improve software, but to make money. Indeed, it seems that a large part of the reason they are in the computer business is that it is highly lucrative. This I believe to be the fundamental difference between large-scale software corporations and the Open Source collectives: one strives primarily for profit, with software design as a means to achieve it, while the other strives for better software, with profit as a means to support this evolution (certainly, development is not cheap, and there's nothing inherently wrong with profit). Both lead to better software, and both generate profit; that is not to be denied. It is easy to see, though, where the emphasis lies in each group.

One might ask oneself why we are here, and what corporations are good for. One view states that the prime objective of corporations is and should be money generation, and indeed this view seems to be the driving force behind many young entrepreneurs. The vision of a future as one of the world's richest men is certainly appealing. Then there's the alternative view that corporations ultimately exist to serve the interests of mankind, to further their respective fields and make a better future for everyone, and that profit is merely a way to sustain this activity.

If it's not already obvious, I might state my personal opinion on this (as if it didn't already saturate this entire article!). The purpose of corporations, and all manner of communities, indeed of our very existence, must inexorably be to further society and make progress. It's a bit sad to see the blatant egoism and chronocentrism of many of today's businesses; that they offer a service merely for the gain of lucre, and improve only reluctantly not to fall behind. Corporations surely have a place in our society — it is irrational to expect to get something for nothing — but I believe that ultimately corporations like Microsoft hinder the progress of the field more than they further it, in the long run.

LINUX: What's New, KDE 3.3 Preview

School's out. Let the fun begin

What's New, KDE 3.3 Preview

Wonderful. Now that I've gotten that bad Scooby-Doo reference out of the way... let's continue (if you don't get it, consider yourself lucky). Well, I did it for KDE 3.2 so I figured I'd do it again :-] Maybe this time the world won't be pissed at me because of that little IE thing }:-]
So anyways, KDE 3.3 is kinda close to being released, and being the eager beaver that I am I had to go get it. I was so impressed after I got it, I decided to share. Again.

For the faint of bandwidth, I've once again decided not to add any pics directly in here, but the links (if they exist) to corresponding images have been added. And dude, get broadband :p
Perceived Improvements

Those geniuses have managed to speed up the start-up time even more. Well, it seems that way at least. Maybe it's because I'm using this special account made just for testing out the new CVS, but seeing that I did that last time as well, I think there are definitely some improvements over 3.2's start-up time.

Ok, let's run through the apps and see what we've got.

KDM looks better. No, it's not because it's gotten a GDM-like makeover. Nor is it because MDM (from KDE-Look.org) has been adopted as the new DM. It's simply because we have usable user icons now. Yup, you heard correctly, boys and girls. I said user icons. Courtesy of some kind soul who saw the need and had the talent, KDM users now have a variety of faces to choose from in $KDEDIR/share/apps/kdm/pics/users. As an added bonus there's also a simple way to choose those user icons, but we'll get to that when we chat about kcontrol.
Speak of the devil! The KDE Control Center looks exactly as it did in the 3.2 release on the outside; in actuality, however, it has changed a tad. And for the better, methinks.
* The Background and Window Decorations modules got a bit of TLC.
* The Background module got a face lift, while some of the Window Decorations now support the Advanced options kwin gained in 3.2.
* Security & Privacy -> KDE Wallet looks like it got some cosmetic touch ups. I wouldn't know really since I avoid kwallet LIKE THE PLAGUE. Damn that thing is annoying.
Note to developers: add the option to not ask for a password EVERY SINGLE TIME it needs to access/store a password.
Note to self: Make a wish.
*+[TAB] Switching
Ok, so this may be stretching it a bit and getting into the vein of ridiculousity; but I thought it was damn cool! Both Alt+[TAB] and Ctrl+[TAB] switching look cooler now. And cooler is better in my book. When I figure out how to take a screen shot of this it shall be done. Two minutes and one ksnapshot opening later... "Nevermind! I got it!"
More from that thingy in the Control Center that gave Konqueror mouse gestures! I think. Is it an app? Is it a plugin? Who knows?! I'm just glad that it's there. There really isn't any need for Firefox now, is there }:-] Heeeeere, foxy, foxy, foxy, foxy. I've got a present for you. A "retirement" present...
Konqueror now correctly handles application/xhtml+xml files. Weeeee! It's been fixed! Good news for us web developers (and I use the "us" loosely).
OMFG! This thing is awesome! I could go on and on about it. I could post pics about it. But I'll just let you sit and wait in suspense. Until the release.
Amarok is cool not only because of how well it does its job as a music player but also because of the good documentation it comes with. I actually figured out why I wasn't getting gstreamer output to work from reading the help docs. One gst-register-0.8 later, amarokin' baby ;) If you haven't already, check it out. It's definitely improved in both looks (slightly) and functionality (IMO greatly) since the 3.2 release.

Well that's all I have to say about that. Time for the newness.

Not much to report back here, but what I find I will faithfully reveal.

* There is a new Theme Manager module which I'm not totally sure how to use yet, but it seems to store the settings for your user's overall theme properties. That is it isn't a way to create themes (which is what I was hoping for).
* Breaking news today as the Desktop -> Panels module in the KDE Control Center has made a minor addition to the family. We'd like to welcome little Konqueror Profiles to the Panel Menu family. And in other news...
* Hey look guys! Redundancy! Isn't it fun?
Um. No.
As per the redundancy crack, I'm referring to the Security & Privacy -> Password & User Account and System Administration -> User Account modules. To my untrained eye they seem remarkably similar. I'm still wondering if that's a bug or a feature. Oh well. When in doubt file a bug report }:-]
* Looooooooads more Web Shortcuts added. At least there's a lot more there than the last time I looked since I only ever use gg: You'll find the whole she-bang in Internet & Networking -> Web Browser -> Web Shortcuts.
* New Stuff's also been added to KDE Components -> Component Chooser, the most notable (and only methinks) change being Instant Messenger. Wonder what it does... SINCE THAT'S THE ONLY POSSIBLE OPTION! Oh well, time will tell I guess.
As reported earlier Konqueror has had new profiles added.

Bookmarks has a new friend! Bookmark Tabs as Folder...

And I can find absolutely nothing else to say about that. Seems a tad peppier though.

Oh! Oh! Oh! I just thought of one! You can now click on the empty area on either side of the tabs to open up a new tab. It's a pretty neat feature and fits in perfectly with KDE's feature richness (or what the jealous guys over at GNOME like to call bloat). But I'll tell you this, I'll take bloat over that spatial Nautilus any day.

Double Oh, Oh, Oh! I just figured out how to do find-as-you-type. Just hit the ['] or [/] key to enable it. Wish that were documented somewhere; I had to try to remember what Firefox used to get it to work. But hey! It works!

On a more serious and less "ranty" note, I'd like to thank the guy who puts together the KDE CVS Digest; that's where I found out about that feature (I think his name's Dirk, but chances are I'm horribly, horribly wrong). Neato!

Bugs. Bugs. Bugs.

Ok, now it's time to stop singing praises and start some hardcore complaining (known to those in the biz as bug reporting :-]). Remarkably, this alpha release seems more stable than I remember them usually being. Usually I can only keep Konqueror open long enough to submit a bug report before it crashes; it hasn't crashed yet. And usually Quanta crashes every other word I type, forcing me to use my backup (older) version. Neither has happened very frequently this time. Dare I say this is beta-quality work. To be fair (and honest), Konqueror hasn't really been put through its rounds, since the computer on which I installed 3.3 CVS isn't connected to the net — still waiting on my wireless router from a Newegg One Day Sale :) — so I haven't noticed many actual bugs (as opposed to application crashes) either. But the applications I use on a regular basis seem remarkably well-behaved: bugs are present, but apps aren't crashing. Stability is way better than any alpha I've ever used before, and it only looks to get better, which is always a good thing in my book.

While Konqueror has managed to stay crash-free, Quanta and Amarok have not fared as well :-(. Kwin has also suffered the occasional inexplicable death. You never fully appreciate the term "Window Manager" until it's crashed and you realize just how much it does. artsd has on occasion pulled a kwin too and just died on me, killing my Amarok-induced high, but it's been for the best: it sort of helped me realize why I couldn't get Amarok to pipe sound through gstreamer. I still haven't gotten that to work, but now I think I'm going in the direction of a fix. See, crashes can be a good thing.

Well that's all she wrote folks. Um. He. That's all he wrote. Tune in next week when you'll improve the hit count of this page even more!

P.S. Please forgive the absence of screenshots. I meant to get 'em up but work caught up with me. :-(

M$: XP Service Pack 2 to skip older Windows, for now

WinInsider | XP Service Pack 2 to skip older Windows, for now

OT: JDNC Released as Open Source Project (LinuxWorld)

Desktop Java: JDNC Released as Open Source Project (LinuxWorld)

OT: Mac OS X 10.4 (Tiger) Screenshots?

Mac Rumors: Mac OS X 10.4 (Tiger) Screenshots?

Saturday June 26, 2004 04:13 AM EST
Posted by arn

Mac Rumors

Note: This is a Page 2 News Item
With WWDC just days away, the first Mac OS X 10.4 (Tiger) information and screenshots appear to have been leaked. According to unconfirmed sources, Apple will reportedly provide developers with a Mac OS X 10.4 Preview copy at WWDC on Monday. The screenshots reportedly come from this upcoming developer preview.

Of note, the screenshots come from a previously unconfirmed source, and as a result may spark the usual debates over legitimacy -- though on casual inspection they do appear real (Tiger Read Me PDF).

Overall, Mac OS X 10.4 is said not to hold any dramatic changes from Mac OS X 10.3.

The preview is labeled "Version 10.4 Pre-release" (About Box) and build number 8A162 (Build).

A new version of Safari (v2.0) is bundled with the release and offers at least two new features. These include support for RSS news feeds as well as a new "Private Browsing" mode. Private Browsing allows users to browse without keeping a history of pages viewed.

System Preferences (Screenshot) has been tweaked yet again, with the addition of an iTunes-like search field in the top right (Search 1) which highlights relevant control panels in real time. As the search is narrowed (Search 2), so is the set of highlighted panels.

In addition, Apple has added new security features to the firewall, most significantly a Stealth Mode (Firewall Options), which "ensures that any uninvited traffic receives no response -- not even an acknowledgement that your computer exists."

Perhaps the most dramatic change, however, is the inclusion of a new Expose feature called Dashboard (Screenshot). Dashboard appears to be a Gadget/Widget-based utility that gives users quick access (invoked by a user-specified function key) to frequently used tools/applications. The tools available in the Tiger build include Address Book, Calculator, Calendar, iTunes, Stickies and World Clock. The tools provided, however, are heavily themed in un-Mac OS X-like styles. It's assumed that developers will be able to provide additional "Gadgets".

Confirmation or invalidation of these images should come at WWDC next week.

Thanks to Gary Niger and Ron Delsner of GNAA for providing the information in this article.

PROG: The Pragmatic Programmers Interview [Jun. 24, 2004]

The Pragmatic Programmers Interview by chromatic -- The Pragmatic Programmers, Andy Hunt and Dave Thomas, recently launched their own line of books on pragmatic software development. Since O'Reilly helps to distribute their books, we convinced them to do an interview about self-publishing, the state of the software industry, and how to become better developers.

LINUX: The Linux Killer

Wired 12.07: The Linux Killer

SPAM: Can Competitors Work Together Against Spam?

Can Competitors Work Together Against Spam? - Jun 24, 2004 - CIO Opinion - CIO

Ever since the Federal Trade Commission made clear its conviction that the best answer to spam is one that would come not from the government, but from technology vendors, critics of that school of thought have wondered what that solution might be. Now we know.

This week, four large ISPs—America Online, Yahoo, Microsoft and Earthlink—said they would put aside their long tradition of fighting tooth and nail and support one another’s different versions of anti-spam technology. That technology, in all cases, is sender authentication technology, designed to make sure that e-mail messages really do come from the person whose name is on them. Because most spam, and most phishing, uses a phony return address, a system that could spot and delete mail from bogus addresses could eliminate a great deal of spam.

Sender authentication technology is a great idea, and while few technologists question its promise, some people doubt the promises of long-time competitors who claim that they are hooking up. According to a recent report in The New York Times, Yahoo, whose approach to sender ID is called Domain Keys, agreed to give “limited support” to the different “Sender Policy Framework” approach taken by America Online and Earthlink. And, by the way, Microsoft agreed to support that SPF approach only last month, and has yet to declare its support of Domain Keys. Does that sound like the kind of commitment required for blissful cohabitation?
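The mechanics behind the SPF flavor of sender authentication are simple enough to sketch: the domain owner publishes a DNS TXT record listing which hosts may send its mail, and the receiving server checks the connecting IP against that list. Here is a deliberately minimal sketch in Python (real SPF has many more mechanisms, such as a, mx, include and redirect, and involves actual DNS lookups; the record and addresses below are documentation examples, not real infrastructure):

```python
# Minimal sketch of the SPF idea, NOT a full implementation of the spec:
# check a connecting IP against the ip4: mechanisms in a published record.
import ipaddress

def check_spf(record, sender_ip):
    """Return 'pass', 'fail', 'softfail' or 'neutral' for a simplified record."""
    ip = ipaddress.ip_address(sender_ip)
    for term in record.split()[1:]:           # skip the "v=spf1" version tag
        if term.startswith("ip4:"):
            if ip in ipaddress.ip_network(term[4:], strict=False):
                return "pass"                 # IP is an authorized sender
        elif term == "-all":
            return "fail"                     # hard fail: nothing matched
        elif term == "~all":
            return "softfail"                 # soft fail: suspicious, not rejected
    return "neutral"

# Hypothetical record: example.com declares 192.0.2.0/24 as its mail servers.
record = "v=spf1 ip4:192.0.2.0/24 -all"
check_spf(record, "192.0.2.25")   # a legitimate server
check_spf(record, "203.0.113.9")  # a spoofed source
```

A receiving mail server would fetch the TXT record for the domain in the message's return path and run a check like this before accepting the message; spam sent with a forged return address then fails the check at the door.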

In this case, as in many cases of what technology vendors call “innovation,” the technology may be the easy part. Collaboration could be the application killer.

Can four enemies find a way to work happily together? Will sender ID technology be the spam slayer that we have all been waiting for?

Tell us what you think will work, and enjoy this online discussion, because Sound Off is going away for an extended vacation. Have a great summer.

Sound Off is a weekly column about current IT-related issues. Web Editorial Director Art Jahnke (ajahnke@cio.com) always welcomes feedback.

Most recent responses ...

Spam has been a long-running debate ever since email became mainstream. I was shocked to hear that the United States was moving in the direction of legalizing spam when the rest of the world was considering making it illegal. My view on spam is that it is unwanted email (just as you wouldn't want junk mail in your postal mailbox, you don't want junk email in your inbox). The difference between spam and postal mail is that postal mail costs money, which becomes cost-prohibitive for the sender.

I am surprised that email servers today do not utilize a simple verification system or even sender ID. Simple solutions like this would resolve many issues with unwanted email and would force senders to send from a valid email system. I don't believe they will cure all the problems, though, because spammers will just obtain multiple sender IDs.

The largest problem is free email services like Yahoo. They're a great place for spammers to "blast out" their email. Email used to be the responsibility of the corporation people worked for: if you sent offensive email, you could lose your job. But with Yahoo, there is no consequence associated with an offensive email. This attitude should change (the same goes for all those ISPs that allow hacking, port scanning and everything else).

OT: Human Firewall gets new owner

Human Firewall gets new owner | CNET News.com

OT: The Death of Style


By Ed Hardy | Editor-in-Chief
Jun 25, 2004

In late 2002, HP released the iPAQ h1910, a Pocket PC with a ground-breaking design. Experts fell over themselves heaping accolades on this model, and over the next year HP released several more handhelds based on the original.

From a handheld standpoint, that year was the best HP has ever had. In the first quarter of 2003, HP had 16 percent of the worldwide handheld market, while by the fourth quarter it had over 25 percent. Much of this dramatic increase can be credited to the popularity of handhelds based on the design that debuted with the h1900 series.

A few weeks ago, the designs for the 2004 iPAQs leaked out. Not a single one of these showed any sign of the h1900 design. To understand why this happened, you have to have some background.
A Trip Back in Time

In 2002, designers at the Taiwanese company Compal came up with the design for a cutting-edge Pocket PC. Compal designs and manufactures electronic devices for other companies; it never releases anything under its own name. Therefore, it began looking for a company interested in putting out the handheld it had created.

Rumor has it Compal first went to Dell with this design, but was turned down. So it kept looking.

The next candidate was HP. This company was already well along in the process of creating what would become the iPAQ h2215, but HP knew this wouldn't debut before the middle of 2003. In contrast, Compal's design could come out in time for the holiday shopping season of 2002. HP decided to license the design as a stop-gap measure, until the handheld it really wanted, the h2215, could come out. That's how Compal's design became the iPAQ h1910.

No one was more surprised than HP by the success of the h1910. HP's pet design, the h2200 series, did OK, but never received as much praise as the Compal design.

This is why HP has never been truly committed to the h1900 form factor. Compal managed to create the first Pocket PC that truly deserved to be described as "sexy," but HP itself hadn't developed the design. Instead, it had been put together by an outside company. It isn't at all unusual for companies to prefer designs created internally, rather than ones developed by outsiders, even when the in-house designs don't do as well.
Moving into the Future

According to rumor, HP will release no less than seven new models before the end of this year. It is clear that HP's engineers have designed all of these, as not one of them bears the slightest hint of the h1900.

Those of us who have owned an h1910 or a model based on its design are quick to bemoan this fact. I have both an iPAQ h1940 and an Axim X30. The X30 is superior to the iPAQ in almost every way: the processor is much faster, the Axim has built-in Wi-Fi, and the latest version of Windows Mobile allows me to easily switch the screen between portrait and landscape modes. Despite this, every now and then I find myself turning to the h1940. The only reason I can come up with is the iPAQ's truly elegant design; the Axim is boxy and not all that attractive.

It is only human nature to reject things other people have created in favor of ones we ourselves have come up with. That's why I understand why HP's engineers haven't based the upcoming crop of iPAQs on the h1910. But I, and scads of other people, beg them to overcome this. The designs they have come up with are good, but the handhelds based on the h1900 model are some of the best ever created, and HP needs more of them.

I challenge HP's engineers to not let the h1900 form factor die. Instead they must find ways to use this outstanding design in a new generation of iPAQs. I know this won't be easy, but I believe it is necessary if HP wants to keep growing its worldwide market share.
Looks Matter

There are times when I think that convincing Pocket PC makers that looks are important is a lost cause. Market research firms have pointed out that a majority of these handhelds are bought by large companies, not individuals. That means that the purchasing decisions are made by IT managers, not the man on the street, and IT managers value lots of features, not esthetics.

But this doesn't mean that looks don't matter. The challenge is to create handhelds that offer plenty of features, plus outstanding looks. PC Week recently named the iPAQ h4150 the best PDA available. This is one of the Pocket PCs based on the h1900, and I'm convinced it won because it offered an excellent combination of appearance and function.

IT managers might not put a premium on appearance, but individuals do. Creating the handheld with the most features will get you a certain segment of the market, but an uninspiring design can cost sales to individuals. Many shoppers standing in an electronics store trying to decide which handheld to buy are willing to forgo some features if one of the handhelds is significantly better looking.

Companies have to appeal to both groups, consumers AND the enterprise, in order for a handheld to be truly successful.

The best form factor ever created for a Pocket PC is the one developed for the h1900. HP would be short-sighted to give it up. This company needs to develop more handhelds using this basic design if it wants to keep growing its market share.

LINUX: Paying lip service to open source

NewsForge | Paying lip service to open source

Paying lip service to open source
Date 2004.06.26 7:29
Author roblimo
Topic Linux

Open source has become hip, trendy, dope, and cool. "Based on Open Source Technology" is used by more than one proprietary software company as a marketing boast. Even Microsoft, everybody's favorite symbol of software proprietarism, now boasts about releasing software under an open source license. Obviously, the phrase "Open Source" is now considered a plus when trying to sell software. Will this lead to more open source contributions by companies trying to associate themselves with this "movement" or will it lead to the death of open source as we know it?

I've followed the Open Source Initiative's license-discuss email list for years. Over and over, software developers show up on this list and ask questions that boil down to, "How can I reap the benefits of open source development and still make money selling my software?"

Some companies go the MySQL/QT route and use dual licensing.

Others are scared of allowing variants of their software to proliferate. They believe that would cut into or even kill potential sales. So they keep looking for ways to weasel out of being open source while still getting free development help from all those millions of open source programmers out there who are eager to help them improve their products in exchange for a pat or two on their fuzzy little heads.

Failing that, just jumping up and down and saying, "We're open source! Open source, yesiree!" seems to be a hot software sales tactic these days. Apparently paying lip service to open source means you're good people, worth buying from, at least in some marketers' minds.

Some software products with "Open" in their names are about as proprietary as proprietary can be, though, which tends to confuse potential customers.

I have a better handle than most on this sort of thing, and I'm confused once in a while myself. Trying to sort real open source from the fakes has got to be hard on people who are learning about open source for the first time.

One thing I don't see often in marketing material generated by companies claiming some sort of vague kinship to open source is a list of contributions they've made to various projects. More and more, I find myself using this as a benchmark for whether or not a company is truly "committed to open source."

Some companies -- IBM, HP, Novell, and Sun are leading examples -- put their money where their corporate mouths are. Sure, these companies sell plenty of proprietary software products, and some of their proprietary software has open source underpinnings, but their contributions to the world's pool of open source software have been huge.

OSDL (Open Source Development Labs; where Linus Torvalds works) membership certainly shows support for Linux. Financial or other support for the Open Source Initiative or Free Software Foundation would be another indication that a company is not just using free or open source software, but is a true, contributing member of the community.

I'm not sure what should be done in the future to keep "Open Source" from becoming a marketing buzzphrase instead of reality. But one thing we can do right now is thank (and patronize) companies that really do support open source and free software. Sometimes an ounce of encouragement does more good than a ton of anger.

In that spirit, if you know of a company that actively contributes to open source or free software but hasn't gotten the recognition it deserves for its good works, please post its name here. This might be a small gesture, but it's more gratitude than most positive corporate actions ever get, and I'm sure it will be appreciated.

1. "boasts about releasing software under an open source license" - http://www.microsoft-watch.com/article2/0,4248,1561861,00.asp?kc=MWRSS02129TX1K0000535
2. "Open Source Initiative" - http://opensource.org/
3. "license-discuss email list" - http://www.mail-archive.com/license-discuss@opensource.org/maillist.html
4. "OSDL" - http://www.osdl.org/
5. "membership" - http://www.osdl.org/about_osdl/members/
6. "Free Software Foundation" - http://patron.fsf.org/current-patrons.html

OT: IP Versus IP

IP Versus IP: "By Susan Kuchinskas

SANTA CLARA, CALIF. -- Intellectual property issues are ubiquitous on the Internet. When intellectual property meets Internet protocol, experts say, there are more questions than answers.

Peer-to-peer networks, in which computing devices interact with each other and exchange information independent of a central server, are the most widespread example of how computing is moving to the 'edge' of the network. And P2P is also the poster child for copyright infringement, due to its widespread use in swapping music and video files.

But as all sorts of IP spreads across the Net, it causes confusion about the law -- and lawsuits to figure it out. The entertainment industry's answer, the Inducing Infringement of Copyrights Act of 2004 (ICCA), introduced in the U.S. Senate on Tuesday, could stifle innovation and victimize companies as well as individuals, experts claimed.

Attorneys discussed the issues during Supernova, a conference about how decentralization -- when everything and everyone becomes networked -- is changing business and society.

ICCA is an attempt by the entertainment industry to plug holes in the Digital Millennium Copyright Act, which was written before the rise of P2P, said Sarah Deutsch, associate general counsel for Verizon Communications. When the DMCA took effect, it required ISPs to remove material that infringed copyright from servers. 'But as P2P developed, they became unhappy with the deal,' she said.

Verizon fought the Recording Industry Association of America's demands to provide the names and addresses of subscribers who were suspected of illegal file sharing. The case is going to the U.S. Supreme Court. Meanwhile, Deutsch said, the RIAA began filing around 500 John Doe subpoenas a month to identify subscribers who were infringing copyright.

ICCA is the RIAA's next step, but it goes too far, Deutsch said. 'The purpose of the bill was supposedly to go after P2P, but it's an extra blunt instrument.' The law makes companies that build a technology that might induce people to perform illegal activities liable for those activities. She said it could apply to Apple for offering the iPod MP3 player. 'You have a device that can hold 10,000 songs,' she said, 'and they know you're not going to spend $10,000 on music, so they could bring a charge of inducing. Or Verizon could be charged because they provide broadband, which can be used for illegal downloading. They can go after any product or service they think might lead to copyright infringement down the line.'

Wendy Seltzer, an attorney for the Electronic Frontier Foundation, said Congress' approach to the problems of IP protection on the Internet has been to keep adding another layer of people who can be liable for violating copyright or another layer of protections for copyrighted works without a lot of thought to the unintended consequences.

Under ICCA, she said, 'A new class of people who have been designing technologies suddenly find themselves facing millions of dollars of court costs because someone claims they are inducing infringement by providing the tools for it.'

Meanwhile, she said, Hollywood wants to put out high-definition digital content without the expense of including protections, so it lobbied the FCC to add digital broadcast content protection, or 'broadcast flags,' to digital television signals. Broadcasters can embed signals or 'flags' in programming that prevent redistribution. Television and other device manufacturers must incorporate technology to read the flags.

'So the FCC plans to make HDTV more restrictive than the current television signal,' she said. The law 'lets Hollywood design all of our consumer electronics products.' Seltzer said the broadcast flag requirement keeps entrepreneurs and hobbyists from devising new kinds of personal digital video recorders, and would keep consumers from enjoying options such as digitizing a TV show on the computer. The law takes effect in July 2005. Seltzer's advice was to get your HDTV PVR before then."

LINUX: Scanning for viruses with Knoppix

O'Reilly Network: Scanning for viruses with Knoppix [Jun. 25, 2004]: "Recently, I have had a few machines suffer from weird behavior, and while the machines run virus scanners, some of the users don't have it set to automatically download new definitions. I wanted to make sure that no viruses were hiding in the background or trying to evade detection. This is where Knoppix comes in.

First, with the Knoppix disc, the OS that might possibly be infected is completely powered down, so anything that might have been running in memory is gone. Second, I'm booting into a completely different OS, so I don't have to worry about the infection somehow running accidentally under Linux. Third, Knoppix and the virus scanner in it are free, so I can burn many copies of it and scan multiple machines at once.

So, how to scan them? Knoppix does not include the virus scanner as part of its CD by default, but it is an option in the live software installer. So, I run the live software installer from the Knoppix menu, and install f-prot. Once f-prot is installed, a new icon appears on the desktop for your newly installed programs. I run the front-end to f-prot and check the option to download the latest definitions.

Once the definitions are updated, clicking another option will let me choose drives that Knoppix has detected for f-prot to scan. This process does take some time, but hey, Knoppix has web browsers and tons of games to help me pass the time while the scan is finishing. Once it's done, I get a nice long report of each file it scanned and which ones are infected with a virus, then I can decide to go through and delete those manually, or move them somewhere safe, or whatever I want to do. You could also run f-prot from the command line and tell it to attempt to repair or delete the infection itself.

Since Knoppix can share directories over the network with samba, you could also have other virus scanners on known clean machines scan the share if you were really paranoid.

One handy thing about using Knoppix for this, is that you can also go to that relative's/friend's computer that doesn't have any virus protection and seems to always get infected with the latest viruses (you know the one), and you can safely clean the system up.

Kyle Rankin is a systems administrator for The Green Sheet, Inc. and has been using Linux in one form or another since early 1998."

SEC: cat /dev/DiBona/brain: What About E-mail Security?

cat /dev/DiBona/brain: What About E-mail Security?: "Posted on Tuesday, June 22, 2004 by Chris DiBona

Let's start with TLS while we wait for SPF.

Ever used driftnet? I go to a lot of conferences, and one thing I find myself doing while speaking is checking how engaged the audience is by sniffing the 802.11b traffic in the room. Whether this is a measure of how absorbing I am or, to put it delicately, how self-absorbed the conference attendee is, is an open question. That said, it always is surprising to me how little encrypted traffic is flowing on the very open Internet.

Does anyone use Telnet anymore? I hope not. We have better tools now; SSH and SCP are the way most sane people hop around systems on which they might have accounts. In addition, the days of unreliable and patent-encumbered SSH are gone, and we now have the luxury of using OpenSSH for shelling around.

For Web browsing, no one in his or her right mind would transmit important information over an insecure link. Secure sites for bank, shopping and other applications are the norm, with poorly secured sites being exceptions.

One mode of communicating over the Internet, however, is done shockingly, overwhelmingly in the clear. I'm talking about e-mail. It doesn't have to be this way, though. With all the excitement about SPF and other mail authentication methods, it struck me that people haven't done even the base work to secure their communications--myself included. If one can eavesdrop on communications, one can spoof those communications. In my mind, authentication schemes are going to be of dubious value without an attendant amount of effort spent securing the line as well.

Hence this article. TLS is a great way to do the basics of securing your e-mail against this very basic kind of snooping. You can use the same certificates you use to secure your Web site. Many people find using unsigned certificates works okay too, although it offers no guarantees for security or reliability regarding the exchange of mail.

TLS is an acronym for Transport Layer Security, and it simply encrypts your mail server to mail server traffic with other TLS-enabled sites. I recently implemented this for a client site and thought it really was in my interest to implement it for my personal e-mail, as well. So I decided to implement it on DiBona.com.

I made a quick trip to FreeSSL.com to pick up a certificate. Thirty-nine dollars and an automated phone call later, I had it. I highly recommend FreeSSL. Clear instructions on how to create the certificate on any competently set-up Linux box, straightforward processes, and a smart phone system make for a great way to get a solid certificate without any worries. FreeSSL has terrific phone support as well, in the event you make a mistake or need a little bit more hand-holding, as I did for a chained certificate some months ago.

I ran this article by Uriah Welcome and Marc Merlin. Marc reminded me that you don't have to get a signed certificate, as most mail systems are configured not to care so much about the chain of trust. If you think about it, when your chain of trust ends at Network Solutions, it's not so trustworthy, but let's not go there. Uriah wanted me to make sure that people don't forget to protect the certificate files properly: don't chmod 666 them, and so on.

Setting up TLS with your mail server can be difficult, depending on your setup. An exim installation is too simple for words. Once you have generated the key, you simply need to add the following three lines to the top of your exim.conf file:

tls_certificate = /pathtocert/dibona.crt
tls_privatekey = /pathtokey/dibona.key
tls_advertise_hosts = *

Please note that I used the exim-exiscan rpm.

For Sendmail users, here is an oldish Linux Journal article on the topic; I haven't installed it yet for Sendmail, so I can't vouch. Same with this postfix install. I expect users reading this install have better links than these, so check the comments for more on those servers.

A cool tool to use to test if you have set up TLS correctly has been provided by the Kernel.org people. To see what a goodish configuration looks like, see here.

Note the 250 Starttls line. That means I'm ostensibly ready to trade encrypted e-mail with other hosts. Life isn't perfect yet--I need to implement SPF--but I feel much more secure in my e-mail.
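The `250 STARTTLS` line in the EHLO response is exactly what advertises TLS support to other mail servers. As an illustration (the server reply below is made up), here is how one might check an EHLO reply for the STARTTLS extension:

```python
def advertises_starttls(ehlo_reply: str) -> bool:
    """Return True if an SMTP EHLO reply advertises the STARTTLS extension."""
    for line in ehlo_reply.splitlines():
        # Extension lines look like "250-STARTTLS" or "250 STARTTLS" (last line).
        if line[:4] in ("250-", "250 ") and line[4:].strip().upper() == "STARTTLS":
            return True
    return False

# A hypothetical EHLO reply from a TLS-enabled server:
reply = "250-mail.example.com\n250-SIZE 10240000\n250-STARTTLS\n250 HELP"
print(advertises_starttls(reply))  # True
```

Python's standard smtplib does the equivalent check for you: after `ehlo()`, `SMTP.has_extn('starttls')` reports whether the server offered STARTTLS.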

Chris DiBona is the Co-Founder of Damage Studios, a San Francisco-based game company working on the next generation massively multiplayer on-line game, Rekonstruction. He was formerly an editor for Slashdot.org and was the co-editor of Open Sources: Voices From the Open Source Software Revolution."

HOW: HOW-TO Tuesday: War Kayaking

HOW-TO Tuesday: War Kayaking - Features - Engadget - features.engadget.com

OT: What Would (Steve) Jobs Do? We have a winner!

What Would (Steve) Jobs Do? We have a winner! - Engadget - www.engadget.com

Posted Jun 24, 2004, 6:24 PM ET by Peter Rojas
Related entries: Announcements

First off, thanks to everyone who entered our “What Would (Steve) Jobs Do?” contest, where we asked readers to send in their best and most interesting guesses for what big product Apple might announce at next week’s Worldwide Developers Conference. We received literally hundreds of entries, and we were blown away by the detail many of you went into.

As expected, a lot of the guesses were either for a wireless iPod, a video iPod (or one with both video and wireless), or some sort of wireless dock for the iPod. Several people guessed that it’d be a G5 PowerBook, or an Apple tablet Mac, while a few guessed that it’d be nothing, which all things considered, is a fair guess.

It took a while, but our panel of judges, senior editor Eric Lin, columnist Phillip Torrone, editor-in-chief Peter Rojas, and Unofficial Apple Weblog editor Sean Bonner pored over each and every entry, and after much debate, we finally settled on a winner.

We tried to judge entries on a combination of creativity and likelihood, so it didn’t just have to be something Apple could conceivably do, we also wanted it to have that little extra spark of cleverness that makes people love them.

Anyway, enough blather, you wanna know who won, right? We ended up picking Phillip Burgess’ concept for a combination infrared remote and LCD display for the AirPort Express. There were several entries proposing a WiFi remote or that the iPod (or some sort of wireless tablet) pull double duty as a remote control for the AirPort Express, but Phillip’s concept was well-thought out and had the right balance of creativity and plausibility that we were looking for. It wasn’t an easy decision, there were plenty of great entries for gadgets we would really love to see Apple do (but know they won’t), but we felt like whatever we picked had to be something that could actually be announced on Monday.

Here is Phillip’s entry in its entirety, complete with a sketch he attached:


Seems like everyone’s whining about the “AirTunes” feature of AirPort Express and having to control everything from one’s computer, which might not be anywhere near one’s audio setup. The solution seems pretty obvious…in fact, I’d wager money that the AirPort Express firmware will already have this capability built in (but undocumented) when it ships. If Apple doesn’t deliver something like this by Christmas, Keyspan or Dr. Bott will (though perhaps without the LCD - just an IR receiver).

Why an infrared remote rather than Bluetooth or 802.11? Battery life, and compatibility with existing universal remotes (think “convergence,” not “computer peripheral”). The remote might look something like a flattened iPod Mini, running off a common CR2032 battery. The base unit might also work directly connected to a Mac or PC. MSRP would be perhaps $99 (AirPort Express not included).

Remember that it’s entirely possible that one of the guesses we didn’t pick will turn out to be exactly what Apple announces! There were a lot of you out there who think they’re definitely gonna come out with a wireless iPod, and we can’t say we disagree.

SEC: How to use cryptography in computer security

IT Manager's Journal | How to use cryptography in computer security

Title How to use cryptography in computer security
Date 2004.06.24 12:01
Author editingwhiz
Topic Security

Cryptography is the mathematics underlying computer security. While a Ph.D. in cryptography is hardly a requirement for keeping one's systems secure, an understanding of the basics helps in decision making about security, both for system administrators and IT managers. In this article, I present a non-technical overview of a few key concepts in cryptography that are relevant to consumers of security solutions. I then look at some widespread myths about cryptography, and give some advice on practical matters relating to cryptography.

Cryptography basics

Encryption. Suppose your company has a confidential database which you wish to back up offsite with a storage provider. This is a typical application of encryption. An encryption algorithm, or a cipher, takes the data and a small key (think of it as a passphrase) and produces garbled data. For someone who doesn't have the key, there's no way to make sense of the garbled data (more precisely, doing so would take an insane amount of time), but if you have the key, you can easily get the original data back. So you retain the key and send the garbled data to the backup provider.

Encryption can also be used if you want to communicate with your business partner across the world and you don't want anyone to eavesdrop. In this case, you'd both have to have a shared key which no one else has. Setting up this key can be a problem, which I will explain a bit later.

Message digest. You also want to make sure the backup provider gives you the data back correctly if your copy is lost. For this, you'd use a message digest algorithm, which takes the data (and no key) and produces a short digest. The digest gives nothing away about the data. At the same time, it ensures that any two different pieces of data are extremely unlikely to have the same digest. If you compute and retain the digest at the time of backup, you can verify whether the data is uncorrupted, even after your own copy of the data has been destroyed.
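In Python's standard library, for instance, that workflow is a few lines with hashlib (SHA-256 here stands in for whatever digest algorithm the parties agree on):

```python
import hashlib

def digest(data: bytes) -> str:
    """Short, fixed-length fingerprint of arbitrarily large data."""
    return hashlib.sha256(data).hexdigest()

original = b"confidential customer database ..."

# At backup time: compute and retain the digest, send the data offsite.
retained = digest(original)

# At restore time: recompute the digest of what the provider returns.
restored = b"confidential customer database ..."
print(digest(restored) == retained)   # True: data came back intact

corrupted = b"confidential customer database .!."
print(digest(corrupted) == retained)  # False: corruption detected
```

Note that the retained digest is tiny (64 hex characters) regardless of how large the database is, which is what makes keeping it around practical.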

A message digest can also be used by a Web site that wishes to have its data mirrored but does not trust the mirroring site. It can put up the message digest on its Web site, so that the user can verify that the data they downloaded is not corrupted by computing the digest of the downloaded data.

Yet another -- and very common -- use of message digests is in storing and communicating passwords. On modern Unix systems, passwords are not stored on disk, so that even if the system is compromised, the attacker cannot get the passwords. Instead, only password digests are stored. When a user supplies her password, verification is done by computing its digest and comparing it to the stored value. Many Web sites adopt a similar approach to avoid transmitting cleartext passwords over the Internet.
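A sketch of that password scheme in Python. Modern systems also add a per-user salt and many iterations (pbkdf2_hmac below) so that identical passwords don't produce identical digests and brute-forcing stays slow; the passwords and iteration count here are illustrative:

```python
import hashlib
import os

ITERATIONS = 100_000  # deliberately slow the digest to resist brute force

def store_password(password: str) -> tuple:
    """Return (salt, digest) -- this pair goes on disk, never the password."""
    salt = os.urandom(16)
    dig = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, dig

def check_password(password: str, salt: bytes, dig: bytes) -> bool:
    """Recompute the digest from the supplied password and compare."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return candidate == dig  # real code should use hmac.compare_digest

salt, dig = store_password("correct horse")
print(check_password("correct horse", salt, dig))  # True
print(check_password("wrong guess", salt, dig))    # False
```

An attacker who steals the (salt, digest) pair still has to guess the password, because the digest cannot be run backwards.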

Key agreement. Key agreement solves the problem of setting up a shared secret key with your partner. Since the whole point of encrypting communications is that someone could be eavesdropping, you don't want to send a secret key through the same medium. A key agreement protocol can be used to get hold of a shared secret key, provided that both parties are sure they are talking to the right person and that no one can tamper with the messages they send each other.
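The classic protocol here is Diffie-Hellman: each side keeps a private exponent, only the derived public values cross the wire, and both sides arrive at the same shared secret. A toy version with deliberately tiny textbook numbers (real deployments use primes of 1024 bits or more):

```python
import secrets

# Public parameters, known to everyone including eavesdroppers.
p = 23  # toy prime modulus; real systems use huge primes
g = 5   # generator

# Each party picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1   # Alice's private value, never sent
b = secrets.randbelow(p - 2) + 1   # Bob's private value, never sent
A = pow(g, a, p)                   # sent over the wire
B = pow(g, b, p)                   # sent over the wire

# Each side combines its own private value with the other's public value.
shared_alice = pow(B, a, p)        # (g^b)^a mod p
shared_bob = pow(A, b, p)          # (g^a)^b mod p
print(shared_alice == shared_bob)  # True: same key, never transmitted
```

An eavesdropper sees p, g, A, and B, but recovering a or b from them (the discrete logarithm problem) is what the security of the scheme rests on; it is only feasible here because the numbers are tiny.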

Key agreement is based on asymmetric encryption, also called public key encryption. How it works is that when a business relationship is first established, you look up your partner's public key from a trusted source. Your partner has the corresponding private key, and no one else does. Then you use the public key to encrypt the shared key, with the assurance that an eavesdropper cannot read it. Your partner recovers the shared key using their private key, and you both use the shared key to encrypt the rest of the data using a regular encryption algorithm.

You could have used an asymmetric encryption algorithm to encrypt the whole data stream, but it would be hundreds of times slower.

Digital signature. A digitally signed message carries with it an assurance of integrity and authenticity. After you encrypt the message that you're going to send to your partner using their public key, you sign it using your private key. Your partner gets your public key from the same trusted source and verifies that the signature is correct. They then know that no one except you could have sent the message, and that it was not modified during transmission.
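A textbook-RSA sketch of sign and verify, with tiny fixed primes purely for illustration (a real implementation signs a digest of the message, applies padding, and uses keys of 1024 bits or more):

```python
# Toy RSA keypair from the classic small-number example.
p, q = 61, 53
n = p * q                # 3233: public modulus
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent (requires Python 3.8+)

def sign(message: int) -> int:
    """Sign with the private key: only the key holder can produce this."""
    return pow(message, d, n)

def verify(message: int, signature: int) -> bool:
    """Verify with the public key: anyone can check the signature."""
    return pow(signature, e, n) == message

m = 65  # in practice this would be the digest of the document
sig = sign(m)
print(verify(m, sig))      # True
print(verify(m + 1, sig))  # False: a tampered message fails
```

The asymmetry is the whole point: verification needs only the public (e, n), so your partner can confirm authorship without ever being able to forge your signature.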

Going back to the data backup example, if both parties digitally sign a document recording the original transaction, including the digest of the data, then you have a firm basis to sue the storage provider in case the data you get back is corrupted -- they can't deny having signed the document. It also protects the storage provider from being sued, as long as they do produce the data accurately when you want it.

Public key infrastructure. A public key infrastructure is used to solve the trusted source problem in the above example. Among other things, it issues a certificate connecting an individual's or organization's public key with their identity. Further, the certificate is digitally signed by the certifying authority.

A directory service is a public key infrastructure that integrates various information about the entities and resources with the certificate access mechanism.

Cryptographic standards. As with any other computer protocol, cryptographic algorithms and protocols need to be standardized. DES (Data Encryption Standard) used to be the standard for symmetric encryption; it has been replaced by AES (Advanced Encryption Standard) because the key size of DES was too small. MD5 (Message Digest) and SHA-1 (Secure Hash Algorithm) are the two popular standards for message digests. RSA (Rivest-Shamir-Adleman) is a well known asymmetric cryptosystem which is used for both key exchange and digital signatures. DSA (Digital Signature Algorithm) is another standard for signatures, and Diffie-Hellman is a standard for key exchange. X.509 is an example of a standard certificate format, and LDAP (Lightweight Directory Access Protocol) is the most popular directory standard.
Myths about cryptography

Myth 1: Increased key length gives increased security.

To see why such a blanket statement is not true, we need to understand that there are two factors necessary for a cipher (whether symmetric or asymmetric) to be secure:

* The algorithm must not have weaknesses.
* The key must be sufficiently long.

It can never be proven that the algorithm doesn't have weaknesses. But if it is a well-known cipher which the cryptographic community has analyzed and no weaknesses have been found, then one can place a high degree of confidence in its security.

Key length depends on the type of cipher. Since the math of symmetric and asymmetric ciphers is different, so are the key lengths required for security. For symmetric ciphers, 128-bit keys are thought to be good enough for the foreseeable future; for asymmetric ciphers, the figure is 1024 bits, and for long-term security 2048 bits might be advisable. The difference between the key lengths of the two types of ciphers is one of the most frequently encountered points of confusion among non-cryptographers.

The amount of time needed to break a cipher by brute force grows exponentially with the key length: increase the key by a single bit, and the time required to break the cipher doubles. Thus, it is fairly easy to make your key long enough that searching the keyspace would take longer than the age of the universe.
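The arithmetic is easy to check. Assume an extremely generous attacker who can try a billion billion keys per second (the rate is a made-up illustration):

```python
RATE = 10**18  # keys tried per second -- an absurdly powerful attacker

def years_to_search(bits: int) -> float:
    """Worst-case brute-force time for a keyspace of 2**bits keys, in years."""
    seconds = 2**bits / RATE
    return seconds / (60 * 60 * 24 * 365)

print(years_to_search(64))    # a tiny fraction of a year (seconds, really)
print(years_to_search(128))   # astronomically long even at this rate
print(years_to_search(129) / years_to_search(128))  # 2.0: one bit doubles it
```

At 128 bits the search takes on the order of ten trillion years at this rate, which is why brute force stops being the interesting attack and algorithmic weaknesses become the only realistic worry.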

All this holds only if there are no mathematical weaknesses in the algorithm. If there are, the whole thing breaks down, the cipher becomes trivially breakable, and key length is largely irrelevant. Unfortunately, calculations of how long it would take to break a security product based on key length alone are frequently used as a smokescreen to deflect objective thinking about its security. This is a trap against which the consumer must always be on guard.

Myth 2: Asymmetric key ciphers are more secure than symmetric key ciphers.

While it is true that the discovery of asymmetric cryptosystems heralded a new era in cryptography, the reason is not that such ciphers are more secure, but that they are applicable to problems such as key agreement and digital signatures, for which conventional ciphers do not work.

Perhaps another reason why asymmetric key ciphers are often mistakenly considered more secure is that their hardness is typically based on an easy-to-understand mathematical problem, such as factoring a large number, whereas symmetric ciphers have no particular mathematical elegance. But while it is easy to get an intuition for why factoring is hard, this does not make the cipher more secure; ultimately, the guarantee of security boils down to how long and how hard the cryptographic community has been trying to find weaknesses, which is roughly the same for both kinds of cryptosystems. As long as the key length is chosen appropriately, the two should give commensurate security.

Myth 3: Secrecy is important for security.

The prevalence of this myth may be attributed to the historical confusion between keeping your data secret and keeping your security algorithms themselves secret. On the contrary, the only worthwhile assurance of security comes from having your algorithm published and well analyzed by as many cryptographers as possible. The principle that security should not rely on algorithms being secret has been well established for over a century, and various pithy restatements of it are often cited: "Security should reside only in the key" (Kerckhoffs), "The enemy knows the system" (Shannon), and "Anyone can design a cryptosystem which he himself cannot break" (Schneier).

The situation is somewhat analogous to the higher security of open source software, but there are differences that make the case for open cryptosystems far more clear-cut. Firstly, the techniques of cryptanalysis are such that the benefit of keeping your cipher secret is small. Secondly, even if you keep your system secret, there is a very good chance that it will eventually fall into the enemy's hands -- not least because many attacks involve insider co-operation. Thirdly, there exist techniques to write provably secure code, but no such proofs are possible for ciphers.

Myth 4: The government can crack any cipher.

It is true that until recently (certainly until World War II, and arguably until the 1960s), cryptography was the exclusive domain of secretive governments. The NSA knew about a technique called "differential cryptanalysis," which influenced the design of DES in the early 1970s, a technique that would not be rediscovered in public until 1990. In the last 30 years, however, the situation has changed dramatically, and the number of cryptographers and cryptanalysts has exploded. Governments can no longer match the scale of public cryptography. The point is worth repeating that the only determinant of the security of a cipher is the amount of effort that has gone into attempting to break it. As Shamir (the 'S' in RSA) said: "Once the encryption genie was out of the bottle, there was no way of putting it back." The recent U.S. government relaxation of export controls on cryptography shows its acceptance of this reality. Thus, one can use a standard cipher such as AES with a reasonable degree of confidence in its security, even against governments.

So, how do you apply knowledge of cryptography in practice? Cryptography is typically bypassed, not penetrated. What this means is that designing a secure cipher is the easy part. The hard part is to make sure that the algorithms you are deploying are implemented correctly and exactly match the security requirements that you have. This is why it is important for administrators to learn the basics of cryptography. It cannot be left to mathematicians; it has to be deployed by someone with an intimate knowledge of the overall IT system of the organization.

It is important to have a threat model and a well-defined and well-enforced security policy. The threat model answers such questions as "What are the sensitive data sets in the system?" and "Who are the untrusted people on the network?" The security policy addresses issues such as "How often must passwords be changed?" and "Are employees allowed to plug in their personal laptops to the company network?" Adding another dimension of complexity is the fact that security is a process, not a product. As the IT environment continually changes, the security policy, too, must adapt constantly. Frequent security audits and reviews are essential.

Finally, there is no such thing as perfect security. The more the expenditure, the greater the assurance of security that can be achieved. However, after a point, increasing security comes at the expense of decreasing usability of the system. The security budget and security policy must therefore be finely balanced, taking into consideration various factors such as the overall IT budget, the sensitivity of the data, the experience of the users, and so on.

Arvind Narayanan is a programmer and freelance writer based in Madras, India. He is the author of gretools, a vocabulary building tool for GNOME, and gtkboard, a board games system.

1. "gretools" - http://theory.cs.iitm.ernet.in/~arvindn/gretools/
2. "gtkboard" - http://gtkboard.sourceforge.net/

OT: Making RSS Pretty

Making RSS Pretty

LINUX: What KDE and GNOME Really Need

What KDE and GNOME Really Need :: osViews | osOpinion :: Tech Opinions for the People, by the People

Contributed by: Tim R.

"If KDE and GNOME work together on a number of desktop features, we could finally have a unified desktop environment based on an ideal set of specifications. Tim R. submitted the following editorial to osOpinion/osViews which lays out a plan for the two open source GUIs to work together while still allowing each of them to continue developing upon their unique strengths."

The flame wars between KDE and GNOME may never end. Of course, that's not necessarily a bad thing. The competition between the two will hopefully cause them both to work harder.

KDE and GNOME need to recognize that users will use whatever they like best, and shouldn't think of programs outside their desktop environment as something old and obsolete that will eventually disappear. This simply isn't the case.

They need to realize that many new applications will not be using their libraries. They need to see themselves not as the desktop environment for their users, nor as the majority of the programs their users will use. Rather, they need to look outside themselves and see the big picture.

They need to develop standards outside themselves and their own little world -- however big it may be.

Something New

I propose a new desktop environment. I call it the "X Desktop Environment" -- hopefully no one's using that name already =).

This isn't something to replace KDE or GNOME. This is something KDE and GNOME should be rebuilt on top of. This is more of a standard than an actual desktop environment, and needs an upper layer such as KDE or GNOME to be complete. Users already have the libraries it employs since they are the same ones that come with XFree86. Part of what I'm proposing already exists, so in some cases I'm just suggesting extending existing things.

Essentially, I'm suggesting replacing the foundation upon which we already have a skyscraper built, so this could get a little messy. Messy or not, it needs to be done.

Wish Lists and Proposals

Here is a list, more or less, of the features I'm proposing:

XDND: A standardized way to drag and drop between X applications. Yes, I know this already exists, but I'll mention it first anyway, since it seems like a good example of how things need to be done.

File Association: One system-wide database of file types, and optionally one in each user's home directory. It specifies the default action to take when the user opens each type of file, along with other options that can be presented in places such as right-click menus. It also says what icon to use, contains anything else the KDE and GNOME people agree should go there, and is extensible.

A special library could be provided to access this database (config file), or GTK+ and Qt could query it directly. This also provides a standard place for commercial programs to register their new file types.
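As a sketch of what such a shared database might look like, consider a minimal lookup over a tab-separated file. The format, file names, and commands below are invented for illustration; they are not an existing standard (in practice, the freedesktop.org shared MIME database later filled this role):

```python
# Hypothetical file-association database: one tab-separated entry
# per file type, giving the default command and an icon name.
ASSOC_DB = """\
# extension\tdefault command\ticon
txt\tgedit %f\ttext-plain.png
png\teog %f\timage-png.png
"""

def parse_assoc(text):
    """Parse the (invented) tab-separated association format."""
    db = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        ext, command, icon = line.split("\t")
        db[ext] = {"command": command, "icon": icon}
    return db

def default_action(db, filename):
    """Return the default command for a file, or None if unregistered."""
    ext = filename.rsplit(".", 1)[-1].lower()
    entry = db.get(ext)
    return entry["command"] if entry else None

db = parse_assoc(ASSOC_DB)
assert default_action(db, "notes.txt") == "gedit %f"
assert default_action(db, "unknown.xyz") is None
```

The per-user copy in the home directory would simply be parsed second and overlaid on the system-wide one, which is how both desktops could honor local overrides without coordinating at runtime.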

"Start" Menu: A standard menu and menu config file/directory structure needs to be made. KDE and GNOME can still use their own program to display this menu, and even still keep their old menus. They just need to come up with a standard format and filesystem location for this new standard menu, and include it in their current menus.

A "Desktop" Directory: Both KDE and GNOME like to take some directory, and display it on the desktop in a GUI fashion, similar to Mac OS and Windows. This will give other programs a place to put their icon. It should also let users switch between KDE and GNOME -- and anything else that uses this style desktop concept.

The Panel: Programs should be able to add themselves to a panel, if one is running, without worrying about which one is running. This doesn't force a panel upon anybody. It can simply return an error to the program if no panel is running.

The Soundcard Sharing Daemon: Since most sound cards let only one program talk to them at a time, we have things like ESounD. For a soundcard sharing daemon to work well, everyone needs to use the same one. GNOME and KDE don't each have their own version of LPD or cupsd (the programs used to share the printer), and they shouldn't each have their own version of this either. This is one of those things where, if they don't share, it won't work.

Theming: One universal X client theming system. Since a theme usually means the look and feel of a program, there are many different aspects to theming, so it will probably involve more than one config file or group of files. One will be for the kind of things you want to download from themes.org; the other for things that are better left customized by each user.

They should be extensible, and programs that use them can pick and choose which parts they actually use and which are irrelevant to them.

A Better Place

If the things I've suggested get implemented, I believe it will make the Linux desktop a better place. My suggestions improve both standardization and diversity.

They provide a more standard look and feel to the Linux desktop, while making it easier, more customizable, and just as diverse. ::

OT: AOL employee arrested and charged with stealing list

AOL employee arrested and charged with stealing list - Jun. 23, 2004: "Jason Smathers is charged with stealing his employer's subscriber list and selling it.
June 23, 2004: 5:37 PM EDT

NEW YORK (CNN/Money) - A software engineer at America Online was arrested Wednesday and charged with stealing AOL's subscriber list and selling it to someone sending spam e-mail, federal prosecutors in New York said.

According to the criminal complaint, Jason Smathers of Harpers Ferry, W. Va., used his inside knowledge of AOL's computer system to steal a list of 92 million AOL customer account 'screen names,' and then sold them to Sean Dunaway, who is not an AOL employee.

Dunaway, of Las Vegas, Nev., was accused of using the list to promote his own Internet gambling business and of selling the list to other spammers for $52,000, according to David Kelley, the U.S. attorney for the Southern District of New York.

The complaint further charges that Dunaway later paid Smathers $100,000 for an updated version of AOL's customer list.

Megan Gaffney, a spokeswoman with the U.S. Attorney's Office, Southern District of New York, confirmed that both men appeared in court Wednesday.

Dunaway, 21, attended federal court in Las Vegas, and Smathers, 24, made his initial federal court appearance in Virginia. Smathers will be detained overnight until a detention hearing tomorrow, Gaffney said.

Defense lawyers could not be reached for comment.

The two men each face a maximum sentence of five years in prison and a fine of $250,000, according to prosecutors.

An AOL spokesman said Smathers has been fired from the company.

'We deeply regret what has taken place and are thoroughly reviewing and strengthening our internal procedures as a result of this investigation and arrest,' AOL said in a statement.

According to the complaint, Smathers used another employee's ID in April and May 2003 to assemble a complete list of AOL's customer account screen names, zip codes, telephone numbers and credit card types.

'(But) AOL has uncovered no information indicating that this theft involved member credit card or password information stored by AOL,' said the statement from AOL.

AOL, the world's biggest Internet service provider, is owned by Time Warner Inc., as is CNN/Money."

LINUX: Free software tool automates Mandrake Linux

Free software tool automates Mandrake Linux: "Jun. 25, 2004

Nexedi.org has published an article introducing 'umibuilder,' an interesting free software tool that can be used to update Linux packages. Because it downloads the latest versions of RPMs from the Internet, Umibuilder is perhaps most useful for automating security updates of systems running Mandrake Linux, according to the article.

Earlier this month, Nexedi and Mandrakesoft announced a strategy to deliver a Linux desktop with zero transition costs. With Rentalinux Desktop Linux Server (DLS) the companies are providing a solution to deploy Mandrakelinux free of any upfront investment, and with minimal disruption. According to Nexedi, rentalinux requires no software installation and no changes to existing hardware or networks.

DLS combines server hardware rental, software setup, custom configuration, support, and maintenance service in a single package.

The umibuilder software also benefits developers; it 'allows rapid creation of [shrink] wrapped GNU/Linux solutions which can then be distributed on flash memory or on live CD.'

Umibuilder favors Mandrake Linux because it uses urpmi and other Mandrake tools, according to the article. However, users of other distros can set up a chroot environment in which to experiment with Umibuilder.

Umibuilder is configured by creating simple text files that specify the RPM packages which the target embedded system should contain, and the unneeded files from those RPMs that should be deleted.

Another text file, called an 'Umigumi,' describes the hardware target in terms of what modules should be loaded at boot time, and other considerations. Umibuilder users share Umigumis for a range of hardware devices, including the OpenBrick and the Sumicom barebones machines.

Once the needed text files have been set up, Umibuilder automates the process of downloading all the required RPMs, stripping away unneeded files, and creating a CompactFlash or live CD image that will boot on the target hardware.

For more details, read the full article at Nexedi.org."

OT: Editor's Note: RIP, IE

Linux Today - Editor's Note: RIP, IE

By Brian Proffitt
Managing Editor

Before you start reading, fire up the printer, and get the scissors. You may want to clip this one out and give it to your friends and colleagues who are still in Windows land.

There are times in life when you actually hear words coming out of your mouth and even as they're coming out, you realize how stupid they sound. I realize that in my own personal and professional life, this sort of thing happens a bit more than the statistical average, but this morning I uttered words that sounded so completely insane, I had to share them.

After getting up early and scoping out the Net for new and interesting stories to post, I ran across several articles detailing a new form of malware that supposedly hides in Web site graphics, and will download a package to a computer running IE, without the user even knowing it. No one is sure what this package will do; it could be spyware doing keystroke logging, or could be a way to turn an infected computer into an unwitting spam generator. Time, unfortunately, will tell.

Now, after reading this, I was not terribly concerned, since the one Windows machine in the house runs Netscape, and this lovely new piece of malware affects only those unfortunates running Internet Explorer. But when my wife came in to say goodbye before she went to work, I said this to her:

"If you surf at work today, you may want to rethink it. There's a new virus hiding out in images out on the Web."

"On which sites?" my intelligent spouse asked.

"They don't know yet, or they're not saying," her not-so-intelligent husband replied.

And as we were having this exchange, I realized that this tiny little conversation had to be the most insane thing I said or will say today. It boiled down to: there's a virus out there that will hit your IE-running computers and you won't know where or when it hits.

Now, to be fair, later today I learned that this immediate threat had been thwarted: someone managed to shut down the Russian server all this malware was sending information to. The malware is still out there, still infecting IE-running PCs, but for now it's effectively toothless. Not because of a patch or a fix from Microsoft, understand.

And, after all of this, that's when it dawned on me: Internet Explorer must die.

Not be fixed. Not be patched. Be dead, as in no one in their right mind should use it anymore.

This is a piece of software--a closed source, and therefore supposedly (ha!) more secure piece of software, mind you--that is constantly having innumerable flaws exposed and taken advantage of. In the recent past, it was download this, and you're doomed. Open this, and you're in trouble.

Now, it's: open any page on a Web site running a Microsoft Internet Information Server, and you potentially could be infected.

Read this again: By opening a page. With pictures.

I say that this sort of irresponsibility must be stopped, and stopped now. The public must be made aware that while Microsoft is certainly not responsible for the behavior of crackers, it is certainly responsible for creating such a fertile field for them to play in.

So, to that end, I want you to give this article to a friend or colleague and have them read this passage:

"The receiver of this article will be granted the services by the giver of this article to install a non-IE based browser on their computer, free of charge, for the receiver to try. The person providing this service will install the browser on any operating system you have, and promises not to tease you if you are using Windows. The receiver of this service will agree to give the new browser an honest try as their default browser and see what they think."

Now, if you give this article to someone, then you should be prepared to follow up on this clause. Install Mozilla or Firefox for your friend. Install Netscape. Heck, install Opera if they really hate the whole idea of open source. Just get them to try something else besides IE. Be nice about it, and helpful. Make sure their bookmarks and home pages are set just so. And don't hassle them if they're still using Windows. It all has to be done one step at a time.

If they ask, indicate that while Mozilla and other browsers have flaws too, there are nowhere near as many critical issues, because Mozilla and the rest, unlike IE, are not inextricably tied to the operating system, and therefore their flaws are not as likely to bring about the complete ownership of the system by some mook.

I think this will be an excellent way to demonstrate that (1) open source software is not primitive, cobbled-together code and (2) IE is not the be-all end-all of browser technology.

After they try it, and like it, you can use a similar technique for other cross-platform OSS, such as OpenOffice.org. Once they're comfortable with that, then you can waddle out the penguin.

This is my ultimate migration plan. Nothing fancy-schmancy. No usability studies. Just kill off IE first to save us all from zombified computers and massive worm traffic, then work on the other stuff.

Because we can all talk up a good argument for open source, but a lot of folks still need to take it for a spin to really understand. So let's rev up the test drives.

PROG: Reduce compile time with distcc

Reduce compile time with distcc

REVIEW: PlanMaker for Linux

NewsForge | Review: PlanMaker for Linux

Title: Review: PlanMaker for Linux
Date: 2004.06.25 2:01
Author: ValourX
Topic: Software

Spreadsheet development has more or less solidified over the past year -- the majority of the features that most people need are already there in long-established proprietary programs like Excel and Lotus 1-2-3 -- and that means that the door is open for smaller companies and free software projects to grab market share with capable, inexpensive products. That's probably the best way to describe SoftMaker's $49.95 PlanMaker 2004 for Linux, which was released earlier this month: capable, inexpensive, cross-platform competition for Microsoft Excel 2003. You won't find a more Excel-compatible spreadsheet on any operating system, but Microsoft compatibility is far from PlanMaker's only worthwhile feature.

PlanMaker is both smaller in size (29MB) and faster to start than most of the other spreadsheet applications I've used. It's considerably faster than OpenOffice.org Calc and about as fast as the GNOME Foundation's Gnumeric when starting up. It's also lightning fast when it comes to opening a worksheet, even when there is a lot of data to load, formulas to calculate, or embedded charts. One particularly data-heavy test case with a single large chart took only 10 seconds to load, while the same worksheet took more than a minute to load in StarOffice 7 Calc.

Calculation speed is equivalent to Excel or Gnumeric -- fast enough that you'll hardly think about it. It takes a lot longer to load the data and scroll through a large worksheet to get to an empty cell than it does to perform a numeric calculation.

Like all spreadsheet programs, PlanMaker has a row limit -- specifically, 16,384 rows, lower than most of its competitors, such as Excel (65,536 rows) and OpenOffice.org Calc (32,000 rows). The maximum number of columns for most spreadsheets stands at 256, and PlanMaker is no exception. If you try to import a worksheet that has more than 16,384 rows, PlanMaker will truncate it, ignoring all data after that row. According to SoftMaker, the program is capable of handling more rows, but the code is not yet optimized for ideal performance with larger worksheets. Future versions are likely to include support for 65,536 rows.

PlanMaker's interface uses an independent toolkit, so it's not controlled by GTK, Qt, or Motif. It has a nice Excel 2003 gradient look to it when using the default settings, and you can change it to look more like older Windows, OS/2, or Pocket PC styles. The menus are not identical to Excel's or any other program's, but they are by no means difficult to navigate if you're accustomed to using a spreadsheet application.

PlanMaker's charting and graphing capabilities are quite advanced; both 2D and 3D chart types are provided, the most prominent being the surface chart. You can also set the color palette of the charts to specific schemes, including the same color palette used by Microsoft Excel. In other words, if you really don't want someone to know that you're not using Excel to generate your charts, you can use Excel's default color scheme to make the output of the two programs identical.

Compatibility with other programs is where PlanMaker really shines. It can read from and write to Excel 2003 worksheets even if they employ various Excel-only features. SoftMaker has a page detailing its Excel compatibility features. While you have to take a manufacturer's propaganda with a grain of salt, my own analysis of the test cases shows them to be completely accurate and as-advertised. What they don't show, of course, is the fact that any Excel sheet with any kind of macro in it will not have full functionality because of the lack of macro support in PlanMaker for Linux. Even though it can't make use of them, embedded macros are perfectly preserved in an Excel worksheet when you edit it with PlanMaker. In writing to the Excel format from PlanMaker I was unable to find any flaws in translation when reading the same worksheets in Excel 2003.

A 3D chart in PlanMaker

There are, unfortunately, no analytical or statistical functions built into PlanMaker (for operations like covariance or forecasting) although there are certainly a large number of calculation functions. You also won't find any support for macros in the GNU/Linux or pocket/handheld PC editions. The Windows edition does have a certain amount of programmability through a bring-your-own-language method. By importing PlanMaker's type library into Visual Basic or Borland Delphi, you can program macros and scripts in an interface similar to that of Microsoft Excel. SoftMaker does have a VBA-compatible macro language called BasicMaker, but currently it is only supported in the Windows version of the SoftMaker office suite in the German language. Translation of BasicMaker into English is underway and will be available in future editions of PlanMaker for Windows; the GNU/Linux edition will also have macro support eventually.

The internationalization support is poor as of this writing; the program is available only in English, but it does have spelling dictionaries available in 17 different languages and automatic hyphenation for several more. Internationalized versions of PlanMaker are now in development for Spanish and Portuguese, and support for further languages will be available in the future, according to SoftMaker.

English documentation is nonexistent, but SoftMaker does provide an enormous, comprehensive, 400-page PDF manual in German, which is currently being translated into English.

Installation and licensing

There's no fancy installation program for PlanMaker for Linux -- you simply unpack a .tgz file and move the resulting directory to your preferred location. Your user home directory, /opt, or /usr/local are probably the best choices to store the program files.

Upon running the PlanMaker binary for the first time, you're asked for your email address and the product key, which is emailed to you when you purchase the software. While this step is required to install the program, there is no "phone home" activation scheme as in a number of other proprietary applications on the market today.

The license is of course proprietary -- this is not free software. You're allowed to install the software on multiple systems as long as PlanMaker is not used concurrently. You can also use it over a network as long as each client machine that accesses PlanMaker has its own license.


PlanMaker is a good value, considering its price and features. It's not the kind of heavily armed force that Excel is, but if you're one of the large percentage of users who uses a small percentage of the features in Excel, PlanMaker is an excellent low-cost alternative. If you need the best possible Excel compatibility in a GNU/Linux-based spreadsheet, PlanMaker is your best bet (outside of a virtual machine running Windows and Excel).

The Windows version and the GNU/Linux version are the same in most respects, the only exceptions being platform-dependent options with regard to fonts and memory usage, and the programmability through VB or Delphi. You can create a worksheet with PlanMaker for Linux and edit and use that same worksheet in the same program on a handheld or pocket PC or in Windows. There is no other spreadsheet program that works so effectively on as many platforms.

PlanMaker is limited in several ways, however. It doesn't have the internationalization support that Gnumeric has, or the macro capabilities that most other spreadsheets have, or the large worksheet support that high-end spreadsheet applications have. PlanMaker is a lot like its word processing partner TextMaker in that it provides the functionality that most people will need while providing superior compatibility with Microsoft file formats at a low price.
Purpose: Spreadsheet
Manufacturer: SoftMaker
Operating systems: GNU/Linux, Windows, handheld PC, Pocket PC
License: Proprietary
Market: Cross-platform business users, GNU/Linux and Pocket/handheld PC users
Price (retail): US$49.95 or €49.95 (includes VAT); package deals and discounts for return customers are available
Previous version: N/A
Product Web site: Click here

Jem Matzan is the author of three books, a freelance journalist and the editor-in-chief of The Jem Report.


1. "SoftMaker" - http://www.softmaker.de/index_en.htm
2. "PlanMaker 2004" - http://www.softmaker.de/pm_en.htm
3. "Microsoft Excel 2003" - http://office.microsoft.com/excel
4. "OpenOffice.org Calc" - http://www.openoffice.org/product/calc.html
5. "Gnumeric" - http://www.gnome.org/projects/gnumeric/
6. "charting and graphing capabilities" - http://www.newsforge.com/blob.pl?%0D%0Aid=60767d36c30bb0ecd95578a621a60f9f
7. "a page detailing its Excel compatibility features" - http://www.softmaker.de/pmwcomp_en.htm
8. "a large number of calculation functions" - http://www.newsforge.com/blob.pl?%0D%0Aid=6790fd38804bfd8df376569d5ee2950b
9. "TextMaker" - http://www.softmaker.de/tm_en.htm
10. "SoftMaker" - http://www.softmaker.com/
11. "Click here" - http://www.softmaker.de/pm_en.htm
12. "author" - http://www.herotale.com/
13. "The Jem Report" - http://www.thejemreport.com/
