Why is security through obscurity a bad idea?
Security through obscurity would be like burying your money under a tree. The only thing that makes it safe is that no one knows it's there. Real security is putting it behind a lock or combination, say in a safe. You can put the safe on the street corner, because what makes it secure is that no one can get inside it but you.
As @ThomasPadron-McCarty mentioned in a comment below:
If someone discovers the password, you can just change the password, which is easy. If someone finds the location, you need to dig up the money and move it somewhere else, which is much more work. And if you use security by obscurity in a program, you would have to rewrite the program.
Security through obscurity can be said to be bad because it often implies that the obscurity is being used as the principal means of security. Obscurity is fine until it is discovered, but once someone has worked out your particular obscurity, your system is vulnerable again. Given the persistence of attackers, this equates to no security at all.
Obscurity should never be used as an alternative to proper security techniques.
Obscurity as a means of hiding your source code to prevent copying is another subject. I'm rather split on that topic; I can understand why you might wish to do that, but personally I've never been in a situation where it would be wanted.
Security through obscurity is an interesting topic. It is (rightly) maligned as a substitute for effective security. A typical principle in cryptography is that the algorithm may be known but the key is not. Algorithms for encryption are typically widely published and analyzed by mathematicians and, after a time, some confidence is built up in their effectiveness, but there is never a guarantee that they're effective.
Some people hide their cryptographic algorithms, but this is considered a dangerous practice because such algorithms haven't gone through the same scrutiny. Only organisations like the NSA, which has a significant budget and a staff of mathematicians, can get away with this kind of approach.
One of the more interesting developments in recent years has been the rise of steganography, which is the practice of hiding messages in images, sound files or some other medium. The biggest problem in steganalysis is identifying whether or not a message is there at all, which makes it security through obscurity.
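As a toy illustration of the idea (not any particular published scheme), here is least-significant-bit embedding in Python; the byte array stands in for image pixel data, and the names are made up for the example:

```python
# Toy LSB steganography: hide message bits in the lowest bit of each
# "pixel" byte. The cover data below is a stand-in for real image data.

def embed(cover: bytes, message: bytes) -> bytes:
    """Write each message bit into the LSB of successive cover bytes."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # clear LSB, then set it to the bit
    return bytes(out)

def extract(stego: bytes, length: int) -> bytes:
    """Read `length` bytes back out of the LSBs."""
    bits = [b & 1 for b in stego[:length * 8]]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(length)
    )

cover = bytes(range(200))      # pretend pixel data
stego = embed(cover, b"hi")
print(extract(stego, 2))       # b'hi'
```

To an observer who doesn't know to look at the low bits, the stego bytes are visually indistinguishable from the cover, which is exactly why detection (rather than decryption) is the hard problem.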
Last year I came across a story that Researchers Calculate Capacity of a Steganographic Channel, but the really interesting thing about this is:
Studying a stego-channel in this way leads to some counter-intuitive results: for example, in certain circumstances, doubling the number of algorithms looking for hidden data can increase the capacity of the steganographic channel.
In other words, the more algorithms you use to identify messages, the less effective it becomes, which goes against the normal criticism of security through obscurity.
Interesting stuff.
The main reason it is a bad idea is that it does not FIX the underlying problems, it just attempts to hide them. Sooner or later, the problems will be discovered.
Also, extra encryption will incur additional overhead.
Finally, excessive obscurity (like using checksums) makes maintenance a nightmare.
A better security alternative is to eliminate potential weaknesses in your code, such as validating all inputs to prevent injection attacks.
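For instance, parameterized queries are the standard way to guarantee that user input can never rewrite the SQL itself. A minimal sketch using Python's built-in sqlite3 module (the table and payload are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

attacker_input = "' OR '1'='1"  # classic injection payload

# Bad: string concatenation would let the payload rewrite the query:
#   f"SELECT role FROM users WHERE name = '{attacker_input}'"

# Good: a parameterized query treats the input strictly as data.
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(rows)   # [] -- the payload matches no user, it is never parsed as SQL
```

The point is that this fixes the weakness itself, rather than hoping nobody notices the query structure.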
One factor is the ability to recover from a security breach. If someone discovers your password, just reset it. But if someone uncovers your obscure scheme, you're hosed.
Using obscurity, as all these people agree, is not security; it's buying yourself time. That said, having a decent security system implemented and then adding an extra layer of obscurity is still useful. Let's say tomorrow someone finds an unbeatable crack/hole in the ssh service that can't be patched immediately.
As a rule I've implemented in house: all public-facing servers expose only the ports needed (http/https) and nothing more. One public-facing server will then have ssh exposed to the internet on some obscure high-numbered port, with a port-scanning trigger set up to block any IPs that try to find it.
Obscurity has its place in the world of security, but not as the first and last line of defense. In the example above, I don't get any script/bot attacks on ssh because they don't want to spend the time searching for a non-standard ssh service port, and if they do, they're unlikely to find it before another layer of security steps in and cuts them off.
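A rough sketch of that setup, assuming an iptables-based firewall (the port number, list name and timeout below are made-up values, and the exact rules will vary by distribution):

```
# /etc/ssh/sshd_config -- move sshd off port 22 (49222 is arbitrary)
Port 49222

# iptables sketch of the "port-scan trigger": any IP that probes the
# standard port 22 is recorded and then dropped for the next hour.
iptables -A INPUT -p tcp --dport 22 -m recent --name scanners --set -j DROP
iptables -A INPUT -p tcp -m recent --name scanners --rcheck --seconds 3600 -j DROP
```

The real security layer here is still sshd's own authentication; the obscure port and the trigger only reduce the volume of automated attacks that ever reach it.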
All of the forms of security available are actually forms of security through obscurity. Each method increases in complexity and provides better security, but they all rely on some algorithm and one or more keys to restore the encrypted data. "Security through obscurity" as most call it is when someone chooses one of the simplest and easiest-to-crack algorithms.
Algorithms such as character shifting are easy to implement and easy to crack, and that's why they are a bad idea. It's probably better than nothing, but it will, at most, only stop a casual glance at the data from being easily read.
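A character-shifting (Caesar) cipher shows both halves of that claim: it takes a few lines to implement, and a 26-way brute force to break. A Python sketch (the plaintext is just an example):

```python
import string

def shift(text: str, k: int) -> str:
    """Caesar cipher: rotate each lowercase letter k places."""
    table = str.maketrans(
        string.ascii_lowercase,
        string.ascii_lowercase[k:] + string.ascii_lowercase[:k],
    )
    return text.translate(table)

ciphertext = shift("attack at dawn", 13)
print(ciphertext)                       # nggnpx ng qnja

# "Cracking" it is just trying all 26 keys and eyeballing the output.
candidates = [shift(ciphertext, -k % 26) for k in range(26)]
print("attack at dawn" in candidates)   # True
```

The entire keyspace fits in one list comprehension, which is exactly the sense in which such schemes only stop a casual glance.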
There are excellent resources on the Internet you can use to educate yourself about all of the available encryption methods and their strengths and weaknesses.
Security is about letting people in or keeping them out depending on what they know, who they are, or what they have. Currently, biometrics aren't good at finding out who you are, and there are always going to be problems with them (fingerprint readers for somebody who's been in a bad accident, forged fingerprints, etc.). So, actually, much of security is about obfuscating something.
Good security is about keeping the stuff you have to keep secret to a minimum. If you've got a properly encrypted AES channel, you can let the bad guys see everything about it except the password, and you're safe. This means you have a much smaller area open to attack, and can concentrate on securing the passwords. (Not that that's trivial.)
In order to do that, you have to have confidence in everything but the password. This normally means using industry-standard crypto that numerous experts have looked at. Anybody can create a cipher they can't break, but not everybody can make a cipher Bruce Schneier can't break. Since there's a thorough lack of theoretical foundations for cipher security, the security of a cipher is determined by having a lot of very smart and knowledgeable people try to come up with attacks, even if they're not practical (attacks on ciphers always get better, never worse). This means the crypto algorithm needs to be widely known. I have very strong confidence in the Advanced Encryption Standard, and almost none in a proprietary algorithm Joe wrote and obfuscated.
However, there have been problems with implementations of crypto algorithms. It's easy to inadvertently leave holes whereby the key can be found, or other mischief done. It happened with an alternate signature field for PGP, and with weaknesses in SSL as implemented on Debian Linux. It's even happened to OpenBSD, which is probably the most secure operating system readily available (I think it's up to two exploits in ten years). Therefore, these should be done by a reputable company, and I'd feel better if the implementations were open source. (Closed source won't stop a determined attacker, but it'll make it harder for random good guys to find holes to be closed.)
Therefore, if I wanted security, I'd try to have my system as reliable as possible, which means as open as possible except for the password.
Layering security by obscurity on top of an already secure system might help some, but if the system's secure it won't be necessary, and if it's insecure the best thing is to make it secure. Think of obscurity like the less reputable forms of "alternative medicine": it is very unlikely to help much, and while it's unlikely to hurt much by itself, it may make the patient less likely to see a competent doctor or computer security specialist, whichever.
Lastly, I'd like to make a completely unsolicited and disinterested plug for Bruce Schneier's blog, as nothing more than an interested reader. I've learned a lot about security from it.
One of the best ways of evaluating, testing or improving a security product is to have it banged on by a large, clever peer group.
Products that rely for their security on being a "black box" can't have the benefit of this kind of test. Of course, being a "black box" always invites the suspicion (often justified) that they wouldn't stand up to that kind of scrutiny anyway.
I argued in one case that password protection is really security through obscurity. The only security I can think of that wouldn't be STO is some sort of biometric security.
Besides that bit of semantics and nit-picking, STO (security through obscurity) is obviously bad in any case where you need real security. However, there might be cases where it doesn't matter. I'll often XOR pad a text file I don't want anyone reading. But I don't really care if they do; I'd just prefer that it not be read. In that case, it doesn't matter, and an XOR pad is a perfect example of an easy-to-find-out STO.
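An XOR pad of that sort is only a few lines. A Python sketch (the key shown is a placeholder; a short repeating key like this falls quickly to frequency analysis, which is the point):

```python
from itertools import cycle

def xor_pad(data: bytes, key: bytes) -> bytes:
    """XOR each byte with a repeating key; applying it twice restores the input."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

secret = b"meet me at noon"
padded = xor_pad(secret, b"hunter2")          # placeholder key, not a real secret
print(xor_pad(padded, b"hunter2") == secret)  # True -- XOR is its own inverse
```

Because the same function both scrambles and unscrambles, anyone who guesses the scheme and the short key gets everything back, which is what makes it obscurity rather than security.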
It is almost never a good idea. It is the same as asking: is it a good idea to drive without a seatbelt? Of course you can find some cases where it fits, but the answer, from experience, seems obvious.
Weak encryption will only deter the least motivated hackers, so it isn't valueless; it just isn't very valuable, especially when strong encryption, like AES, is available.
Security through obscurity is based on the assumption that you are smart and your users are stupid. If that assumption is based on arrogance, and not empirical data, then your users (and hackers) will determine how to invoke the hidden method, bring up the unlinked page, decompile and extract the plain-text password from the .dll, and so on.
That said, providing comprehensive metadata to users is not a good idea, and obscuring is a perfectly valid technique as long as you back it up with encryption, authorization, authentication and all those other principles of security.
If the OS is Windows, look at using the Data Protection API (DPAPI). It is not security by obscurity, and is a good way to store login credentials for an unattended process. As pretty much everyone is saying here, security through obscurity doesn't give you much protection.
The one point I have to add which hasn't been touched on yet is the incredible ability of the internet to smash security through obscurity.
As has been shown time and time again, if your only defense is that "nobody knows the back door/bug/exploit is there", then all it takes is for one person to stumble across it and, within minutes, hundreds of people will know. The next day, pretty much everyone who wants to know, will. Ouch.